Dubai Land Department needs to have the final word on all real estate data and its interpretation. Image Credit: Gulf News Archives

At a recent European Commission-sponsored real estate conference in Luxembourg, statisticians demonstrated a lead-lag difference of about six months when indices are constructed using the hedonic model, as compared to the commonly used "repeat sales method".

The study spanned three decades and 26 markets, from Japan to Poland. The conclusions came as little surprise to most of the audience, and data providers as well as banks are increasingly turning towards the more rigorous hedonic methodology in most developed markets.
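
To make the methodological gap concrete, here is a minimal sketch of both index constructions on synthetic data: a hedonic regression of log price on unit attributes plus time dummies, and a Bailey-Muth-Nourse repeat-sales regression fitted only to resold units. Every figure and variable name here is a hypothetical illustration, not any provider's actual model.

```python
# Sketch of the two index methodologies on synthetic data.
# All numbers are made up; this illustrates the general technique only.
import numpy as np

rng = np.random.default_rng(0)
T = 8                                                # quarters in the index
true_index = np.cumsum(rng.normal(0.01, 0.02, T))    # latent log price level

# --- Hedonic method: regress log price on attributes + time dummies ------
n = 500
size = rng.uniform(50, 200, n)                       # sqm, a quality attribute
period = rng.integers(0, T, n)
log_price = 8 + 0.01 * size + true_index[period] + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), size]
                    + [(period == t).astype(float) for t in range(1, T)])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
hedonic_index = np.concatenate([[0.0], beta[2:]])    # period 0 is the base

# --- Repeat-sales method: use only pairs of sales of the same unit -------
m = 300
t1 = rng.integers(0, T - 1, m)
t2 = rng.integers(1, T, m)
keep = t2 > t1
t1, t2 = t1[keep], t2[keep]
d_log = true_index[t2] - true_index[t1] + rng.normal(0, 0.07, t1.size)

D = np.zeros((t1.size, T - 1))                       # T-1 dummies, base = 0
for i, (a, b) in enumerate(zip(t1, t2)):
    if a > 0:
        D[i, a - 1] = -1.0                           # sold out of period a
    D[i, b - 1] = 1.0                                # sold into period b
rs_beta, *_ = np.linalg.lstsq(D, d_log, rcond=None)
repeat_sales_index = np.concatenate([[0.0], rs_beta])

print("hedonic     :", np.round(hedonic_index, 3))
print("repeat sales:", np.round(repeat_sales_index, 3))
```

Because the repeat-sales design only sees units that trade twice, it reacts to turning points with a lag relative to the hedonic index, which uses every transaction in every period.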

In Dubai, data providers have not discussed this issue at length. Despite (or perhaps because of) data streams indicating that a variety of communities began to see prices rise as early as March, indices are only now beginning to show prices stabilizing and, in some cases, even rising.

A compilation issue

The problem has always been in the compilation of big data. The repeat-sales methodology has historically failed to account for improvements in the build quality of a unit over time. It has also failed to take into account the velocity of transactions, and has used listings as proxies, which in the web-based world are duplicated and "clustered" around extreme-value transactions that skew market cycles, without any authenticity.
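
As a rough illustration of the cleansing step this implies, the hypothetical snippet below collapses duplicated web listings and trims the extreme tails before any index is computed. The field names and thresholds are assumptions for illustration, not any provider's actual schema.

```python
# A hedged sketch of listing cleansing: de-duplicate re-posts and trim
# extreme outliers. Field names are hypothetical.
import pandas as pd

def cleanse_listings(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicated listings and remove implausible price outliers."""
    # Treat re-posts of the same unit at the same asking price as one listing.
    df = df.drop_duplicates(subset=["unit_id", "asking_price"])
    # Trim the extreme tails that "cluster" around outlier transactions.
    lo, hi = df["asking_price"].quantile([0.01, 0.99])
    return df[df["asking_price"].between(lo, hi)]

listings = pd.DataFrame({
    "unit_id":      ["A1", "A1", "B2", "C3", "D4"],
    "asking_price": [1_200_000, 1_200_000, 950_000, 40_000_000, 875_000],
})
print(cleanse_listings(listings))   # the A1 duplicate and the tails drop out
```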

This has been the main reason why there has been a greater than 30 per cent variance in price performance measurement between major data providers this year (this is not a new phenomenon). A similar comedy of errors played out in 2008-10, when no one seemed to agree on how much prices had fallen. One major investment bank stated that prices had not only fallen by more than 80 per cent, but had another 50 per cent further to go! All of this has contributed to the contagious cynicism narrative that has permeated the ecosystem for much of 2019.

This was an essential part of the Luxembourg conference, where contributors addressed index methodology and its impact not only on sentiment, valuation and pricing, but also on decision-making by institutions where market-based indicators were at significant variance with what was transpiring on the ground.

Weed out the noise

The issues to dissect are manifold. First and foremost, in the age of big data, there is the possibility of significant manipulation. This occurs at every level, from the agent all the way up to data providers and developers.

The noise that clutters up the system adds to the angst that is then expressed in social media forums. Data integrity is of the essence, and the only way it can be ensured is through active government oversight, so that data providers do not get away with fancy techniques that amount to little more than jargon.

Big data implies, by definition, that the data needs to be "cleansed", and that the data sets generated by private-sector providers can be complemented by government sources to enhance the overall level of decision-making rather than contaminate the process. What do we mean by this?

When we look at data streams, there should be no errors of double counting, no gaps filled in with advertising data, and there should be an adjustment mechanism for build-quality enhancements over time. In the final analysis, the granularity of data streams in a market like Dubai rests with the Land Department; it is this authority that is the final repository of data on inventory, supply, transactions and price action.
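
A minimal sketch of those three checks is below, assuming hypothetical field names on a transaction feed; a production pipeline would validate against the Land Department's registry rather than these toy rules.

```python
# Hypothetical sanity checks on a transaction feed. Field names are assumed.
import pandas as pd

def validate_feed(df: pd.DataFrame) -> pd.DataFrame:
    # 1. No double counting: one registered transaction per (unit, date).
    df = df.drop_duplicates(subset=["unit_id", "transaction_date"])
    # 2. No gaps back-filled with advertising data: registered deals only.
    df = df[df["source"] == "registered"]
    # 3. Adjust for build quality over time: deflate by a (hypothetical)
    #    quality factor so newer, better-finished stock stays comparable.
    return df.assign(adj_price=df["price"] / df["quality_factor"])

feed = pd.DataFrame({
    "unit_id":          ["U1", "U1", "U2", "U3"],
    "transaction_date": ["2019-03-01", "2019-03-01", "2019-04-02", "2019-05-09"],
    "source":           ["registered", "registered", "listing", "registered"],
    "price":            [1_000_000, 1_000_000, 900_000, 1_500_000],
    "quality_factor":   [1.00, 1.00, 1.05, 1.10],
})
print(validate_feed(feed))   # the U1 duplicate and the U2 listing are dropped
```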

Filtered

This was the overarching conclusion at the conference, where both developing and developed countries reported how their data streams improved significantly once they had passed through the oversight of government bodies. This may not be completely feasible in all cases, but when it comes to real estate in Dubai, the pioneering role played by the Land Department and RERA, in terms of both regulation and transparency, should now extend to the realm of big data. The debate over price volatility, demand dynamics and the supply pipeline could then be resolved within a margin of error that is statistically palatable.

Real estate data measurement has always been more difficult than that of commoditized products and services, such as equity capital markets. However, being more challenging does not mean the task should be avoided, or worse, left without oversight, which leads to a wide dispersion of data streams, none of which share the same underlying methodology or the same set of assumptions.

In Dubai, the stage is set for real estate data platforms to catapult themselves ahead of most developed markets within a very short period of time, given the central repository that already exists. As was concluded in Luxembourg, the way forward is not only increased interaction with other cities and data providers, but also making these interactions public, so that data providers and analytics can rise to a common standard.

Sameer Lakhani is Managing Director at Global Capital Partners.