It’s widely known that government and academic researchers have relied on high-performance computing (HPC) for decades to solve scientific and engineering problems needing “big compute” and/or “big data” capabilities.

It’s no secret, either, that industrial firms of all sizes have adopted HPC to speed up the development of products ranging from cars and planes to golf clubs and potato chips. But lately, something new is happening. Leading commercial companies in various sectors have begun turning to HPC technologies for mission-critical, big data analytics workloads that today’s enterprise IT technology simply can’t handle well enough on its own.

This isn’t totally surprising when you realise that some of the key technologies underpinning business analytics and business intelligence originated in the world of HPC. But why exactly are a growing number of enterprises turning to HPC for their big data analytics needs? Who were the pioneers of this trend? And what does the future hold in terms of more widespread use?

The primary factors driving businesses to adopt HPC for big data analytics purposes fall into three categories. The first of these is ‘high complexity’ because, simply put, HPC technology enables companies to fire more complex, intelligent questions at their data infrastructures. This can provide important advantages in today’s increasingly competitive markets, and it is especially useful when there is a need to go beyond query-driven searches to discover unknown patterns and relationships in data, such as detecting fraud, revealing hidden commonalities within millions of archived medical records, or tracking buying behaviours through wide networks of relatives and acquaintances.
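
To make that distinction concrete, the short Python sketch below contrasts a conventional query-driven search with open-ended pattern discovery. The file name, column names, and clustering parameters are purely illustrative assumptions, not any particular vendor’s implementation.

    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import DBSCAN

    # Hypothetical transaction history (file and column names are assumptions).
    transactions = pd.read_csv("transactions.csv")

    # Query-driven search: we must already know what we are looking for.
    flagged = transactions[(transactions["amount"] > 10_000)
                           & (transactions["country"] != transactions["home_country"])]

    # Pattern discovery: let a clustering algorithm surface groupings nobody
    # thought to ask about. Points labelled -1 fall outside every dense cluster
    # and are candidate anomalies for a fraud analyst to review.
    features = StandardScaler().fit_transform(
        transactions[["amount", "hour_of_day", "transactions_per_day"]])
    anomalies = transactions[DBSCAN(eps=0.5, min_samples=10).fit_predict(features) == -1]

Run at enterprise scale, across billions of rows rather than one small file, this second style of analysis is exactly the kind of workload that pushes organisations towards HPC.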

The second key driver is ‘high time criticality’ because information that is not available quickly enough may be of little value. The weather report for tomorrow is useless if it’s not available until next week, just as fraud detection is not much use if it comes after the consumer has already been charged. The move to high-performance data analysis (HPDA) speeds up this process, and for financial services companies engaged in high-frequency trading, HPC technology enables proprietary algorithms to exploit market movements in minute fractions of a second, before the opportunities disappear.

The final key motivator for the move to HPDA is ‘high variability’. People generally assume that big data is ‘deep’, meaning that it involves large amounts of data. They recognise less often that it may also be ‘wide’, meaning that it can include many variables. For example, a ‘deep’ query might request a prioritised listing of last quarter’s 500 top customers in Europe, whereas a ‘wide’ query might go on to analyse their buying preferences and behaviours in relation to dozens of criteria. An even ‘wider’ analysis might employ graph analytics to identify any fraudulent behaviour within the customer base. HPC enables all of this and more.
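
A minimal sketch of what ‘deep’, ‘wide’, and graph-based analyses might look like in practice is shown below. The table layout, column names, and thresholds are assumptions made purely for illustration.

    import pandas as pd
    import networkx as nx

    orders = pd.read_csv("orders.csv")  # hypothetical order history

    # 'Deep' query: a single ranking over a very large number of rows.
    top_500 = (orders[orders["region"] == "Europe"]
               .groupby("customer_id")["revenue"].sum()
               .nlargest(500))

    # 'Wide' analysis: relate those customers' behaviour across many variables.
    criteria = ["basket_size", "discount_rate", "return_rate", "days_between_orders"]
    profiles = (orders[orders["customer_id"].isin(top_500.index)]
                .groupby("customer_id")[criteria].mean())
    correlations = profiles.corr()  # dozens of criteria rather than one

    # 'Wider' still: graph analytics over shared payment cards to surface
    # clusters of accounts that may belong to a single fraudulent actor.
    g = nx.Graph()
    g.add_edges_from(orders[["customer_id", "card_id"]].itertuples(index=False, name=None))
    suspicious_rings = [ring for ring in nx.connected_components(g) if len(ring) > 20]

The graph step in particular is the kind of irregular, memory-intensive workload that strains conventional enterprise systems, which is why HPC clusters are a natural home for this sort of analysis.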

Looking back through history, the financial services industry was the first commercial market to adopt supercomputers for advanced data analytics. In the 1980s, large investment banks began hiring particle physicists from Los Alamos National Laboratory and the Santa Fe Institute to employ HPC systems for daunting analytics tasks such as optimising portfolios of mortgage-backed securities, pricing exotic financial instruments, and managing firm-wide global risk.

In more modern times, we’ve seen PayPal adopt HPC hardware systems to perform sophisticated fraud detection on eBay and StubHub transactions in real time, before fraud hits consumers’ credit cards. Under the previous enterprise technology regime, this process took up to two weeks. IDC estimates that using HPC to detect fraud has already saved PayPal more than $700 million, and it has also enabled the company to perform predictive fraud analysis. Following this success, PayPal is extending HPC use to affinity marketing and management of the company’s general IT infrastructure.

Looking ahead, the key use cases for HPDA will likely revolve around fraud and anomaly detection, marketing, and business intelligence. PayPal has already pointed the way on the first of these, and we can expect more businesses to follow suit by employing various high-performance analytics techniques in a bid to identify harmful or potentially harmful data patterns in real time.

In terms of marketing, I expect more organisations to start using complex HPC-enabled algorithms to discern potential customers’ demographics, buying preferences, and habits. In relation to business intelligence, HPC will increasingly help businesses to better understand themselves, their competitors, and the evolving dynamics of the markets they participate in.

While the potential use cases are clear, many enterprises continue to be put off by the perceived high costs of taking this leap into the future. However, HPC technology is more affordable and accessible than some might think, with clusters now starting at under $100,000.

The vendor community has gained substantial experience helping first-time users make the most of HPC, and more than 100,000 systems are now sold around the world each year. That figure is certain to rise as more and more enterprises wake up to the fact that, in order to stay competitive, they will need to out-compute their rivals.

The columnist is group vice president and regional managing director for the Middle East, Africa and Turkey at global ICT market intelligence and advisory firm International Data Corporation (IDC). He can be contacted via Twitter @JyotiIDC.