The datacenter represents the beating heart of the modern business. It is where information flows in and out of the organisation, ultimately helping to support business functions, create competitive advantage, and drive revenue. And with the emergence of various new trends and innovations, the datacenter has been going through a continuous process of transformation over the past few years.

IT applications and services are critical elements that represent the lifeblood that flows through the datacenter to support the way in which organisations interact with their customers, deliver new products and services, and improve the productivity of their employees. Datacenters are continually being tasked with delivering workloads in the most cost-effective and efficient manner, and organisations must now look to drive greater efficiencies in their operations to provide the agility required to meet future business demand.

So what needs to happen? Well, the traditional manner of provisioning services via hardware and software needs to radically change in order to keep pace with the digital transformation that is rapidly shaping today’s business landscape. Past technologies have introduced increased levels of optimisation within the datacenter, but is this enough? Most corporate datacenters were built on the assumption that technology enhancements and workloads occur at a steady and predictable pace, but that is not the rate of change we are currently experiencing.

Managing expectations

IDC believes that CIOs must also consider whether their current technology platforms are capable of supporting the agility and scale expectations that businesses now demand. And how can the IT department manage these expectations while simultaneously facilitating greater levels of innovation? These are just some of the challenges facing IT executives today. And in order to address these challenges, the datacenter needs to evolve and keep pace with the dynamic trends that are shaping the external market.

New technologies should be implemented not only to reduce complexity and costs, but also to redefine the way in which the IT department provisions services for consumption. The concept of next-generation datacenters has been around for a while, where initially the main tasks were to rationalise, consolidate, and standardise the IT environments. While many organisations are progressing along this path, the next wave of change beyond virtualisation needs to address the provisioning of IT as a service (ITaaS) by moving away from the current silo-based approach.

With each passing year, the competitive environment requires businesses to deliver applications faster and improve productivity, whilst simultaneously reducing costs. The costs associated with provisioning, monitoring, and managing servers have escalated, challenging organisations to seek out new systems and tools that can help them lower the overall cost of IT operations, including — among others — converged systems, hyperscale cloud infrastructure, software-defined computing, OpenStack, modularity, and virtualisation.

Internal and external customers

We currently find ourselves in the midst of widespread digital transformation, with the endgame being the conversion of all business and IT operations into digital processes. This technology evolution is already placing intense demands on CIOs and the IT departments they manage. And adding to this pressure is the way in which end-user expectations for IT services are evolving; internal and external customers alike now expect to be able to access applications and data at anytime from anywhere and on any device.

What all this means is that the performance of business operations is now intrinsically linked to the datacenter and its accompanying server infrastructure. With this in mind, next-generation datacenters are focusing on the services they deliver and how they are delivered. And for all this to be effective, CIOs must develop a comprehensive understanding of how individual business units operate, as well as a clear picture of the end user’s current and future needs. Indeed, IDC believes that such knowledge of individual workload characteristics will play a critical role in deciding which server technologies to adopt, maintain, and/or retire.

Given how essential IT applications and services are to the success of any modern business, the primary goal must now be to maximise application performance and efficiency. Some applications, such as large databases, will run best on a converged system that integrates hardware with software, while other applications will run better in a hosted service provider environment. It is ultimately the CIO’s responsibility to sift through these options and weigh the pros against the cons to make the right decision.

Modern infrastructure

But to help you along the way, IDC advises customers that are deploying next-generation applications with distinctly different operational characteristics to explore containers, software-defined computing, and disaggregated and composable systems. On the other hand, customers with more traditional enterprise application needs may be better served by avoiding disaggregated and composable systems, as those systems are not yet optimised for these workloads.

As businesses battle to maintain a modern infrastructure while keeping their business goals in view, there is often a temptation to chase the latest innovation waves. But without a clear match between innovative new technologies and a useful business outcome, adopting new technology for its own sake is never a good choice. That said, hot new emerging technologies such as containers, OpenStack, and software-defined computing are unlocking new opportunities for IT that can deliver tangible benefits — to the right application.

The columnist is group vice-president and regional managing director for the Middle East, Africa and Turkey at global ICT market intelligence and advisory firm International Data Corporation (IDC). He can be contacted via Twitter @JyotiIDC