Data Management is a top challenge in the AI revolution

Photo by Tara Winstead

According to a global study conducted by S&P Global Market Intelligence and commissioned by WEKA, the adoption of artificial intelligence (AI) by enterprises and research organizations seeking to create new value propositions is accelerating, but data infrastructure and AI sustainability challenges present barriers to implementing it successfully at scale.

These challenges have been exacerbated by the rapid onset of generative AI, which has defined the evolution of the AI market in 2023.

These findings were published as part of S&P Global’s new 2023 Global Trends in AI report. They are based on a sweeping global survey, conducted by S&P Global, of more than 1,500 AI practitioners and decision-makers at medium to large enterprise and research organizations across APAC, EMEA and North America – one of the largest of its kind to date.

The study identifies the opportunities and obstacles organizations have encountered in their AI journeys and the unique motivators and value drivers spurring global AI adoption across industries, and it provides insight into the steps organizations will need to take to succeed with AI in the future.

“The meteoric rise of data and performance-intensive workloads like generative AI is forcing a complete rethink of how data is stored, managed and processed. Organizations everywhere now have to build and scale their data architectures with this in mind over the long term,” said Nick Patience, senior research analyst at 451 Research, part of S&P Global Market Intelligence.

“Although it is still the early days of the AI revolution, one of the overarching takeaways from our 2023 Global Trends in AI study is that data infrastructure will be a deciding factor in which organizations emerge as AI leaders. 

“Having a modern data stack that efficiently and sustainably supports AI workloads and hybrid cloud deployments is critical to achieving enterprise scale and value creation.”

Key findings from the study include:

AI Adoption and Use Cases Are Accelerating, But Enterprise Scale Remains Elusive

  • 69% of survey respondents reported having at least one AI project in production.
  • Only 28% say they have reached enterprise scale, with AI projects being widely implemented and driving significant business value.
  • AI has shifted from simply being a cost-saving lever to a revenue driver, with 69% of respondents now using AI/ML to create new revenue streams.

Data Management Is the Top Technical Inhibitor to AI Adoption

  • The most frequently cited technological inhibitor to AI/ML deployments is data management (32%), outweighing challenges with security (26%) and compute performance (20%) – evidence that many organizations’ current data architectures are unfit to support the AI revolution.

Enterprise AI Use Cases Are Shifting From Cost-Savings to Topline Growth

  • 69% of respondents said their AI/ML projects focus on developing new revenue drivers and creating value, versus 31% whose projects remain focused on cost reduction.

As AI Initiatives Mature, a Hybrid Approach and Multiple Deployment Locations Are Needed to Support Workload Demands

  • AI/ML workloads are being deployed in a variety of locations, from the public cloud to enterprise data centers and, increasingly, edge sites. Respondents running AI in production leverage more deployment locations on average (3.2 for training, 2.5 for inference) than those in pilot and proof-of-concept phases (2.9 and 2.3, respectively).
  • The public cloud is the primary deployment location for training AI/ML models (47%) and inferencing (44%).
  • Respondents who run AI/ML in the public cloud are more likely to take a hybrid approach, incorporating more locations for both training (4.2, on average) and inference (3.2) than those who do not use the public cloud (2.2 and 1.9, respectively).

AI’s Energy and Carbon Footprint Are Straining Corporate Sustainability Goals, But the Cloud Presents a Path to Improvement

  • 68% of respondents indicated they were concerned about the impact of AI/ML on their organization’s energy use and carbon footprint.
  • 74% of respondents said sustainability is an important or critical motivator for moving more workloads to the public cloud.

Aging Data Infrastructures and Legacy Architectures Directly Impact AI’s Sustainability Performance 

  • 77% of respondents said their data architectures directly impact their sustainability performance.  

Organizations Must Get Their Data and Infrastructure ‘Houses in Order’ to Lead with AI

  • Companies that leverage a modern data architecture to overcome significant data challenges (diverse sources, types, requirements, etc.) can accommodate AI workloads operating across multiple infrastructure venues.

“This expansive study from S&P Global validates what WEKA has heard repeatedly from our customers: traditional data infrastructures are having a direct, negative impact on their ability to use AI efficiently and sustainably at scale because they weren’t developed with modern performance-intensive workloads or hybrid cloud and edge modalities in mind,” said Liran Zvibel, cofounder and CEO at WEKA.

“Just as you wouldn’t expect to use battery technologies developed in the 1990s to power a state-of-the-art electric vehicle, like a Tesla, you can’t expect data management approaches designed for last century’s data challenges to support next-generation applications like generative AI.

“Organizations that build a modern data stack designed to support the needs of AI workloads that seamlessly span from edge to core to cloud will emerge as the leaders and disruptors of the future.”