The data and insights featured below come from DataBank’s latest research report, “Accelerating AI: Navigating the Future of Enterprise Infrastructure,” which focused on enterprise AI adoption, ROI, and infrastructure challenges.
In our previous articles, we explored how enterprises are achieving ROI from AI and what’s blocking those that haven’t gotten there yet. Now, in this third article in the series, we’ll examine where companies are actually running their AI workloads – and how that’s about to change dramatically.
The infrastructure choices organizations make for AI workloads have significant implications for performance, cost, security, and compliance. Our survey reveals that while cloud remains important, a major shift toward hybrid and distributed infrastructure is already underway.
Currently, nearly two-thirds (64%) of AI workloads run in either public cloud (49%) or private cloud (15%) environments. Another 15% favor third-party SaaS/web-based platforms like Salesforce.
On-premises and colocation data centers represent a much smaller portion today – just 22% of AI workloads according to our survey (15% on-premises/company-controlled and 7% colocation data centers).
This makes sense. Public cloud and SaaS solutions offer easy starting points for AI adoption, with minimal upfront investment and quick deployment times. However, as AI implementations mature, the limitations of a cloud-only approach become apparent.
Looking ahead five years, 96% of respondents expect their AI infrastructure distribution to change. Only 4% report “no significant changes planned.”
Over half of respondents are planning substantial expansions in physical infrastructure.
Meanwhile, 43% still expect to increase their reliance on cloud for AI workloads, confirming that the future isn’t about choosing between cloud and physical infrastructure. It’s about strategically combining both.
When we asked which factors are most critical in choosing between cloud and colocation for AI workloads, three priorities emerged.
As Philips’ Chief Innovation Officer Shez Partovi noted in the research process: “In a de-globalized world, there is an increasing need to ensure that data is housed and processed in compliance with the specific country or jurisdiction, which is leading to a more decentralized approach.”
Perhaps the most striking finding in our survey is the extent to which AI is driving geographic expansion of infrastructure. Over three-quarters of respondents (76%) expect their infrastructure to expand geographically over the next five years.
Only 11% anticipate consolidation into fewer, but larger, data hubs, while just 13% expect no significant impact on geographic distribution.
This geographic dispersion addresses multiple needs simultaneously. For compliance, it enables organizations to meet data sovereignty requirements by storing and processing data within specific countries or jurisdictions. For performance, it reduces latency for real-time AI applications like autonomous vehicles, smart city sensors, and industrial IoT.
“We operate in every corner of the world and there are obviously sovereign data requirements,” said Chris Bedi, ServiceNow’s Chief Customer Officer. “To make sure we are serving customers in each region, we need to have a geographically dispersed data center strategy.”
Interestingly, while inference is becoming more distributed, AI training is moving in the opposite direction. Nearly three-quarters of respondents (73%) said that “training will be centralized while inference will be more distributed.” This split makes technical sense.
Training requires massive computational power and benefits from centralized, GPU-rich environments. Inference, by contrast, needs to happen close to users and data sources to minimize latency and ensure compliance with local regulations.
Expanding AI infrastructure across multiple regions isn’t without challenges. When we asked about major obstacles to geographic scaling, respondents cited a range of concerns.
According to DataBank CEO Raul Martynek, infrastructure readiness varies significantly by location. “It’s not just access to GPUs that matters. You also need somewhere to put them. Not all data centers can accommodate the power and cooling requirements of these infrastructures, and new facilities take time to deploy. That’s why we developed a Universal Data Hall Design for all our new data center builds. It allows us to configure each data hall more quickly in a new facility for whatever infrastructure need that market might have – from traditional air-cooled configurations capable of 15-30kW per cabinet, to liquid-cooled systems capable of 100-200kW per cabinet.”
The future of AI infrastructure isn’t an either/or proposition between cloud and physical data centers. It’s a strategic hybrid approach that places workloads where they perform best, meet compliance requirements, and deliver optimal cost-efficiency.
Less sensitive workloads leveraging public datasets can reside in public cloud. More sensitive applications requiring stringent security, compliance, or low-latency performance are increasingly deployed in colocation or private data centers—and those deployments are spreading geographically to be closer to the data and users that need them.
In our next article, we’ll explore how AI strategies are maturing through approaches such as blending off-the-shelf applications, custom solutions, and tailored deployment models.
This is the third in a five-part series examining key trends in enterprise AI adoption based on our 2025 AI infrastructure survey. You can read our previous posts on AI ROI findings and what’s blocking AI success, or download the full report, “Accelerating AI: Navigating the Future of Enterprise Infrastructure.”