DataBank executives explore how AI is reshaping IT infrastructure and organizational responsibilities in a rapidly evolving landscape. COO Joe Minarik and CTO Vlad Friedman discuss the shift from hyperscaler reliance to private and co-located data centers as enterprises seek greater control over AI workloads.
Energy consumption emerges as a critical challenge: data centers already consume roughly 4% of U.S. electricity, and projections suggest this could double or triple. AI workloads demand significant power and advanced cooling systems, straining energy grids, particularly in densely populated regions. While renewable energy and small modular nuclear reactors offer future solutions, current deployments remain years away.
“AI isn’t a magic bullet—it’s a powerful tool that requires thoughtful application.”
— Vlad Friedman, CTO at DataBank
The conversation reveals how generative AI is moving model management from data science teams to core infrastructure teams, fundamentally changing IT responsibilities. DataBank’s 70 facilities across 27 U.S. regions provide the essential power and cooling infrastructure supporting these AI deployments.
Looking ahead, Friedman predicts hybrid model architectures in which large language models use directional routing to access only the relevant sub-datasets, enhancing performance without expanding infrastructure requirements. Even with AI delivering productivity gains of hundreds of percent, skilled oversight remains essential to prevent hallucinations and data exposure.
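The routing idea Friedman describes can be sketched in miniature. The following is an illustrative assumption, not DataBank's actual design: a router scores an incoming query against keyword profiles for each sub-dataset and dispatches the query to the best match, so the model only ever touches the slice of data it needs. The sub-dataset names and keywords here are hypothetical.

```python
# Minimal sketch of "directional routing": dispatch a query to the most
# relevant sub-dataset before the model answers, rather than searching
# the entire corpus. Keyword overlap stands in for a real relevance
# model (e.g., embedding similarity) to keep the example self-contained.

SUB_DATASETS = {
    "billing": {"invoice", "payment", "refund"},
    "networking": {"latency", "bandwidth", "router"},
    "cooling": {"hvac", "airflow", "temperature"},
}

def route(query: str) -> str:
    """Return the name of the sub-dataset whose keywords best match the query."""
    words = set(query.lower().split())
    scores = {name: len(words & keywords) for name, keywords in SUB_DATASETS.items()}
    best = max(scores, key=scores.get)
    # Fall back to a general bucket when nothing matches.
    return best if scores[best] > 0 else "general"

print(route("why is my invoice payment late"))   # billing
print(route("high latency on the core router"))  # networking
```

A production router would replace the keyword match with learned routing (embedding similarity or a small classifier), but the control flow, score each sub-dataset, pick the best, fall back on no match, is the same.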