AI is rapidly transforming data centers, as the massive computational workloads required to support generative AI, autonomous systems, and other advanced technologies press current facilities to their limits. Data center power demand is expected to reach 35 gigawatts by 2030, up from 17 gigawatts in 2022, according to management consulting firm McKinsey & Company.
AI is fundamentally reshaping the data center landscape, not just in scale but also in purpose, says Vivian Lee, a managing director and partner with Boston Consulting Group. “What used to be infrastructure built to support enterprise IT is now being retooled to meet the massive and growing demands of AI, particularly large language models,” she notes in an email interview.
Rapid Growth
AI is driving major changes in how data centers are designed and built, especially in terms of density, says Graham Merriman, leader of Rogers-O’Brien Construction’s data center projects. “We’re seeing more computing and more power packed into tighter footprints,” he observes in an online discussion. “That shift is also reshaping the supporting infrastructure, particularly cooling.”
AI is accelerating data center industry growth beyond any previous market expectations, says Gordon Bell, a principal at professional services firm Ernst & Young. “This dynamic not only results in higher power, capital, and resource requirements to develop new data centers, but it also changes the ways large data center users approach lease versus buy, market selection, and data center design decisions,” he explains in an online interview. “The need to train large frontier models has driven significant increases in aggregate data center demand, as well as the size of individual hyperscale data center campuses.”
Operational Impact
Bell points out that AI runs on graphics processing units (GPUs), which consume more power than traditional central processing units (CPUs). This shift requires more power, as well as more cooling throughout the data center, he notes. “Traditionally, data centers were air-cooled, but the market is shifting toward liquid-cooling technologies given the increased power density of AI workloads.”
AI won’t increase data center staff size, but it will change the maintenance playbook, Merriman says. “With advanced cooling systems come more specialized maintenance requirements,” he explains. “The industry is also adjusting to new protocols around liquid cooling and environmental controls that are more sensitive to performance fluctuations.”
Traditional data centers will face significant challenges in adapting to AI-powered operations and supporting AI-driven workloads, predicts Steve Carlini, chief data center and AI advocate at digital automation and energy management firm Schneider Electric. “Many legacy facilities weren’t designed to support the high-power densities and cooling requirements needed for AI applications,” he observes in an email interview. Carlini notes that modernization efforts (such as upgrading the electrical infrastructure, deploying liquid cooling, and enhancing energy efficiency), while costly, can extend the lifespan of older data centers. “Those unable to adapt may struggle to remain viable in a rapidly evolving, AI-dominated landscape.”
Operations are also being challenged by supply chain constraints, Lee says. “Critical components like transformers, cooling systems, and backup generators now have lead times measured in years rather than months,” she explains. “In response, operators are shifting to bulk procurement strategies and centralized logistics to keep project timelines on track.”
Cost Impact
AI workloads require significantly more electricity, so operating costs will go up, Merriman says. “To manage these challenges, facilities are moving toward closed-loop cooling systems that help reduce water usage and improve thermal efficiency.”
While investing in AI-capable data centers will be costly, it also has the potential to significantly reduce operating expenses, says David Hunt, senior director of development operations at credit reporting firm TransUnion. “AI optimizes energy consumption, reduces cooling expenses, and minimizes the need for manual intervention, leading to lower operational costs,” he observes in an online interview. “However, the increased power demand for AI workloads can also drive up energy costs.”
Carlini notes that AI-driven workloads are expected to more than triple by 2030. “Strategic investments in AI-ready infrastructure, energy efficiency, and collaboration between industry leaders and policymakers will be essential for building a resilient, high-performance data center ecosystem capable of supporting AI’s continued growth.”
Final Thoughts
AI will continue driving record-setting levels of data center development over the next several years, Bell predicts. “At the same time, GPU manufacturers have announced product roadmaps that include even more power-hungry chips,” he says. “These dynamics will continue to shape industry growth.”
Integrating AI into data centers isn’t just about technology; it’s also about strategic planning and investment, Hunt says. “Organizations need to consider the long-term benefits and challenges of AI adoption, including the environmental impact and the need for skilled personnel to manage these advanced systems consistent with internal governance requirements,” he states. “Collaboration between AI developers, data center operators, and policymakers will be crucial in shaping the future of data centers.”