Green computing: Why efficient IT infrastructures are becoming a competitive advantage in the age of AI
Generative AI is causing a steep increase in computing load worldwide – and hence also in electricity consumption. According to the International Energy Agency (IEA), data centers consumed around 415 TWh of electricity in 2024. By 2030, that figure could more than double, to just under 945 TWh. That turns energy efficiency into a strategic factor for all companies that want to offer or purchase AI services on a large scale.
How are stricter EU regulations changing the architecture and energy efficiency of modern data centers?
Alongside the technical growth, the regulatory framework is also being tightened. With the Energy Efficiency Directive and the Corporate Sustainability Reporting Directive (CSRD), the EU is increasing transparency and efficiency requirements for data centers. Operators must provide detailed data on energy consumption, emissions, and efficiency indicators. In Germany, the Energy Efficiency Act stipulates a maximum Power Usage Effectiveness (PUE) of 1.2 for new, large data centers from 2026, as well as a high proportion of renewable energies and the use of waste heat. That turns energy efficiency into an important criterion for approvals and site selection. Anyone wanting to build or expand capacity must demonstrate that electricity and resource consumption will remain manageable in the long term.
Despite the growing pressure, efficiency gains in many existing infrastructures remain limited. According to the Uptime Institute Global Data Center Survey, the average PUE has hovered between 1.55 and 1.6 since 2020 and stood at around 1.56 in 2024. The great efficiency leaps of the 2000s and early 2010s have largely been realized. At the same time, IT power requirements have risen: GPU-based AI servers draw power in the kilowatt range, and power densities of 30 to 50 kW per rack are becoming increasingly common in new builds. Conventional data center architectures are reaching their limits in both technical and economic terms. Additional efficiency potential can be tapped if the IT load, power supply, power electronics, and cooling are regarded as a complete system.
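The PUE figures above follow from a simple ratio: total facility energy divided by the energy delivered to the IT equipment, with 1.0 as the theoretical ideal. A minimal sketch, with illustrative figures rather than measured values:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy (ideal = 1.0)."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,560 kWh overall for 1,000 kWh of IT load
# sits at the roughly 1.56 industry average reported for 2024:
current = pue(1560, 1000)    # 1.56
target_new_build = 1.2       # German Energy Efficiency Act cap for new, large sites from 2026

print(f"PUE: {current:.2f}, meets 2026 new-build cap: {current <= target_new_build}")
```

Everything above the IT load in the numerator is overhead (cooling, power conversion, lighting), which is why the rest of the article focuses on exactly those subsystems.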
What role do highly efficient power supply units, SiC/GaN power electronics, and 48 V architectures play in modern data centers?
This puts the focus on power supply units, power electronics, voltage transformers, storage and cooling systems, and sensor technology. Companies that align their infrastructure to performance per watt make better use of limited space, reduce the load on network connections, and remain competitive even in regions where available power is becoming a bottleneck.
Highly efficient power supply units reduce losses in the power path. Power electronics based on SiC and GaN components allow higher switching frequencies and lower losses, while 48 V architectures and low-loss DC distribution units reduce the number of conversion stages. High-current connectors and suitable PCB designs determine how reliably high currents are routed within the rack.
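Why fewer conversion stages matter can be seen from the arithmetic: the end-to-end efficiency of a power chain is the product of its per-stage efficiencies, so dropping a stage raises the total. The stage counts and efficiency values below are assumptions for illustration, not vendor data:

```python
from math import prod

# Illustrative per-stage efficiencies, not measured values:
legacy_12v = [0.96, 0.94, 0.92]  # e.g. PSU, intermediate bus converter, point-of-load
direct_48v = [0.97, 0.94]        # e.g. PSU to 48 V bus, single step-down at the load

def chain_efficiency(stages: list[float]) -> float:
    """End-to-end efficiency is the product of the stage efficiencies."""
    return prod(stages)

print(f"12 V chain: {chain_efficiency(legacy_12v):.1%}")  # ~83.0%
print(f"48 V chain: {chain_efficiency(direct_48v):.1%}")  # ~91.2%
```

At rack powers of 30 to 50 kW, a difference of several percentage points translates directly into kilowatts of waste heat that the cooling system no longer has to remove.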
How do liquid cooling, sensor technology and AI-supported control improve data center efficiency?
While air-based cooling reaches its efficiency limits in very dense racks, direct-to-chip and immersion cooling are becoming more important. Liquid-based systems can improve PUE and reduce direct water consumption, especially compared to evaporative cooling systems. In regions where water is scarce, that makes the choice of cooling system a decisive factor in increasing acceptance.
Sensors, instrument transformers and telemetry ICs provide the database for identifying efficiency potential in operations. AI processes can help manage data centers more efficiently, for example, through load forecasts, adaptive cooling, or optimized workload distribution. Components that provide accurate measurements contribute directly to improving efficiency.
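The control loop described above can be sketched in a few lines: forecast the next interval's IT load from sensor telemetry and derive a cooling setpoint from the forecast. Real deployments use far richer models; all names, capacities, and thresholds here are illustrative assumptions:

```python
def forecast_load(history_kw: list[float], window: int = 3) -> float:
    """Moving-average forecast of the next interval's IT load in kW."""
    recent = history_kw[-window:]
    return sum(recent) / len(recent)

def cooling_setpoint(forecast_kw: float, capacity_kw: float = 50.0,
                     min_pct: float = 30.0) -> float:
    """Scale cooling output with the forecast load, never below a safe floor."""
    utilization = min(forecast_kw / capacity_kw, 1.0)
    return max(min_pct, utilization * 100.0)

telemetry = [32.0, 35.0, 38.0, 41.0]  # kW per rack, from power sensors
load = forecast_load(telemetry)       # average of the last 3 samples
print(f"forecast: {load:.1f} kW, cooling at {cooling_setpoint(load):.0f}%")
```

The point of the sketch is the dependency chain: without accurate sensor data, the forecast and hence the cooling setpoint are wrong, which is why measurement components contribute directly to efficiency.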
How will green computing become an economic success factor for digital infrastructures?
Green computing is thus becoming a business-relevant management tool. An energy-efficient infrastructure reduces operating costs, improves ESG metrics, and increases the chances of success in tenders, especially where customers demand concrete carbon reduction planning and reliable energy data. Investors prefer projects with predictable energy requirements and a clear decarbonization strategy.
Component manufacturers can position themselves as enablers of the AI economy if they quantify efficiency gains transparently and demonstrate them over the life cycle. System integrators bundle efficient components into complete solutions and provide operators with concepts for high power densities. It is crucial that components not only deliver performance data but also make their contribution to metrics such as PUE, OPEX, and carbon footprint traceable. Green computing then becomes a measurable part of the overall profitability of digital services.
What does green computing mean for the electronics industry?
Companies that anchor green computing as a core competence treat energy and resource efficiency as a key criterion throughout, from component selection and power supply to cooling and the use of waste heat. That creates new opportunities for the electronics industry. Anyone who can supply power supply units, circuit boards, and memory or cooling systems with demonstrably better performance per watt and per unit area improves their position in ESG-driven tenders. Efficient solutions can be developed and rolled out in the data center more quickly when component manufacturers, system integrators, and operators work together from an early stage. That makes energy-efficient IT infrastructures the basis for reliable growth in digital business and a real competitive advantage in the age of AI.