Many technology companies are adopting Artificial Intelligence (AI) to enhance efficiency and streamline their processes for sustainable business growth. Through smart data analytics, predictive maintenance, digital twin technologies, and anomaly detection within energy management systems, AI enables organizations to save both time and costs. As a result, AI has become a top priority and a key investment area for corporations on their digital transformation journey. However, innovation comes at a cost, as AI can be a significant contributor to carbon emissions. This raises a question for technology companies: do the benefits of AI truly outweigh its environmental impact?
AI's Hidden Emissions and the Lack of Corporate Transparency
NEWS
Google’s 2024 sustainability report reveals a 48% increase in Greenhouse Gas (GHG) emissions since 2019 and a 13% increase Year-over-Year (YoY), driven primarily by rising energy consumption at its data centers. Similarly, Microsoft’s 2024 sustainability report shows a 29% growth in emissions since 2020, attributed to the expansion of data centers to support Artificial Intelligence (AI) workloads. Microsoft acknowledged that the infrastructure and electricity required to support AI’s growth are creating additional hurdles to achieving sustainability goals within the tech industry. Cornell University scientists found that training Large Language Models (LLMs) like GPT-3 consumed enough electricity to generate the equivalent of roughly 500 metric tons of carbon emissions. Moreover, these models are continuously retrained on newer data, adding to their carbon footprint. The researchers also noted that the companies responsible for these models are often hesitant to be transparent about the processes involved in training and retraining their LLMs.
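The order of magnitude behind such estimates can be reproduced with a simple back-of-the-envelope calculation: emissions roughly equal training energy multiplied by the carbon intensity of the electricity grid. The sketch below illustrates this; the energy figure (~1,300 MWh for a GPT-3-scale training run) and the grid intensity are illustrative assumptions, not reported values.

```python
# Back-of-the-envelope estimate of training emissions:
#   emissions (t CO2e) = energy (kWh) * grid carbon intensity (kg CO2e/kWh) / 1000
# The inputs below are illustrative assumptions, not reported figures.

TRAINING_ENERGY_KWH = 1_300_000    # assumed ~1,300 MWh for a GPT-3-scale training run
GRID_INTENSITY_KG_PER_KWH = 0.4    # assumed average grid carbon intensity

def training_emissions_tco2e(energy_kwh: float, intensity_kg_per_kwh: float) -> float:
    """Convert training energy into metric tons of CO2-equivalent."""
    return energy_kwh * intensity_kg_per_kwh / 1000.0

if __name__ == "__main__":
    tons = training_emissions_tco2e(TRAINING_ENERGY_KWH, GRID_INTENSITY_KG_PER_KWH)
    print(f"Estimated training footprint: ~{tons:,.0f} t CO2e")  # ~520 t CO2e
```

Under these assumptions, the result lands in the same range as the roughly 500 metric tons cited above; changing the grid mix or hardware efficiency shifts the figure considerably, which is precisely why transparency about training processes matters.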
Assessing the Environmental Impact of AI across Its Lifecycle
IMPACT
Several stages of the AI lifecycle contribute to its environmental impact. The following phases are particularly relevant to carbon emissions (a simple roll-up of their contributions is sketched after the list):
- Data Center Energy Consumption: As AI becomes more sophisticated with Machine Learning (ML) and LLM technologies, the energy needed to train and operate these models grows significantly, driving an increase in data center capacity and workloads and, in turn, higher electricity demand. Much of this additional energy is still sourced from non-renewable fossil fuels (coal, oil, and natural gas). The electricity required to cool and maintain data centers adds to the overall emissions as well.
- Training Models and AI Inference: Training models demands extensive processing power over long periods, and models are continuously retrained to incorporate the newest data, amplifying energy consumption and emissions. Beyond training, running AI models for real-time applications (inference) is energy intensive and generates ever-increasing energy demand.
- Network Infrastructure: Transmitting data to and from AI models requires robust network infrastructure; data must move through various layers of the Internet, including fiber-optic cables, routers, and switches, demanding high-speed, higher-capacity networks. As AI applications are progressively integrated into Internet of Things (IoT) devices that constantly exchange data with central servers over wireless networks, the infrastructure supporting IoT (cellular networks, 5G towers, and Wi-Fi routers) must handle growing volumes of data, further increasing the carbon footprint.
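A minimal sketch of how these lifecycle contributions can be rolled up into a single footprint estimate is shown below. It assumes hypothetical annual energy figures for training, inference, and network transfer, a data center Power Usage Effectiveness (PUE) factor to account for cooling and facility overhead, and a single grid carbon intensity; every value is a placeholder assumption.

```python
# Minimal sketch: roll up lifecycle-phase energy into one carbon estimate.
# Facility overhead (cooling, power distribution) is modeled with a PUE factor;
# all input values are hypothetical placeholders.

PUE = 1.5                            # assumed Power Usage Effectiveness (1.0 = no overhead)
GRID_INTENSITY_KG_PER_KWH = 0.4      # assumed grid carbon intensity (kg CO2e/kWh)

phase_energy_kwh = {                 # assumed IT energy per lifecycle phase, per year
    "training_and_retraining": 250_000,
    "inference": 400_000,
    "network_transfer": 50_000,
}

def phase_emissions_tco2e(it_energy_kwh: float) -> float:
    """IT energy -> facility energy (via PUE) -> metric tons of CO2e."""
    facility_energy_kwh = it_energy_kwh * PUE
    return facility_energy_kwh * GRID_INTENSITY_KG_PER_KWH / 1000.0

total = 0.0
for phase, energy in phase_energy_kwh.items():
    tons = phase_emissions_tco2e(energy)
    total += tons
    print(f"{phase:>24}: {tons:7.1f} t CO2e")
print(f"{'total':>24}: {total:7.1f} t CO2e")
```

The point of such a roll-up is not precision, but visibility: separating training, inference, and network transfer makes it clear where the largest levers for reduction sit.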
Navigating the Duality: Finding the Balance between Innovation and Impact
RECOMMENDATIONS
The use of AI has both environmental costs and benefits, extending across multiple layers that include direct and indirect emissions, as well as electricity consumption. At the same time, any analysis of AI cannot overlook the significant benefits it brings in optimizing energy efficiency through smart analytics and management capabilities. A balance must be struck between leveraging these advantages and addressing the environmental costs of AI’s own energy consumption. Below, ABI Research recommends a few ways ecosystem players can support AI technology growth while reducing carbon emissions:
- Adopting Greener Practices: With data center energy consumption growing, companies should look to transition their data centers to renewable energy sources and implement energy-efficient hardware (see ABI Research’s Building Greener Data Centers to Minimize AI’s Carbon Footprint presentation (PT-3219)). Green data centers are on the rise, with companies like Google Cloud, Intel, and Siemens & Greenergy at the forefront of data center innovation.
- Choose Cloud Servers over Local Servers for Non-Critical AI Workloads: Cloud servers are generally more environmentally friendly because organizations no longer need to maintain on-premises hardware and the electricity-hungry cooling systems that support it. Virtualization also facilitates server consolidation by combining multiple servers onto a single physical host, making better use of otherwise idle hardware and reducing energy waste. Additionally, traditional data centers are often powered by fossil fuel sources, while cloud services are increasingly tapping into renewables (solar, wind, and hydroelectric power). A simple comparison of the two options is sketched after this list.
- Sustainable AI Development: The development and deployment of AI should prioritize sustainable algorithms and models that require less computational power. Research has shown that AI models trained on modern, more efficient hardware produce fewer carbon emissions than those trained on older hardware. Similar gains can be achieved by training on smaller datasets or using less resource-intensive algorithms.
- Regulation and Transparency in AI’s Emissions: Standards should be set for corporations and businesses, especially tech companies heavily investing in AI, to report carbon emissions across the AI lifecycle, including, but not limited to, model training and data center emissions. This practice builds transparency and accountability around companies’ sustainability goals and raises awareness of AI’s impact for better management in the future. Governments and regulatory bodies play a critical role in ensuring the sustainable development and use of AI. Introducing regulations on AI emissions reporting and offering incentives for the use of renewable energy in AI applications can drive the technology sector toward more sustainable practices.
- Cross-Collaboration: Achieving this balance will require collaboration among AI developers, technology corporations, policymakers, governments, and international organizations. The entire ecosystem must work together to address both the environmental benefits and costs of AI to harness its full potential while minimizing its negative environmental impact. For example, ASEAN has released a guide on “AI Governance and Ethics” that focuses on data governance, transparency, and accountability in AI usage. Similar guides could be developed for the sustainable use of AI.
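As one illustration of the cloud recommendation above, the sketch below compares the footprint of running the same non-critical AI workload on an assumed on-premises server versus an assumed cloud region with a better PUE and a lower-carbon grid. Every value is a placeholder assumption chosen only to show the shape of the calculation, not a measured benchmark.

```python
# Illustrative comparison: same non-critical AI workload on an on-premises
# server versus a cloud region with better PUE and a lower-carbon grid.
# All values are placeholder assumptions, not measured figures.

WORKLOAD_IT_ENERGY_KWH = 10_000   # assumed IT energy consumed by the workload

def emissions_kg(it_energy_kwh: float, pue: float, grid_kg_per_kwh: float) -> float:
    """Facility energy (IT energy * PUE) multiplied by grid carbon intensity."""
    return it_energy_kwh * pue * grid_kg_per_kwh

on_prem = emissions_kg(WORKLOAD_IT_ENERGY_KWH, pue=1.8, grid_kg_per_kwh=0.45)
cloud = emissions_kg(WORKLOAD_IT_ENERGY_KWH, pue=1.2, grid_kg_per_kwh=0.20)

print(f"On-premises estimate:  {on_prem:,.0f} kg CO2e")
print(f"Cloud-region estimate: {cloud:,.0f} kg CO2e")
print(f"Estimated reduction:   {100 * (1 - cloud / on_prem):.0f}%")
```

Under these assumptions, the difference comes almost entirely from facility efficiency and grid mix, which is why the choice of where a workload runs can matter as much as how efficiently the model itself is written.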