In the realm of artificial intelligence (AI), discussions often revolve around misinformation and the potential threat to human jobs. However, a Boston University professor, Kate Saenko, is drawing attention to another significant concern—the substantial environmental impact of generative AI tools.
As an AI researcher, Saenko raises concerns about the energy costs of constructing AI models. In an article on The Conversation, she emphasizes, “The more powerful the AI, the more energy it takes.”
While the energy consumption of cryptocurrencies like Bitcoin and Ethereum has garnered extensive debate, the rapid development of AI has not received the same level of scrutiny in terms of its impact on the planet.
Professor Saenko aims to change this narrative. While she acknowledges that data on the carbon footprint of a single generative AI query is limited, she notes that research suggests it consumes four to five times as much energy as a simple search engine query.
A notable 2019 study examined BERT (Bidirectional Encoder Representations from Transformers), a large language model with 110 million parameters. Training the model once on graphics processing units (GPUs) consumed roughly as much energy as a round-trip transcontinental flight for one person. Parameters are the values adjusted during training to reduce errors; they guide the model's predictions, and more of them generally means a more complex, more energy-hungry model.
By comparison, Saenko notes that training OpenAI's GPT-3 model, with its 175 billion parameters, consumed approximately 1,287 megawatt-hours of electricity and generated 552 tons of carbon dioxide, the equivalent of 123 gasoline-powered passenger vehicles driven for one year. Remarkably, all of this energy was expended before any consumers even began using the model.
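To make these figures concrete, here is a minimal back-of-envelope sketch. The 1,287 megawatt-hours and 552 tons of CO2 come from the article itself; the per-vehicle figure of roughly 4.6 metric tons of CO2 per year is an assumed average (a commonly cited EPA estimate) used only to cross-check the 123-vehicle comparison.

```python
# Back-of-envelope check of the GPT-3 training figures reported above.
# Only the 1,287 MWh and 552 t CO2 values come from the article; the
# per-vehicle emissions figure (~4.6 t CO2/year) is an assumption.

TRAINING_ENERGY_MWH = 1_287      # reported electricity use for GPT-3 training
TRAINING_EMISSIONS_T = 552       # reported CO2 emissions, metric tons
CO2_PER_VEHICLE_T = 4.6          # assumed average passenger vehicle, per year

# Implied carbon intensity of the electricity used (grams CO2 per kWh).
intensity_g_per_kwh = TRAINING_EMISSIONS_T * 1_000_000 / (TRAINING_ENERGY_MWH * 1_000)
print(f"Implied grid intensity: {intensity_g_per_kwh:.0f} g CO2/kWh")      # ~429

# Cross-check against the "123 cars for a year" comparison.
vehicle_years = TRAINING_EMISSIONS_T / CO2_PER_VEHICLE_T
print(f"Equivalent passenger vehicles for one year: {vehicle_years:.0f}")  # ~120
```

The implied intensity of roughly 430 grams of CO2 per kilowatt-hour is consistent with a grid that still leans heavily on fossil fuels, which is exactly the point Saenko returns to later.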
With the rising popularity of AI chatbots such as Perplexity AI and OpenAI's ChatGPT, which Microsoft has integrated into Bing, the situation is further exacerbated by the release of mobile applications that make these technologies accessible to an even broader audience.
Fortunately, Saenko highlights a study by Google that proposes various strategies to mitigate the carbon footprint. Employing more efficient model architectures, processors, and environmentally friendly data centers can substantially reduce energy consumption.
While a single large AI model may not single-handedly devastate the environment, Saenko warns that if numerous companies develop slightly different AI bots for various purposes, each catering to millions of customers, cumulative energy usage could become a significant concern.
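To get a feel for the scale Saenko is warning about, here is a rough, purely illustrative sketch. The four-to-five-times multiplier over a search query comes from the article; the energy of a conventional search query (about 0.3 watt-hours, an older figure published by Google) and the number of bots, users, and daily queries are assumptions chosen only to show how quickly the totals add up.

```python
# Illustrative (hypothetical) estimate of cumulative inference energy.
# The 4-5x multiplier comes from the article; everything else below is
# an assumption chosen for illustration, not a measured figure.

SEARCH_QUERY_WH = 0.3            # assumed energy of one conventional search query
AI_MULTIPLIER = 4.5              # article: an AI query uses 4-5x a search query
WH_PER_AI_QUERY = SEARCH_QUERY_WH * AI_MULTIPLIER

num_bots = 10                    # hypothetical: distinct chatbots on the market
users_per_bot = 10_000_000       # hypothetical: active users per bot
queries_per_user_per_day = 5     # hypothetical: daily queries per user

daily_queries = num_bots * users_per_bot * queries_per_user_per_day
daily_mwh = daily_queries * WH_PER_AI_QUERY / 1_000_000
annual_mwh = daily_mwh * 365

print(f"Daily inference energy:  {daily_mwh:,.0f} MWh")    # ~675 MWh/day
print(f"Annual inference energy: {annual_mwh:,.0f} MWh")   # ~246,000 MWh/year
```

Under these assumptions, serving the bots for a single year would consume on the order of two hundred times the electricity used to train GPT-3 in the first place, which is why cumulative usage, not any one model, is the concern.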
Ultimately, Saenko suggests that further research is essential to make generative AI more efficient. Encouragingly, she highlights the potential for AI to run on renewable energy: by scheduling computation for times when green energy is available, or by locating data centers where renewable energy is abundant, emissions can be cut by a factor of 30 to 40 compared with relying on fossil-fuel-dominated grids.
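Applied to the training figure cited earlier, that reduction factor is easy to picture; only the 552-ton baseline and the 30-to-40x range come from the article.

```python
# What the reported 30-40x reduction would mean for the GPT-3-scale figure above.
BASELINE_EMISSIONS_T = 552       # reported CO2 from training GPT-3, metric tons

for factor in (30, 40):
    print(f"With a {factor}x reduction: {BASELINE_EMISSIONS_T / factor:.0f} t CO2")
# ~18 t CO2 at 30x, ~14 t CO2 at 40x
```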
In conclusion, while concerns about misinformation and job displacement due to AI persist, Professor Saenko's emphasis on the environmental impact of generative AI tools raises a critical issue. It calls for increased research and innovative approaches to ensure that AI development aligns with sustainability goals. By doing so, we can harness the potential of AI while minimizing its carbon footprint and paving the way for a greener future.
Source: https://bitcoinworld.co.in/the-environmental-impact-of-artificial-intelligence-a-concern-beyond-misinformation-and-job-threats/