Artificial Intelligence is revolutionizing countless industries, but its exponential growth brings a critical challenge: high power consumption. AI has a profound impact on global energy demand, both while models are being trained and while they are running, and the resulting electricity use drives significant increases in CO2 emissions that contribute to climate change. Google, for example, has reported that its emissions grew by 48% between 2019 and 2023, a rise it attributes largely to AI and its generative models.
If this trend continues, AI could consume as much electricity annually as the entire country of Argentina by 2027. But why does AI consume so many resources? Several factors contribute to this, including the complexity and size of the models, the vast amounts of data required for training, and the computational demands of training and execution.
The training phase of AI systems is the most energy-intensive. For instance, GPT-4, one of OpenAI’s most advanced models, reportedly consumed 400 MWh during training, equivalent to the annual energy consumption of 40 Spanish households and far beyond that of previous versions. ChatGPT’s annual consumption, in turn, is estimated at 226.82 million kWh, enough, for example, for 37,800 electric vehicles to each drive 15,000 kilometers a year.
In light of this issue, quantum technologies emerge as a possible solution to optimize power consumption and make AI more efficient and sustainable.
Quantum computing, quantum sensing, and quantum cryptography are technologies that exploit the principles of quantum mechanics to build systems with capabilities far beyond those of their classical counterparts.
Quantum computers are one of the most revolutionary applications of this technology. Unlike traditional computers, which store and process information as binary code (values of 0 and 1), quantum devices use qubits, or quantum bits. What makes qubits special is their ability to exist in a superposition of states: they can represent multiple values simultaneously instead of being limited to just one. It is like flipping a coin that, instead of landing on heads or tails, remains in an indefinite state, as if it never landed.
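To make the idea concrete, here is a minimal sketch using the open-source Qiskit library and its Aer simulator (illustrative tools, not ones prescribed by this article): a single Hadamard gate puts a qubit into an equal superposition, so repeated measurements land on 0 and 1 about half the time each.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

# One qubit, one classical bit to record the measurement
qc = QuantumCircuit(1, 1)
qc.h(0)            # Hadamard gate: |0> becomes (|0> + |1>)/sqrt(2)
qc.measure(0, 0)   # measuring collapses the superposition to 0 or 1

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)      # roughly {'0': 500, '1': 500} -- the "coin" lands either way
```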
In addition to superposition, entanglement is another key phenomenon. When two or more qubits become entangled, their states become intrinsically connected, regardless of the distance between them. This correlation lets qubits work together far more effectively than classical bits, enabling complex calculations to be completed in less time.
Quantum computing uses quantum gates to manipulate the states of qubits. Because a gate can act on qubits that are in superposition or entangled, a single operation transforms many amplitudes at once, enabling more advanced and efficient mathematical transformations.
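Continuing the same illustrative Qiskit sketch, two gates suffice to entangle a pair of qubits into a Bell state: a Hadamard followed by a CNOT. The two qubits' measurement outcomes are then perfectly correlated, something no pair of independent classical bits can reproduce.

```python
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                      # superpose qubit 0
qc.cx(0, 1)                  # CNOT entangles qubit 1 with qubit 0 (Bell state)
qc.measure([0, 1], [0, 1])

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)                # only '00' and '11' appear: the qubits always agree
```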
But how can this be applied to AI to make it more energy-efficient?
To reduce AI’s energy consumption, the first step is to optimize the efficiency of the models themselves, complemented by the responsible use of renewable energy sources. Quantum computing is on track to become one of the key solutions because of the way it processes data: qubits allow vast amounts of information to be processed with far fewer resources, which translates directly into energy savings. They can perform parallel calculations on a much larger scale than classical processors, accelerating tasks that today require massive amounts of computational resources.
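The scale involved is easy to see in a toy example (again a Qiskit sketch, chosen here only for illustration): the state of n qubits is described by 2^n complex amplitudes, so even 10 qubits in superposition span 1,024 values at once. One caveat worth keeping in mind is that measurement extracts only a limited amount of that information, which is why careful algorithm design matters.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# n qubits in uniform superposition encode 2**n amplitudes simultaneously
n = 10
qc = QuantumCircuit(n)
qc.h(range(n))  # Hadamard on every qubit

state = Statevector.from_instruction(qc)
print(len(state.data))  # 1024 complex amplitudes from just 10 qubits
```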
Quantum models are emerging as a promising route to training AI models with a fraction of the energy required by traditional approaches; some estimates suggest consumption could fall by up to 90% thanks to much more efficient calculations. Quantum algorithms such as the Quantum Approximate Optimization Algorithm (QAOA), the Variational Quantum Eigensolver (VQE), and the more recent Quantum Neural Networks (QNNs) can reduce the number of operations needed to optimize AI models.
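QAOA, VQE, and most QNN proposals share the same hybrid pattern: a small parametrized quantum circuit is evaluated on quantum hardware while a classical optimizer tunes its parameters. The sketch below, assuming Qiskit, its Aer simulator, and SciPy's COBYLA optimizer (illustrative choices, not specified in the original), shows that loop on a single trainable rotation.

```python
from qiskit import QuantumCircuit
from qiskit.circuit import Parameter
from qiskit_aer import AerSimulator
from scipy.optimize import minimize

theta = Parameter("theta")
qc = QuantumCircuit(1, 1)
qc.ry(theta, 0)    # trainable rotation: the "weight" of this tiny quantum model
qc.measure(0, 0)

sim = AerSimulator()

def cost(params):
    # Run the circuit with the current parameter and estimate P(measuring 1);
    # the optimizer minimizes the negative, i.e. maximizes that probability.
    bound = qc.assign_parameters({theta: params[0]})
    counts = sim.run(bound, shots=1000).result().get_counts()
    return -counts.get("1", 0) / 1000

result = minimize(cost, x0=[0.1], method="COBYLA")
print(result.x)    # converges near theta = pi, where the qubit always reads 1
```

In QAOA or VQE the cost function would instead encode an optimization problem or a molecular Hamiltonian, but the division of labor, quantum evaluation plus classical parameter updates, is the same.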
For certain well-suited problems, a quantum computer can deliver in seconds results that would take classical machines vastly longer. That level of efficiency significantly reduces the number of computational cycles required, lowering the energy consumption and carbon footprint of data centers.
This could lead to a radical reduction in the computational power required to train and run complex AI models. Quantum computing can significantly enhance the performance of AI neural networks in tasks such as natural language processing and image analysis. Moreover, recent research has shown that, when run on quantum processors, neural networks require less data to generalize and achieve the same level of performance as classical models. This means that training quantum neural networks could become efficient not only in terms of energy (computation) but also in terms of data usage.
Despite these promising advantages, quantum computing still faces significant challenges and limitations: qubits are fragile and lose coherence quickly, error rates remain high, error correction requires many physical qubits per logical qubit, and current devices offer limited qubit counts while demanding specialized cryogenic infrastructure.
While the integration of quantum computing into AI still has a long road ahead, its potential is undeniable. Not only could it enhance the accuracy and speed of AI models, but it could also drastically reduce the energy consumption of current systems.
ARQUIMEA Research Center has a dedicated quantum technologies division, which includes, for instance, a research project focused on advancing the scientific state of the art in the formulation of frugal deep neural networks on quantum processors.
Our commitment is to support the development of these emerging technologies by laying the groundwork for their future integration into real-world applications. We believe that as quantum computing progresses, it will unlock new possibilities for AI and other scientific fields—transforming the way we tackle complex global challenges.
Moreover, all ARQUIMEA Research Center projects are part of the QCIRCLE initiative, funded by the European Union with the goal of establishing a center of scientific excellence in Spain.