The latest figures from the Central Statistics Office showed that data centres now account for 21% of electricity consumed in Ireland, overtaking all domestic residences combined. This level of consumption places a strain on the grid, increases the likelihood of power shortages, and puts Ireland's climate targets in jeopardy.

The issue is not unique to Ireland. John Pettigrew, chief executive of the UK's National Grid, recently noted that "future growth in foundational technologies like artificial intelligence will mean larger scale, energy-intensive computing infrastructure".

There is ever-increasing demand for mobile and wifi connectivity: ComReg recently forecast that mobile data traffic will grow by an average of 16.5% a year up to 2028.

One of the main drivers of data centre power consumption is artificial intelligence. Data centres' share of electricity consumption in Ireland is predicted to rise to 30%, driven by the computing requirements of AI.

Main limitation

When the idea of the information superhighway (or internet) was first introduced to the public on the BBC TV show Tomorrow's World in 1994, the presenter identified the electronic cables used for information transfer as the main limitation.

She mentioned that it was not possible to view video on the internet at that point in time, and that the pages she showed viewers had to be loaded some time in advance of filming.

She then explained how optical communications would be the solution to unleash the full potential of the internet. Users, the presenter said, "could transform [their homes] into a mammoth interactive entertainment centre".

Sound familiar? Thirty years later, the internet has completely altered the way society interacts and works, thanks in no small part to optical technology. As we stand on the cusp of the AI revolution, it will once again be optics and photonics that allow AI to reach the level of maturity needed to significantly enhance the way we work and live.

As discussions about the pros and cons of artificial intelligence continue, there is no doubt that it is becoming more prevalent in our daily lives. It has the potential to revolutionise society in the coming decades, just as the internet has done since the 1990s.

For most people, AI is essentially a software tool, but the advanced capabilities delivered by the latest tools such as ChatGPT, Google Gemini or Microsoft's Copilot rely on sophisticated hardware infrastructure. This consists of massive supercomputers, equivalent to medium-scale data centres, which can transfer and process huge amounts of information to train the AI models.

Taking ChatGPT as an example, OpenAI used a supercomputer with 25,000 Nvidia graphics processing units (GPUs) and hundreds of gigabits per second of network connectivity for each GPU server to train the model.

The supercomputer was built by Microsoft specifically for OpenAI's training needs using powerful Nvidia processors, which can handle vast numbers of mathematical calculations in parallel.

It took several months, 50 GWh of energy and about $100m to train GPT-4, with the training time and energy consumption highly dependent on how information is transferred between the GPUs and switches within the supercomputer.
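
As a rough sanity check on those numbers, a back-of-envelope calculation shows how a cluster of that size reaches tens of gigawatt-hours. This is a sketch only: the ~1 kW drawn per GPU (including its share of server, networking and cooling overhead) and the 90-day window are illustrative assumptions, not figures from OpenAI or Microsoft.

```python
# Back-of-envelope estimate of training energy for a large AI cluster.
# Assumed figures (not from OpenAI): ~1 kW per GPU including server,
# networking and cooling overhead, running for ~90 days.

NUM_GPUS = 25_000          # GPUs reported for the training cluster
POWER_PER_GPU_KW = 1.0     # assumed draw per GPU incl. overheads (kW)
TRAINING_DAYS = 90         # assumed "several months" of training

hours = TRAINING_DAYS * 24
energy_kwh = NUM_GPUS * POWER_PER_GPU_KW * hours
energy_gwh = energy_kwh / 1e6  # 1 GWh = 1,000,000 kWh

print(f"Estimated training energy: {energy_gwh:.0f} GWh")
# -> Estimated training energy: 54 GWh, the same order as the ~50 GWh cited
```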

To train new and more complex AI models in less time and with less energy, Nvidia and other industry players are moving from standard electrical interconnects between GPUs to optical interconnects, which can simultaneously reduce energy consumption and increase capacity.

In simple terms, we can transfer orders of magnitude more information with much lower loss, and thus energy consumption, by using light (photons) instead of electricity (electrons).
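
To put rough numbers on that comparison, the sketch below estimates the energy needed to move one exabyte of data across a single link. The energy-per-bit figures (10 pJ/bit electrical, 1 pJ/bit optical) are illustrative assumptions chosen for this example, not measurements from any specific product or from this research.

```python
# Illustrative comparison of link energy: electrical vs optical interconnect.
# The pJ/bit figures below are assumptions for illustration only.

PJ_PER_BIT_ELECTRICAL = 10.0  # assumed high-speed electrical link
PJ_PER_BIT_OPTICAL = 1.0      # assumed optical link

def link_energy_kwh(bytes_moved: float, pj_per_bit: float) -> float:
    """Energy in kWh to move `bytes_moved` bytes at `pj_per_bit`."""
    bits = bytes_moved * 8
    joules = bits * pj_per_bit * 1e-12  # 1 pJ = 1e-12 J
    return joules / 3.6e6               # 1 kWh = 3.6e6 J

exabyte = 1e18  # bytes
for name, pj in [("electrical", PJ_PER_BIT_ELECTRICAL),
                 ("optical", PJ_PER_BIT_OPTICAL)]:
    print(f"{name}: {link_energy_kwh(exabyte, pj):.1f} kWh per EB")
# -> electrical: 22.2 kWh per EB; optical: 2.2 kWh per EB (10x less here)
```

AI training shuttles data between GPUs continuously for months, so even modest per-bit savings compound into large reductions at data centre scale.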

The key component for these optical interconnects is the laser diode which generates the optical information signals. It is anticipated that one billion lasers per year could be required by the end of the decade to support the demand for optical interconnects within supercomputers and data centres required for AI.

DCU researchers are collaborating with commercial and academic partners in Ireland and the EU on ground-breaking laser technology to increase the capacity while reducing the energy consumption of data centres used for training new AI models.

Photonic technologies

The team is aiming to create a commercially viable optical interconnect solution within five years, which could increase the energy efficiency of a data centre by 70 to 80%, even with partial replacement of electrical interconnects with photonic technologies.

Recent tests have shown that the new technology can transfer data with minimal loss over distances of up to 10km, achieving data rates of 1 terabit per second from a single laser source, demonstrating the potential to link different data centres as well as the GPUs within a single facility. In contrast, electrical interconnects can only support such high-speed data transfer over a few centimetres, due to high loss.
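
The contrast in reach comes down to attenuation, and a simple loss calculation illustrates it. In the sketch below, the ~0.2 dB/km figure is typical of standard single-mode optical fibre, while the ~10 dB/m copper figure is an illustrative assumption for electrical links at very high signalling rates (real values vary widely with cable and frequency).

```python
# Signal loss over distance: optical fibre vs high-speed electrical copper.
# Fibre attenuation ~0.2 dB/km is typical of single-mode fibre; the
# ~10 dB/m copper figure is an illustrative assumption for multi-GHz links.

FIBRE_DB_PER_KM = 0.2
COPPER_DB_PER_M = 10.0  # assumed; varies widely with cable and frequency

def remaining_fraction(loss_db: float) -> float:
    """Fraction of signal power left after `loss_db` of attenuation."""
    return 10 ** (-loss_db / 10)

# Fibre over 10 km: 2 dB total loss, ~63% of the power still arrives.
print(f"fibre, 10 km: {remaining_fraction(FIBRE_DB_PER_KM * 10):.0%}")

# Copper over 10 cm: 1 dB loss (~79% left); over 1 m: 10 dB (10% left).
print(f"copper, 0.1 m: {remaining_fraction(COPPER_DB_PER_M * 0.1):.0%}")
print(f"copper, 1 m:   {remaining_fraction(COPPER_DB_PER_M * 1.0):.0%}")
```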

As part of these research programmes, we are developing innovative optical interconnects employing novel laser technologies to increase the energy efficiency, reliability and capacity of the links used in data centres.

The reliability of the optical technology is vital, as the optical sources tend to be the most vulnerable of all the semiconductor components used in such systems.

For data centre networks handling AI applications, the reliability requirements are noticeably higher than for standard data centres, as training runs should not be interrupted by a single component failure.

These small yet fundamental parts of the infrastructure could be the key to the next generation of digital innovations, AI included. From the basic use of AI to draft text on a specific topic or generate video content, to more complex tasks such as developing new drug formulas or self-driving cars, AI has the potential to revolutionise society.

The ever-growing list of its applications is dizzying – so much so that it can be easy to lose sight of the photonic innovations which will provide the foundation for these advances.

Author: , DCU. This article first appeared on RTÉ's Brainstorm.