Can federated learning save the world?


12-05-2021
Machine Learning & Artificial Intelligence

Credit: Image via www.vpnsrus.com

Training the artificial intelligence models that underpin web search engines, power smart assistants and enable driverless cars consumes megawatts of energy and generates worrying carbon dioxide emissions. But new ways of training these models are proving to be greener.

Artificial intelligence models are used increasingly widely in today's world. Many carry out natural language processing tasks – such as language translation, predictive text and email spam filtering. They also enable smart assistants such as Siri and Alexa to 'talk' to us, and help operate driverless cars.

But to function well these models have to be trained on large sets of data, a process that includes carrying out many mathematical operations for every piece of data they are fed. And the data sets they are being trained on are getting ever larger: one recent natural language processing model was trained on a data set of 40 billion words.

As a result, the energy consumed by the training process is soaring. Most AI models are trained on specialised hardware in large data centres. According to a recent paper in the journal Science, the total amount of energy consumed by data centres made up about 1% of global energy use over the past decade – roughly the consumption of 18 million US homes. And in 2019, a group of researchers at the University of Massachusetts estimated that training one large AI model used in natural language processing could generate around the same amount of CO2 emissions as five cars would produce over their entire lifetimes.

Concerned by this, researchers in Cambridge's Department of Computer Science and Technology set out to investigate more energy-efficient approaches to training AI models. Working with collaborators at the University of Oxford, University College London, and Avignon Université, they explored the environmental impact of a different form of training – called federated learning – and found that it can have a significantly smaller carbon footprint.

Instead of training models in data centres, federated learning trains them across a large number of individual machines, each working on its own local data. The researchers found that this can lead to lower carbon emissions than traditional centralised training.
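The idea described above can be sketched in a few lines. The following is a minimal, illustrative example of federated averaging on a toy one-dimensional linear model: each client trains a local copy of the weight on its own private data, and a central server only averages the resulting weights. The model, data and function names are assumptions for illustration, not details from the Cambridge study.

```python
# Toy sketch of federated averaging: clients train locally on private
# data; the server only ever sees (and averages) model weights.

def local_update(w, data, lr=0.01, epochs=5):
    """Train a local copy of the weight on one client's private data."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x   # gradient of squared error for y = w*x
            w -= lr * grad
    return w

def federated_round(w_global, clients):
    """One round: every client trains locally; the server averages."""
    local_weights = [local_update(w_global, data) for data in clients]
    return sum(local_weights) / len(local_weights)

# Three clients, each holding points drawn from y = 3x.
# The raw data never leaves its client.
clients = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(0.5, 1.5), (3.0, 9.0)],
    [(1.5, 4.5), (2.5, 7.5)],
]

w = 0.0
for _ in range(20):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward the true weight, 3.0
```

The key point of the design is in `federated_round`: only the trained weights are communicated, never the underlying data, which is why federated learning is attractive both for privacy and for shifting computation away from energy-hungry data centres.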



Reproduced courtesy of the University of Cambridge


The University of Cambridge is acknowledged as one of the world's leading higher education and research institutions. The University was instrumental in the formation of the Cambridge Network, and its Vice-Chancellor, Professor Stephen Toope, is also the President of the Cambridge Network.

University of Cambridge (cam.ac.uk)