The Shocking Carbon Footprint of AI Models

Credit: datascience.aero

In recent news, NFTs, or non-fungible tokens, have garnered attention for their environmental impact. NFTs are digital assets that represent something in the real world, like a piece of art. This has drawn broader attention to the environmental impact of the internet and of technology in general. AI models, in particular, have been shown to consume vast amounts of energy while being trained for their intended purposes.

How much energy is being used?

During the training of an AI model, a team of researchers led by Emma Strubell at the University of Massachusetts Amherst noticed that the model they were training used exorbitant amounts of energy. For an AI model to work for its intended purpose, it must be trained through various tests depending on its type and goal. In this case, the researchers calculated that the model they trained emitted thirty-nine pounds of carbon dioxide before it was fully trained for its purpose.

The University of Massachusetts Amherst team concluded that training just one large neural network can account for "roughly 626,000 pounds of carbon dioxide" released into the atmosphere, roughly five times the emissions an average car releases over its entire lifetime.
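Strubell's team arrived at numbers like these by combining the power draw of the training hardware, the total training time, and the carbon intensity of the electrical grid. The sketch below shows that style of back-of-the-envelope accounting in Python; every figure in it is an illustrative assumption, not a measurement from the study.

```python
# Back-of-the-envelope estimate of training emissions, in the spirit of
# the UMass Amherst accounting. All numbers are illustrative assumptions.

GPU_POWER_KW = 0.3        # assumed average draw per GPU (300 W)
NUM_GPUS = 8              # assumed size of the training cluster
TRAINING_HOURS = 24 * 30  # assumed one month of continuous training
PUE = 1.58                # assumed datacenter overhead (power usage effectiveness)
KG_CO2_PER_KWH = 0.4      # assumed carbon intensity of the local grid

energy_kwh = GPU_POWER_KW * NUM_GPUS * TRAINING_HOURS * PUE
emissions_lbs = energy_kwh * KG_CO2_PER_KWH * 2.20462  # kg -> lbs

print(f"Energy used: {energy_kwh:,.0f} kWh")
print(f"CO2 emitted: {emissions_lbs:,.0f} lbs")
```

Scaling any one of these inputs, more GPUs, longer training runs, or a dirtier grid, scales the emissions with it, which is how a single large model can reach six-figure totals.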

This AI model is one of hundreds releasing massive amounts of emissions that harm the environment. AI models have been put to good use in medicine, with chatbots helping to identify symptoms in patients and a learning algorithm, created by researchers at Google AI Healthcare, being trained to identify breast cancer. In the coming years, however, the environmental cost may counteract the good these models are trying to do.

What is the real harm? And how does this happen?

The energy usage of thousands of AI models, coupled with the Earth's worsening climate crisis, may push our emissions to an all-time high. With 2020 being one of the hottest years on record, the emissions from these models only add to the problem of climate change. Especially given the growth of the tech field, the high emission rates of algorithmic technology are alarming.

The demand for AI to solve multitudes of problems will continue to grow, and with it comes more and more data. Strubell concludes that "training huge models on tons of data is not feasible for academics." This is due to a lack of advanced computers better suited to processing such massive amounts of data. According to Strubell and her team, such computers would help synthesize and process the information with generally less carbon output.

She goes on to say that, as time passes, it becomes less feasible for researchers and students to process massive amounts of data without these computers. Groundbreaking studies often come at the cost of the environment, which is why advanced computers are necessary for continued progress in the field of AI.

What are the solutions?

Currently, the best solution, as proposed by the researchers, is to invest in faster and more efficient computers to process these massive amounts of information. Funding for this kind of hardware would cut down on the energy these computers use and lessen the environmental impact of training AI models.

Credit: ofa.mit.edu

At MIT, researchers were able to cut down on their energy usage by utilizing computers donated to them by IBM. With these machines, the researchers were able to process millions of calculations and publish important papers in the AI field.

Another solution from MIT is the once-for-all (OFA) network, a single network trained to "support versatile architectural configurations." Instead of expending loads of energy training a separate model for each device or application, specific subnetworks are drawn from one general network to support the software at hand. This approach helps cut down on the overall energy cost of these models.

Though there were concerns that sharing one general network could compromise accuracy, the researchers tested for this and found that it did not: the OFA network had no negative effect on the accuracy of the resulting AI systems.
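To make the once-for-all idea concrete, the sketch below shows one trained "general" layer serving several device sizes by slicing out smaller subnetworks. This is a conceptual illustration only, not MIT's actual implementation; the layer, widths, and data are invented for the example.

```python
import numpy as np

# One wide "super" layer is trained once; smaller devices get a slice of it
# instead of a separately trained model. (Conceptual sketch, not MIT's code.)

rng = np.random.default_rng(0)
W_full = rng.standard_normal((512, 256))   # stands in for a trained layer

def extract_subnetwork(W, out_channels):
    """Keep only the first `out_channels` output channels."""
    return W[:out_channels, :]

x = rng.standard_normal(256)               # one input example
for width in (512, 256, 128):              # e.g. server, phone, microcontroller
    W_sub = extract_subnetwork(W_full, width)
    y = np.maximum(W_sub @ x, 0.0)         # ReLU activation
    print(f"width {width}: output shape {y.shape}")
```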

With these solutions in hand, it is important to understand that our future is not hopeless. Researchers are actively looking for ways to alleviate this issue, and with the right plans and actions, the innovations of the future can help to better the world rather than harm it.

Artificial Intelligence & Protein Sequencing

Image Credit: DeepMind

Google-owned artificial intelligence firm DeepMind developed a system called AlphaFold to solve the age-old "protein folding problem": how does a protein's amino acid sequence dictate its 3D structure?

Proteins are made up of hundreds to thousands of amino acids, and millions of small-scale interactions between molecules play into their 3D forms. Determining the structure of just one protein can require years of work and highly specialized equipment. Thus, researchers have grappled with the complexity of protein folding for decades.

AlphaFold was trained on data from about 170,000 known proteins whose structures had been deciphered the traditional way. The system now achieves a median accuracy score of 92.4 out of 100 for predicting protein structure, and a score of 87 for the most complex architectures.
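Those 0-to-100 scores come from the Global Distance Test (GDT), which rewards a prediction for placing each residue close to its true position. Below is a simplified GDT_TS-style calculation on toy coordinates; real GDT also searches over superpositions of the two structures, a step this sketch skips.

```python
import numpy as np

def gdt_ts(pred, true, thresholds=(1.0, 2.0, 4.0, 8.0)):
    """Simplified GDT_TS: average fraction of residues within each cutoff (angstroms)."""
    dists = np.linalg.norm(pred - true, axis=1)      # per-residue error
    return 100.0 * np.mean([(dists <= t).mean() for t in thresholds])

rng = np.random.default_rng(1)
true_coords = rng.standard_normal((100, 3)) * 10     # toy C-alpha positions
pred_coords = true_coords + rng.standard_normal((100, 3)) * 0.5
print(f"GDT_TS ~ {gdt_ts(pred_coords, true_coords):.1f}")
```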

Almost all diseases, such as Alzheimer’s disease, cancer, and COVID-19, involve protein structure, so AI opens the door for faster drug development and a better understanding of the biological processes underlying these health conditions.

Winner of the 2009 Nobel Prize in Chemistry, Venki Ramakrishnan, remarked, “This computational work represents a stunning advance on the protein-folding problem, a 50-year-old grand challenge in biology. It has occurred decades before many people in the field would have predicted. It will be exciting to see the many ways in which it will fundamentally change biological research.”

It will take some time to improve the algorithm’s predictive power and to bridge the gap between computer modeling and real-world pharmaceutical implementation, but AlphaFold will undoubtedly deepen our understanding of the role protein folding plays in a myriad of diseases.

Outside of medicine, AlphaFold could identify enzymes that break down plastic waste or capture carbon dioxide from the atmosphere, a useful tool in the long-standing battle against climate change.

How AI Could Help Predict and Reverse the Effects of Climate Change

Image credit: Foreign Affairs.

From Beijing to Great Britain, companies are using innovative new technologies to reverse the effects of climate change.

Google has been reshaping its data centers, lowering total energy consumption with machine learning (ML), the study of computer algorithms that improve automatically through experience. This will be useful for both the climate and Google, since the company plans to open more data centers. DeepMind, Google's London-based AI unit, uses information collected by sensors to reduce the energy the data centers spend on cooling by up to 40 percent. The same technology is also used to forecast Google's clean energy output, so the company can manage how much conventional energy it actually needs.
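As a toy illustration of the general approach, not DeepMind's actual system, one can fit a model that predicts cooling energy from sensor readings, then query it to pick the cheapest settings. All of the variables and data below are synthetic stand-ins.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
n = 1000
outside_temp = rng.uniform(5, 35, n)     # sensor: outdoor temperature (C)
server_load = rng.uniform(0.2, 1.0, n)   # sensor: fraction of IT capacity
setpoint = rng.uniform(18, 27, n)        # control: cooling setpoint (C)
cooling_kwh = (5.0 + 2.0 * server_load + 0.1 * outside_temp
               - 0.15 * setpoint + rng.normal(0, 0.05, n))  # synthetic target

X = np.column_stack([outside_temp, server_load, setpoint])
model = GradientBoostingRegressor().fit(X, cooling_kwh)

# For current conditions, choose the setpoint the model predicts is cheapest.
candidates = np.column_stack([np.full(10, 25.0), np.full(10, 0.8),
                              np.linspace(18, 27, 10)])
best = candidates[np.argmin(model.predict(candidates))]
print(f"Suggested setpoint: {best[2]:.1f} C")
```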

Cognitive computing, which describes technology platforms built on the scientific disciplines of AI and signal processing, pairs its superior data-processing ability with the Internet of Things (IoT), a system of interrelated, internet-connected objects that collect and transfer data wirelessly, to predict pollution rates in Beijing. The system uses ML to ingest data from sources such as meteorological satellites and traffic cameras, constantly learning and adjusting its predictive models. It can forecast pollution 72 hours in advance, with accuracy down to the nearest kilometer, detecting where the pollution is coming from and where it is likely to go. Beijing's government is using this methodology to reduce pollution levels ahead of the 2022 Winter Olympics; it can use these predictions to implement policies like temporarily restricting industrial activity or limiting traffic and construction in certain areas. Cognitive computing and ML also create models that allow officials to test the effectiveness of such interventions.
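The forecasting core of such a system can be sketched in a few lines: turn recent pollution readings and weather inputs into features, and train a model to predict the level 72 hours ahead. This is a toy stand-in for Beijing's system, trained on entirely synthetic data.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
hours = 5000
t = np.arange(hours)
pm25 = 80 + 30 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 10, hours)  # fake sensor
wind = rng.uniform(0, 10, hours)                                        # fake weather

HORIZON, LAGS = 72, 24   # predict 72 h ahead from the last 24 h of readings
X, y = [], []
for i in range(LAGS, hours - HORIZON):
    X.append(np.concatenate([pm25[i - LAGS:i], [wind[i]]]))
    y.append(pm25[i + HORIZON])
X, y = np.array(X), np.array(y)

model = Ridge().fit(X[:-500], y[:-500])            # hold out the last 500 hours
print("held-out R^2:", round(model.score(X[-500:], y[-500:]), 2))
```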

The Allen Coral Atlas, an initiative committed to studying the evolving coral reefs of the world's oceans, is using satellite images and AI image processing to detect changes and locate the reefs most threatened by global warming and ocean pollution.

In Singapore, a Digital Innovation Lab has been harnessing emerging technologies to ensure the continuity of tech-based climate change initiatives. The lab is building technology that can optimize public transport routes and decrease carbon emissions from vehicles. It has also developed technology to track rising sea levels and their impact on marine health. Another project tracks food provenance, checking the nutritional quality and chemical composition of food. Making these technologies accessible to partners across Asia lowers the barrier for new agencies to use them, which should boost the number of climate change solutions built with agile project management and design thinking.

The push toward machine learning builds on the work already done by climate informatics, a discipline created in 2011 that sits at the intersection of data science and climate science. Climate informatics uses data collected from sources such as ice cores, climate downscaling (using large-scale models to predict weather at the hyper-local level), and the socio-economic impacts of weather and climate. AI can also unlock new insights from the massive, complex simulations generated by the field of climate modeling. Dozens of climate models have since come into existence, all representing the atmosphere, oceans, land, and cryosphere (ice). But even with agreement on basic scientific assumptions, Claire Monteleoni, a computer science professor at the University of Colorado Boulder and a co-founder of climate informatics, points out that while the models generally agree in the short term, differences emerge in long-term forecasts. One project Monteleoni worked on uses machine learning algorithms to combine the predictions of the approximately 30 climate models used by the Intergovernmental Panel on Climate Change. Better forecasts can help officials make informed climate policy, enable governments to prepare for change, and potentially uncover opportunities to mitigate some impacts of climate change.
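A minimal sketch of that multi-model idea, not Monteleoni's actual algorithm: treat each climate model's projection as a feature and learn combination weights against historical observations, rather than simply averaging the models. Everything below is synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_models, n_years = 30, 60
truth = np.cumsum(rng.normal(0.02, 0.1, n_years))   # synthetic warming record
# Each "climate model" tracks the truth with its own bias and noise.
models = (truth + rng.normal(0, 0.3, (n_models, 1))
                + rng.normal(0, 0.15, (n_models, n_years)))

train = slice(0, 40)                                # fit weights on early years
ensemble = LinearRegression().fit(models.T[train], truth[train])
pred = ensemble.predict(models.T)

err_mean = np.abs(models.mean(axis=0) - truth)[40:].mean()
err_learned = np.abs(pred - truth)[40:].mean()
print(f"naive average error: {err_mean:.3f}, learned-weights error: {err_learned:.3f}")
```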

Some homeowners have already experienced the effects of a changing environment; for others, the threat can seem less tangible. To make it more concrete, researchers from the Montreal Institute for Learning Algorithms (MILA), Microsoft, and ConscientAI Labs used generative adversarial networks (GANs), a type of AI that generates new data with the same statistics as its training set, to simulate what homes are expected to look like after being damaged by rising sea levels and intense storms. So far, MILA researchers have met with Montreal city officials and non-governmental organizations (NGOs) eager to use the tool. Future plans include releasing an app to show individuals what their neighborhoods and homes might look like under different climate change outcomes. The app will need more data, though, and should eventually let people upload photos of floods and forest fires to improve the algorithm.
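For readers unfamiliar with GANs, here is a bare-bones version of the training loop in PyTorch: a generator learns to produce samples that a discriminator cannot tell apart from real data. This toy works on 2-D points; the MILA project uses far larger image-to-image models, so treat this only as a sketch of the mechanism.

```python
import torch
import torch.nn as nn

# Generator maps random noise to fake samples; discriminator scores realness.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

real_data = torch.randn(256, 2) * 0.5 + 2.0        # stand-in "real" samples

for step in range(500):
    # 1) Train the discriminator to separate real from generated samples.
    fake = G(torch.randn(256, 8)).detach()
    loss_d = (bce(D(real_data), torch.ones(256, 1))
              + bce(D(fake), torch.zeros(256, 1)))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator.
    fake = G(torch.randn(256, 8))
    loss_g = bce(D(fake), torch.ones(256, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

print("generated sample mean:", G(torch.randn(1000, 8)).mean(dim=0))
```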

Carbon Tracker is an independent financial think tank working toward the UN goal of preventing new coal plants from being built by 2020. By monitoring coal plant emissions with satellite imagery, Carbon Tracker can use the data it gathers to convince the finance industry that coal plants aren't profitable. Google is expanding the nonprofit's satellite imagery efforts to include emissions from gas-powered plants, to better understand where air pollution is coming from. While continuous monitoring systems near power plants can measure CO2 emissions more conveniently, they do not have global reach.

AI is a tool in our arsenal if we hope to achieve the UN's 1.5-degree goal and beyond; it can accelerate the fight against climate change by providing accurate, precise information about the climatic factors around us. With AI on our side, we can defeat climate change in the long run.