From Beijing to Great Britain, companies are using innovative technologies to fight the effects of climate change.
Google has been reshaping its data centers to lower their total energy consumption using machine learning (ML), the study of computer algorithms that improve automatically through experience. That matters for both the climate and Google, since the company plans to open more of these data centers. DeepMind, its London-based AI unit, is using information collected by sensors to reduce the data centers’ energy use for cooling by up to 40 percent. The same technology is also being used to predict Google’s clean energy output, so the company can manage how much conventional energy it actually needs.
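The details of DeepMind’s system are proprietary, but the basic pattern of learning a model from sensor telemetry and then querying it for energy-saving settings can be sketched in a few lines. Everything below, from the sensor names to the data and model choice, is an illustrative assumption, not Google’s implementation:

```python
# Minimal sketch (not DeepMind's actual system): learn a mapping from
# data-center sensor readings to cooling energy, then query the model
# to compare candidate cooling setpoints. All data here is synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical historical telemetry: outside temp (deg C), server load (kW),
# chiller setpoint (deg C) -> cooling energy (kWh).
X = rng.uniform([5, 200, 16], [35, 800, 24], size=(5000, 3))
y = 0.9 * X[:, 0] + 0.05 * X[:, 1] - 2.0 * X[:, 2] + 60 + rng.normal(0, 1, 5000)

model = GradientBoostingRegressor().fit(X, y)

# Ask the model which setpoint minimizes predicted cooling energy
# for tomorrow's forecast conditions (28 deg C outside, 550 kW load).
candidates = np.array([[28.0, 550.0, s] for s in np.arange(16, 24.5, 0.5)])
best = candidates[model.predict(candidates).argmin()]
print(f"Suggested chiller setpoint: {best[2]:.1f} deg C")
```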
Cognitive computing, a term for technology platforms built on AI and signal processing, is being paired with the Internet of Things (IoT), a system of interrelated, internet-connected objects that collect and transfer data wirelessly, to predict pollution levels in Beijing. The system uses ML to ingest data from sources such as meteorological satellites and traffic cameras, constantly learning and adjusting its predictive models. It can forecast pollution 72 hours in advance, with resolution down to the nearest kilometre, to detect where the pollution is coming from and where it is likely to go. Beijing’s government is using this methodology to reduce pollution levels ahead of the 2022 Winter Olympics. It can use these predictions to implement policies such as temporarily restricting industrial activity or limiting traffic and construction in certain areas. Cognitive computing and ML also produce models that let officials test the effectiveness of such interventions.
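In rough outline, a system like this trains a model per grid cell on whatever signals correlate with future pollution. The sketch below, with made-up features and data, shows both a 72-hour prediction and the kind of “what if” intervention query described above; it illustrates the approach, not Beijing’s actual system:

```python
# Illustrative only: a per-grid-cell model that forecasts PM2.5 levels
# 72 hours ahead from weather and traffic features. Feature names, data,
# and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical hourly features for one 1 km grid cell.
wind_speed  = rng.uniform(0, 12, n)     # m/s, from weather stations/satellites
traffic_idx = rng.uniform(0, 1, n)      # normalized camera-derived congestion
industry_on = rng.integers(0, 2, n)     # nearby factories active?
pm25_now    = rng.uniform(10, 250, n)   # current PM2.5 reading

# Synthetic target: PM2.5 in 72 hours (wind disperses, traffic/industry add).
pm25_72h = (0.5 * pm25_now - 8 * wind_speed
            + 60 * traffic_idx + 40 * industry_on
            + rng.normal(0, 10, n)).clip(min=0)

X = np.column_stack([wind_speed, traffic_idx, industry_on, pm25_now])
model = RandomForestRegressor(n_estimators=200).fit(X, pm25_72h)

# "What if" query: same conditions, but traffic halved and industry paused,
# the kind of intervention test the article describes.
baseline     = np.array([[3.0, 0.8, 1, 180.0]])
intervention = np.array([[3.0, 0.4, 0, 180.0]])
print(model.predict(baseline), model.predict(intervention))
```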
The Allen Coral Atlas, an initiative dedicated to mapping and monitoring the world’s coral reefs, is using satellite imagery and AI image processing to detect changes and locate reefs threatened by global warming and ocean pollution.
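At its simplest, this kind of change detection compares satellite passes of the same reef and flags pixels whose spectral signature has shifted. Production systems rely on calibrated multispectral data and trained classifiers; the toy sketch below, with invented arrays and thresholds, only conveys the core idea:

```python
# Toy illustration of change detection between two satellite passes over a reef:
# difference the images and flag pixels whose spectral change exceeds a threshold.
import numpy as np

rng = np.random.default_rng(3)
before = rng.uniform(0, 1, (100, 100, 3))   # stand-in for an earlier image
after = before.copy()
after[40:60, 40:60] += 0.4                  # simulate a bleaching-like shift

change = np.linalg.norm(after - before, axis=-1)  # per-pixel spectral change
flagged = change > 0.3                            # pixels worth a closer look
print(f"{flagged.mean():.1%} of reef pixels flagged for review")
```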
In Singapore, a Digital Innovation Lab is applying emerging technologies to keep tech-based climate change initiatives moving. The lab is building technology to optimize public transport routes and cut carbon emissions from vehicles. It has also developed tools to track sea level rise and its impact on marine health, and another project traces food provenance, checking the nutritional quality and chemical composition of food. Making these technologies accessible to partners across Asia lowers the barrier for new agencies to adopt them, which should increase the number of climate change solutions built with agile project management and design thinking.
The push to use machine learning builds on work already done by climate informatics, a discipline created in 2011 that sits at the intersection of data science and climate science. Climate informatics draws on data from sources such as ice cores, on climate downscaling (using large-scale models to predict weather at a hyper-local level), and on the socio-economic impacts of weather and climate. AI can also unlock new insights from the massive, complex simulations generated by the field of climate modeling. Dozens of such models now exist, all representing the atmosphere, oceans, land, and cryosphere, or ice. But even with agreement on basic scientific assumptions, Claire Monteleoni, a computer science professor at the University of Colorado Boulder and a co-founder of climate informatics, points out that while the models generally agree in the short term, differences emerge in long-term forecasts. One project Monteleoni worked on uses machine learning algorithms to combine the predictions of the approximately 30 climate models used by the Intergovernmental Panel on Climate Change. Better forecasts can help officials make informed climate policy, enable governments to prepare for change, and potentially identify areas where some impacts of climate change could be mitigated.
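One standard way to combine roughly 30 model predictions is to treat each climate model as an “expert” and let an online learner shift its trust toward whichever models have recently tracked observations best. The sketch below illustrates that weighting idea on synthetic data; it is a simplified stand-in, not the project’s actual algorithm:

```python
# Toy "learning with expert advice" combination of climate-model predictions:
# weights move toward models whose recent forecasts matched observations.
import numpy as np

rng = np.random.default_rng(2)
n_models, n_steps = 30, 120            # ~30 IPCC-class models, 120 months
truth = 0.02 * np.arange(n_steps) + rng.normal(0, 0.05, n_steps)  # synthetic anomaly

# Each model = truth plus its own bias and noise (synthetic stand-ins).
bias = rng.normal(0, 0.3, n_models)
preds = truth + bias[:, None] + rng.normal(0, 0.1, (n_models, n_steps))

eta = 2.0                               # learning rate
w = np.ones(n_models) / n_models        # start with equal trust in every model
combined = np.empty(n_steps)

for t in range(n_steps):
    combined[t] = w @ preds[:, t]             # weighted ensemble forecast
    loss = (preds[:, t] - truth[t]) ** 2      # per-model squared error
    w *= np.exp(-eta * loss)                  # downweight models that missed
    w /= w.sum()

print("ensemble RMSE:", np.sqrt(np.mean((combined - truth) ** 2)))
print("best single model RMSE:",
      np.sqrt(((preds - truth) ** 2).mean(axis=1)).min())
```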
Some homeowners have already experienced the effects of a changing environment. For others, the threat can feel abstract. To make it more tangible, researchers from the Montreal Institute for Learning Algorithms (MILA), Microsoft, and ConscientAI Labs used generative adversarial networks (GANs), a type of AI that generates new data with the same statistics as its training set, to simulate what homes are likely to look like after being damaged by rising sea levels and intense storms. So far, MILA researchers have met with Montreal city officials and non-governmental organizations (NGOs) eager to use the tool. Future plans include releasing an app that shows individuals what their neighborhoods and homes might look like under different climate change outcomes. The app will need more data, though, and the team eventually wants to let people upload photos of floods and forest fires to improve the algorithm.
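A GAN pairs two networks: a generator that produces the altered images and a discriminator that learns to tell them apart from real flood photos, each improving against the other. The schematic below uses random tensors in place of real imagery and illustrates the technique in general, not MILA’s model:

```python
# Schematic GAN for image-to-image "flooding": toy tensors stand in for
# street-view photos and real flood imagery.
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        # Takes a house photo, outputs a "flooded" rendering of it.
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        # Scores how plausible an image is as a real flood photo.
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(), nn.Linear(32 * 32 * 32, 1),
        )
    def forward(self, x):
        return self.net(x)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

houses = torch.rand(8, 3, 64, 64)        # stand-in for street-view photos
real_floods = torch.rand(8, 3, 64, 64)   # stand-in for real flood imagery

# One adversarial step: D learns to separate real from generated images,
# then G learns to fool D.
fake = G(houses)
d_loss = (bce(D(real_floods), torch.ones(8, 1))
          + bce(D(fake.detach()), torch.zeros(8, 1)))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

g_loss = bce(D(fake), torch.ones(8, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```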
Carbon Tracker is an independent financial think tank working toward the UN goal of preventing new coal plants from being built by 2020. By monitoring coal plant emissions with satellite imagery, Carbon Tracker can use the data it gathers to convince the finance industry that coal plants aren’t profitable. Google is expanding the nonprofit’s satellite imagery efforts to include emissions from gas-powered plants and to better understand where air pollution is coming from. While continuous monitoring systems near power plants can measure CO2 emissions more directly, they do not have global reach.
AI is a tool in our arsenal if we hope to achieve the UN’s 1.5-degree goal and go beyond it; it acts as an accelerant in the fight against climate change by giving us accurate and precise information about the climatic factors at play around us. With AI on our side, we can defeat climate change in the long run.