Heavy Industry and Manufacturing
Workshop Papers
NeurIPS 2024
DeepMyco - Dataset Generation for Dye Mycoremediation
(Proposals Track)
Abstract: Textile dyes account for 20% of global water pollution. Mycoremediation, a promising approach utilizing cheap, naturally growing fungi, has not yet been deployed at production scale. While numerous studies indicate benefits, it is challenging to apply the specific learnings of each study to the combination of environmental factors present at a given physical site - a gap we believe machine learning can help fill if datasets become available. We propose an approach to drive machine learning research in mycoremediation by contributing a comprehensive dataset, using advanced language models and vision transformers to extract and categorize experimental data from research papers. This dataset will enable ML-driven innovation in matching fungi to specific dye types, optimizing remediation processes, and scaling up mycoremediation efforts effectively. Authors: Danika Gupta (The Harker Upper School)
NeurIPS 2021
Amortized inference of Gaussian process hyperparameters for improved concrete strength trajectory prediction
(Papers Track)
Abstract: Designing and utilizing alternative concrete formulations which supplant the use of ordinary Portland cement with alternative binders have been identified as central goals in reducing the greenhouse gas impact of the concrete industry. Given the variability in availability and quality of alternatives, these goals call for an optimal design of experiment approach to designing formulations, which can be adapted to local needs. The realization of this goal hinges on an ability to predict key properties. Here, we present and benchmark a Gaussian process (GP) model for predicting the trajectory of concrete strength, an essential performance measure. GPs are a desirable model class for the application because of their ability to estimate uncertainty and update predictions given additional data. In this work, rather than manually tuning hyperparameters for different concrete mix models, we propose a new method based on amortized inference leveraging mixture attributes, leading to models which are better fit for use in Bayesian optimization of concrete formulation. We demonstrate the success of the approach using a large, industrial concrete dataset. Authors: Kristen Severson (Microsoft Research); Olivia Pfeiffer (MIT); Jie Chen (IBM Research); Kai Gong (MIT); Jeremy Gregory (Massachusetts Institute of Technology); Richard Goodwin (IBM Research); Elsa Olivetti (Massachusetts Institute of Technology)
ICML 2021
Estimation of Corporate Greenhouse Gas Emissions via Machine Learning
(Papers Track)
Abstract: As an important step to fulfill the Paris Agreement and achieve net-zero emissions by 2050, the European Commission adopted the most ambitious package of climate impact measures in April 2021 to improve the flow of capital towards sustainable activities. For these and other international measures to be successful, reliable data is key. The ability to see the carbon footprint of companies around the world will be critical for investors to comply with the measures and hit climate neutrality. However, with only a small portion of companies volunteering to disclose their greenhouse gas (GHG) emissions, it is nearly impossible for investors to align their investment strategies with the measures. By training a machine learning model on disclosed GHG emissions, we are able to estimate the emissions of other companies globally who do not disclose their emissions. In this paper, we show that our model provides accurate estimates of corporate GHG emissions to investors such that they are able to align their investments with the regulatory measures and achieve net-zero goals. Authors: You Han (Bloomberg L.P.); Achintya Gopal (Bloomberg LP); Liwen Ouyang (Bloomberg L.P.); Aaron Key (Bloomberg LP)
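To make the estimation setup concrete, here is a minimal stand-in: a log-linear least-squares model fit on hypothetical disclosed emitters and applied to a non-discloser. The features (log revenue, an energy-intensity score) and all numbers are invented for illustration; the paper's actual model and feature set are not reproduced here.

```python
import numpy as np

# Hypothetical disclosing companies: [log revenue, energy-intensity score],
# target is log GHG emissions (tCO2e). All values are illustrative.
X = np.array([[8.0, 0.9], [6.5, 0.3], [7.2, 0.7], [5.9, 0.2], [7.8, 0.8]])
y = np.array([12.1, 8.4, 10.6, 7.9, 11.5])

# fit a log-linear model with intercept via ordinary least squares
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def estimate_log_emissions(features):
    """Estimate log emissions for a company that does not disclose."""
    return np.append(features, 1.0) @ coef

est = estimate_log_emissions(np.array([7.0, 0.6]))
```

A production system would use richer features and a more flexible model, but the workflow is the same: fit on disclosers, infer for everyone else.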
ICML 2021
Probabilistic Short-Term Low-Voltage Load Forecasting using Bernstein-Polynomial Normalizing Flows
(Papers Track)
Abstract: The transition to a fully renewable energy grid requires better forecasting of demand at the low-voltage level. However, high fluctuations and increasing electrification cause huge forecast errors with traditional point estimates. Probabilistic load forecasts take future uncertainties into account and thus enable various applications in low-carbon energy systems. We propose an approach for flexible conditional density forecasting of short-term load based on Bernstein-Polynomial Normalizing Flows where a neural network controls the parameters of the flow. In an empirical study with 363 smart meter customers, our density predictions compare favorably against Gaussian and Gaussian mixture densities and also outperform a non-parametric approach based on the pinball loss for 24h-ahead load forecasting for two different neural network architectures. Authors: Marcel Arpogaus (Konstanz University of Applied Sciences); Marcus Voß (Technische Universität Berlin (DAI-Labor)); Beate Sick (ZHAW and University of Zurich); Mark Nigge-Uricher (Bosch.IO GmbH); Oliver Dürr (Konstanz University of Applied Sciences)
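The key building block of such a flow, a Bernstein polynomial that is monotone whenever its coefficients are non-decreasing, is easy to demonstrate. This is a generic sketch, not the authors' implementation; in the paper, a neural network would emit the `thetas` conditioned on the forecasting covariates.

```python
import math

def bernstein_transform(z, thetas):
    """Map on [0, 1] built from a Bernstein polynomial.

    With non-decreasing coefficients `thetas`, the polynomial is
    monotonically increasing, which is what makes it usable as a
    normalizing-flow transformation (an invertible map with a
    tractable Jacobian)."""
    n = len(thetas) - 1
    return sum(t * math.comb(n, k) * z ** k * (1 - z) ** (n - k)
               for k, t in enumerate(thetas))

thetas = [0.0, 0.1, 0.4, 1.0]  # non-decreasing -> monotone transform
ys = [bernstein_transform(z / 10, thetas) for z in range(11)]
```

The endpoints are pinned to the first and last coefficients, and monotonicity guarantees the flow can be inverted numerically when evaluating forecast densities.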
ICML 2021
Reducing Carbon in the Design of Large Infrastructure Scheme with Evolutionary Algorithms
(Papers Track)
Abstract: The construction and operations of large infrastructure schemes such as railways, roads, pipelines and power lines account for a significant proportion of global carbon emissions. Opportunities to reduce the embodied and operational carbon emissions of new infrastructure schemes are greatest during the design phase. However, schedule and cost constraints limit designers from assessing a large number of design options in detail to identify the solution with the lowest lifetime carbon emissions using conventional methods. Here, we develop an evolutionary algorithm to rapidly evaluate in detail the lifetime carbon emissions of thousands of possible design options for new water transmission pipeline schemes. Our results show that this approach can help designers in some cases to identify design solutions with more than 10% lower operational carbon emissions compared with conventional methods, saving more than 1 million tonnes in lifetime carbon emissions for a new water transmission pipeline scheme. We also find that this evolutionary algorithm can be applied to design other types of infrastructure schemes such as non-water pipelines, railways, roads and power lines. Authors: Matt Blythe (Continuum Industries)
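A minimal version of such an evolutionary search can be sketched as follows. The design variables (a pipe-diameter index per segment) and the embodied-vs-operational carbon model are invented stand-ins for illustration, not the authors' engineering model.

```python
import random

# Hypothetical design space: choose a pipe diameter for each segment.
DIAMETERS = [0.5, 0.8, 1.0, 1.5]  # metres, illustrative
N_SEGMENTS = 8

def lifetime_carbon(design):
    """Toy carbon model: embodied carbon grows with diameter, while
    operational (pumping) carbon shrinks with it - a classic trade-off.
    Coefficients are illustrative, not real emission factors."""
    embodied = sum(100 * DIAMETERS[i] for i in design)
    operational = sum(50 / DIAMETERS[i] for i in design)
    return embodied + operational

def evolve(pop_size=40, generations=60, seed=0):
    """Simple elitist genetic algorithm: truncation selection,
    one-point crossover, per-child mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(len(DIAMETERS)) for _ in range(N_SEGMENTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lifetime_carbon)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_SEGMENTS)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:  # mutation
                child[rng.randrange(N_SEGMENTS)] = rng.randrange(len(DIAMETERS))
            children.append(child)
        pop = survivors + children
    return min(pop, key=lifetime_carbon)

best = evolve()
```

The real system evaluates far richer objective functions (alignment, hydraulics, embodied materials), but the search loop has this shape.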
NeurIPS 2020
Characterization of Industrial Smoke Plumes from Remote Sensing Data
(Papers Track)
Abstract: The major driver of global warming has been identified as the anthropogenic release of greenhouse gas (GHG) emissions from industrial activities. The quantitative monitoring of these emissions is mandatory to fully understand their effect on the Earth's climate and to enforce emission regulations on a large scale. In this work, we investigate the possibility to detect and quantify industrial smoke plumes from globally and freely available multi-band image data from ESA's Sentinel-2 satellites. Using a modified ResNet-50, we can detect smoke plumes of different sizes with an accuracy of 94.3%. The model correctly ignores natural clouds and focuses on those imaging channels that are related to the spectral absorption from aerosols and water vapor, enabling the localization of smoke. We exploit this localization ability and train a U-Net segmentation model on a labeled sub-sample of our data, resulting in an Intersection-over-Union (IoU) metric of 0.608 and an overall accuracy for the detection of any smoke plume of 94.0%; on average, our model can reproduce the area covered by smoke in an image to within 5.6%. The performance of our model is mostly limited by occasional confusion with surface objects, the inability to identify semi-transparent smoke, and human limitations to properly identify smoke based on RGB-only images. Nevertheless, our results enable us to reliably detect and qualitatively estimate the level of smoke activity in order to monitor activity in industrial plants across the globe. Our data set and code base are publicly available. Authors: Michael Mommert (University of St. Gallen); Mario Sigel (Sociovestix Labs Ltd.); Marcel Neuhausler (ISS Technology Innovation Lab); Linus M. Scheibenreif (University of St. Gallen); Damian Borth (University of St. Gallen)
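The Intersection-over-Union metric reported for the segmentation model is straightforward to compute from binary masks; a minimal sketch on toy 4x4 masks (the masks are invented, not from the paper's dataset):

```python
import numpy as np

def iou(pred, target):
    """Intersection-over-Union between two binary smoke masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return inter / union if union else 1.0

# toy prediction covering 4 pixels of a 6-pixel ground-truth plume
pred = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
target = np.array([[1, 1, 1, 0], [1, 1, 1, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
score = iou(pred, target)  # 4 intersecting pixels over a union of 6
```

Averaging this score over a labeled test set gives figures like the paper's 0.608.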
NeurIPS 2020
Revealing the Oil Majors' Adaptive Capacity to the Energy Transition with Deep Multi-Agent Reinforcement Learning
(Papers Track)
Abstract: A low-carbon energy transition is transpiring to combat climate change, posing an existential threat to oil and gas companies, particularly the Majors. Though the Majors wield the resources and expertise to adapt to low-carbon business models, meaningful climate-aligned strategies have yet to be enacted. A 2-degrees pathways (2DP) wargame was developed to assess climate-compatible pathways for the oil Majors. Recent advances in deep multi-agent reinforcement learning (MARL) have achieved superhuman-level performance in solving high-dimensional continuous control problems. Modeling within a Markovian framework, we present the novel 2DP-MARL model, which applies deep MARL methods to solve the 2DP wargame across a multitude of transition scenarios. Designed to best mimic Majors in real-life competition, the model reveals that all Majors quickly adapt to low-carbon business models to remain robust amidst energy transition uncertainty. The purpose of this work is to provide tangible metrics to support the call for oil Majors to diversify into low-carbon business models and, thus, accelerate the energy transition. Authors: Dylan Radovic (Imperial College London); Lucas Kruitwagen (University of Oxford); Christian Schroeder de Witt (University of Oxford)
NeurIPS 2020
Machine learning for advanced solar cell production: adversarial denoising, sub-pixel alignment and the digital twin
(Papers Track)
Abstract: Photovoltaics is a main pillar of the transition towards a renewable energy supply. In order to continue the tremendous cost decrease of the last decades, novel cell technologies and production processes are implemented into mass production to improve cell efficiency. Realizing their full potential requires novel techniques of quality assurance and data analysis. We present three use-cases along the value chain where machine learning techniques are investigated for quality inspection and process optimization: Adversarial learning to denoise wafer images, alignment of surface structuring processes via sub-pixel coordinate regression, and the development of a digital twin for wafers and solar cells for material and process analysis. Authors: Matthias Demant (Fraunhofer ISE); Leslie Kurumundayil (Fraunhofer ISE); Philipp Kunze (Fraunhofer ISE); Aditya Kovvali (Fraunhofer ISE); Alexandra Woernhoer (Fraunhofer ISE); Stefan Rein (Fraunhofer ISE)
NeurIPS 2020
Automated Salmonid Counting in Sonar Data
(Papers Track)
Abstract: The prosperity of salmonids is crucial for several ecological and economic functions. Accurately counting spawning salmonids during their seasonal migration is essential in monitoring threatened populations, assessing the efficacy of recovery strategies, guiding fishing season regulations, and supporting the management of commercial and recreational fisheries. While several different methods exist for counting river fish, they all rely heavily on human involvement, introducing a hefty financial and time burden. In this paper we present an automated fish counting method that utilizes data captured from ARIS sonar cameras to detect and track salmonids migrating in rivers. Our results show that our fully automated system has a 19.3% per-clip error when compared to human counting performance. There is room to improve, but our system can already decrease the amount of time field biologists and fishery managers need to spend manually watching ARIS clips. Authors: Peter Kulits (Caltech); Angelina Pan (Caltech); Sara M Beery (Caltech); Erik Young (Trout Unlimited); Pietro Perona (California Institute of Technology); Grant Van Horn (Cornell University)
NeurIPS 2020
Emerging Trends of Sustainability Reporting in the ICT Industry: Insights from Discriminative Topic Mining
(Papers Track)
Abstract: The Information and Communication Technologies (ICT) industry has a considerable climate change impact and accounts for approximately 3 percent of global carbon emissions. Despite the increasing availability of sustainability reports provided by ICT companies, we still lack a systematic understanding of what has been disclosed at an industry level. In this paper, we make the first major effort to use modern unsupervised learning methods to investigate the sustainability reporting themes and trends of the ICT industry over the past two decades. We build a cross-sector dataset containing 22,534 environmental reports from 1999 to 2019, of which 2,187 are ICT specific. We then apply CatE, a text embedding based topic modeling method, to mine specific keywords that ICT companies use to report on climate change and energy. As a result, we identify (1) important shifts in ICT companies' climate change narratives from physical metrics towards climate-related disasters, (2) key organizations with large influence on ICT companies, and (3) ICT companies' increasing focus on data center and server energy efficiency. Authors: Lin Shi (Stanford University); Nhi Truong Vu (Stanford University)
ICLR 2020
Detection of Housing and Agriculture Areas on Dry Riverbeds for the Evaluation of Risk by Landslides Using Low-Resolution Satellite Imagery Based on Deep Learning. Study Zone: Lima, Peru
(Papers Track)
Abstract: The expansion of human settlements in Peru has caused risk exposure to landslides. However, this risk could increase because the intensity of the El Niño phenomenon will be greater in the coming years, increasing rainfall on the Peruvian coast. In this paper, we present a novel methodology for detecting housing areas and agricultural lands in low-resolution satellite imagery in order to analyze potential risk in case of unexpected landslides. It was developed by creating two datasets from Lima Metropolitana in Peru, one for detecting dry riverbeds and agriculture lands, and the other for classifying housing areas. We applied data augmentation based on geometrical methods, trained architectures based on U-Net methods separately, and then overlapped the results for risk assessment. We found that there are areas with significant potential risk that have been classified by the Peruvian government as medium or low risk areas. On this basis, it is recommended to obtain a dataset with better resolution that can identify how many housing areas would be affected, and to take appropriate prevention measures. Further research in post-processing is needed to suppress noise in our results. Authors: Brian Cerrón (National University of Engineering); Cristopher Bazan (National University of Engineering); Alberto Coronado (National University of Engineering)
ICLR 2020
Accelerated Data Discovery for Scalable Climate Action
(Proposals Track)
Abstract: According to the Intergovernmental Panel on Climate Change (IPCC), the planet must decarbonize by 50% by 2030 in order to keep global warming below 1.5°C. This goal calls for a prompt and massive deployment of solutions in all societal sectors - research, governance, finance, commerce, health care, consumption. One challenge for experts and non-experts is access to the rapidly growing body of relevant information, which is currently scattered across many weakly linked domains of expertise. We propose a large-scale, semi-automatic, AI-based discovery system to collect, tag, and semantically index this information. The ultimate goal is a near real-time, partially curated data catalog of global climate information for rapidly scalable climate action. Authors: Henning Schwabe (Private); Sumeet Sandhu (Elementary IP LLC); Sergy Grebenschikov (Private)
ICLR 2020
Advancing Renewable Electricity Consumption With Reinforcement Learning
(Proposals Track)
Abstract: As the share of renewable energy sources in the present electric energy mix rises, their intermittence proves to be the biggest challenge to carbon-free electricity generation. To address this challenge, we propose an electricity pricing agent, which sends price signals to the customers and contributes to shifting the customer demand to periods of high renewable energy generation. We propose an implementation of a pricing agent with a reinforcement learning approach where the environment is represented by the customers, the electricity generation utilities and the weather conditions. Authors: Filip Tolovski (Fraunhofer Heinrich-Hertz-Institut)
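A toy version of such a pricing agent can be written as a contextual bandit: in states with high renewable generation it should learn to lower the price and pull demand toward those hours. The environment, reward shaping, and all numbers below are invented for illustration; they are not the proposal's actual formulation.

```python
import random

# Toy pricing environment: two renewable states and two price levels.
PRICES = [0.1, 0.3]   # action 0 = low price, action 1 = high price
STATES = [0, 1]       # 0 = low renewable generation, 1 = high

def reward(state, action):
    """Renewable energy consumed, minus a penalty for demand that must
    be met with non-renewable generation. All coefficients illustrative."""
    demand = 1.0 - PRICES[action]          # cheaper price -> more demand
    renewables = 0.2 if state == 0 else 1.0
    return min(demand, renewables) - 0.5 * max(0.0, demand - renewables)

def train(episodes=2000, eps=0.1, lr=0.2, seed=0):
    """Epsilon-greedy Q-learning; one-step episodes make it a bandit."""
    rng = random.Random(seed)
    q = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(episodes):
        s = rng.choice(STATES)
        if rng.random() < eps:
            a = rng.randrange(2)
        else:
            a = max((0, 1), key=lambda x: q[s][x])
        q[s][a] += lr * (reward(s, a) - q[s][a])
    return q

q = train()
policy = [max((0, 1), key=lambda a: q[s][a]) for s in STATES]
```

The learned policy raises the price when renewables are scarce and lowers it when they are plentiful, which is exactly the demand-shifting behaviour the proposal targets.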
NeurIPS 2019
Machine learning identifies the most valuable synthesis conditions for next-generation photovoltaics
(Papers Track)
Best Paper Award
Abstract: Terawatts of next-generation photovoltaics (PV) are necessary to mitigate climate change. The traditional R&D paradigm leads to high efficiency / high variability solar cells, limiting industrial scaling of novel PV materials. In this work, we propose a machine learning approach for early-stage optimization of solar cells, by combining a physics-informed deep autoencoder and a manufacturing-relevant Bayesian optimization objective. This framework allows us to: 1) Co-optimize solar cell performance and variability under techno-economic revenue constraints, and 2) Infer the effect of process conditions on key latent physical properties. We test our approach by synthesizing 135 perovskite solar cells, and finding the optimal points under various techno-economic assumptions. Authors: Felipe Oviedo (MIT) and Zekun Ren (MIT)
ICML 2019
Machine Learning for AC Optimal Power Flow
(Research Track)
Honorable Mention
Abstract: We explore machine learning methods for AC Optimal Power Flow (ACOPF) - the task of optimizing power generation in a transmission network while respecting physical and engineering constraints. We present two formulations of ACOPF as a machine learning problem: 1) an end-to-end prediction task where we directly predict the optimal generator settings, and 2) a constraint prediction task where we predict the set of active constraints in the optimal solution. We validate these approaches on two benchmark grids. Authors: Neel Guha (Carnegie Mellon University); Zhecheng Wang (Stanford University); Arun Majumdar (Stanford University)
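The constraint-prediction formulation can be illustrated with the simplest possible learner: a nearest-neighbour lookup from a load profile to a historical active-constraint mask. The training data and the 1-NN baseline are stand-ins for illustration, not the paper's models or grids.

```python
import numpy as np

# Hypothetical history: per-bus load profiles and, for each, a 0/1 mask
# of which line/generator limits were binding in the ACOPF solution.
train_loads = np.array([[0.20, 0.30], [0.90, 0.80], [0.85, 0.90], [0.10, 0.25]])
train_active = np.array([[0, 0, 1], [1, 1, 0], [1, 1, 0], [0, 0, 1]])

def predict_active_set(load):
    """Constraint-prediction baseline: copy the active set of the
    nearest historical load profile."""
    i = np.argmin(np.linalg.norm(train_loads - load, axis=1))
    return train_active[i]

mask = predict_active_set(np.array([0.88, 0.85]))
```

Once the active set is known, the full nonlinear ACOPF reduces to a much smaller equality-constrained solve, which is where the speedup comes from; the paper uses learned classifiers rather than this lookup.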
ICML 2019
The Impact of Feature Causality on Normal Behaviour Models for SCADA-based Wind Turbine Fault Detection
(Research Track)
Abstract: The cost of wind energy can be reduced by using SCADA data to detect faults in wind turbine components. Normal behavior models are one of the main fault detection approaches, but there is a lack of work on how different input features affect the results. In this work, a new taxonomy based on the causal relations between the input features and the target is presented. Based on this taxonomy, the impact of different input feature configurations on the modelling and fault detection performance is evaluated. To this end, a framework that formulates the detection of faults as a classification problem is also presented. Authors: Telmo Felgueira (IST)
ICML 2019
Autopilot of Cement Plants for Reduction of Fuel Consumption and Emissions
(Deployed Track)
Abstract: The cement manufacturing industry is an essential component of the global economy and infrastructure. However, cement plants inevitably produce hazardous air pollutants, including greenhouse gases, and heavy metal emissions as byproducts of the process. Byproducts from cement manufacturing alone account for approximately 5% of global carbon dioxide (CO2) emissions. We have developed "Autopilot" - a machine learning based Software as a Service (SaaS) to learn manufacturing process dynamics and optimize the operation of cement plants - in order to reduce the overall fuel consumption and emissions of cement production. Autopilot is able to increase the ratio of alternative fuels (including biowaste and tires) to petroleum coke, while optimizing the operation of the pyro process, the core of cement production, which includes the preheater, kiln and cooler. Emissions of gases such as NOx and SOx, and heavy metals such as mercury and lead which are generated through burning petroleum coke can be reduced through the use of Autopilot. Our system has been proven to work in real world deployments, and an analysis of cement plant performance with Autopilot enabled shows energy consumption savings and a decrease of up to 28,000 metric tons of CO2 produced per year. Authors: Prabal Acharyya (Petuum Inc); Sean D Rosario (Petuum Inc); Roey Flor (Petuum Inc); Ritvik Joshi (Petuum Inc); Dian Li (Petuum Inc); Roberto Linares (Petuum Inc); Hongbao Zhang (Petuum Inc)
ICML 2019
Towards a Sustainable Food Supply Chain Powered by Artificial Intelligence
(Deployed Track)
Honorable Mention
Abstract: About 30-40% of food produced worldwide is wasted. This puts a severe strain on the environment and represents a $165B loss to the US economy. This paper explores how artificial intelligence can be used to automate decisions across the food supply chain in order to reduce waste and increase the quality and affordability of food. We focus our attention on supermarkets (combined with downstream consumer waste, these contribute to 40% of total US food losses) and we describe an intelligent decision support system for supermarket operators that optimizes purchasing decisions and minimizes losses. The core of our system is a model-based reinforcement learning engine for perishable inventory management; in a real-world pilot with a US supermarket chain, our system reduced waste by up to 50%. We hope that this paper will bring the food waste problem to the attention of the broader machine learning research community. Authors: Volodymyr Kuleshov (Stanford University)
ICML 2019
ML-driven search for zero-emissions ammonia production materials
(Ideas Track)
Abstract: Ammonia (NH3) production is an industrial process that consumes between 1-2% of global energy annually and is responsible for 2-3% of greenhouse gas emissions (Van der Ham et al., 2014). Ammonia is primarily used for agricultural fertilizers, but it also conforms to the US-DOE targets for hydrogen storage materials (Lan et al., 2012). Modern industrial facilities use the century-old Haber-Bosch process, whose energy usage and carbon emissions are strongly dominated by the use of methane as the combined energy source and hydrogen feedstock, not by the energy used to maintain elevated temperatures and pressures (Pfromm, 2017). Generating the hydrogen feedstock with renewable electricity through water electrolysis is an option that would allow retrofitting the billions of dollars of invested capital in Haber-Bosch production capacity. Economic viability is however strongly dependent on the relative regional prices of methane and renewable energy; renewables have been trending lower in cost but forecasting methane prices is difficult (Stehly et al., 2018; IRENA, 2017; Wainberg et al., 2017). Electrochemical ammonia production, which can use aqueous or steam H2O as its hydrogen source (first demonstrated ~20 years ago), is a promising means of emissions-free ammonia production. Its viability is also linked to the relative price of renewable energy versus methane, but in principle it can be significantly more cost-effective than Haber-Bosch (Giddey et al., 2013) and also downscale to developing areas lacking ammonia transport infrastructure (Shipman & Symes, 2017). However, to date it has only been demonstrated at laboratory scales with yields and Faradaic efficiencies insufficient to be economically competitive. Promising machine-learning approaches to fix this are discussed. Authors: Kevin McCloskey (Google)
ICML 2019
Meta-Optimization of Optimal Power Flow
(Ideas Track)
Abstract: The planning and operation of electricity grids is carried out by solving various forms of constrained optimization problems. With the increasing variability of system conditions due to the integration of renewable and other distributed energy resources, such optimization problems are growing in complexity and need to be repeated daily, often limited to a 5-minute solve time. To address this, we propose a meta-optimizer that is used to initialize interior-point solvers. This can significantly reduce the number of iterations to converge to optimality. Authors: Mahdi Jamei (Invenia Labs); Letif Mones (Invenia Labs); Alex Robson (Invenia Labs); Lyndon White (Invenia Labs); James Requeima (Invenia Labs); Cozmin Ududec (Invenia Labs)
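The payoff of a learned warm start can be shown on a toy smooth problem: a better initial point means fewer Newton iterations to convergence. The objective below is an invented stand-in for an interior-point barrier subproblem, and the "meta-optimizer guess" is hard-coded rather than learned.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=100):
    """Damped-free Newton's method; returns the solution and the
    number of iterations used (a stand-in for interior-point cost)."""
    x = x0.astype(float)
    for it in range(1, max_iter + 1):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x, it - 1
        x = x - np.linalg.solve(hess(x), g)
    return x, max_iter

# smooth convex objective x^4 + x^2 - 2x + (y - 2)^2, purely illustrative
def grad(x):
    return np.array([4 * x[0] ** 3 + 2 * x[0] - 2, 2 * x[1] - 4])

def hess(x):
    return np.array([[12 * x[0] ** 2 + 2, 0.0], [0.0, 2.0]])

cold = np.array([5.0, 5.0])   # naive initialization
warm = np.array([0.6, 2.0])   # hypothetical "meta-optimizer" prediction
_, iters_cold = newton_minimize(grad, hess, cold)
_, iters_warm = newton_minimize(grad, hess, warm)
```

In the proposal, a model trained on past OPF solutions supplies the warm start, so each daily (or 5-minute) re-solve begins close to optimality.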