SETO 2020 – Artificial Intelligence Applications in Solar Energy

The Solar Energy Technologies Office Fiscal Year 2020 (SETO 2020) funding program supports projects that will improve the affordability, reliability, and value of solar technologies on the U.S. grid and tackle emerging challenges in the solar industry. This program funds projects that advance early-stage photovoltaic (PV), concentrating solar-thermal power (CSP), and systems integration technologies, and reduce the non-hardware costs associated with installing solar energy systems.

On February 5, 2020, the U.S. Department of Energy announced it would provide $130 million in funding for 55–80 projects in this program. Ten of these projects will receive a total of approximately $7.3 million to focus on machine-learning solutions and other artificial intelligence for solar applications.

Approach

These projects will leverage AI-related know-how developed in the United States to improve the performance and reliability of PV modules, PV plants, and CSP plants; the prediction of solar energy output; and electric-network situational awareness. The project teams will form partnerships with AI experts and industry stakeholders, such as solar power plant operators or owners, electric utilities, PV module manufacturers, and others that can supply the necessary data and solar-related subject matter expertise.

Objectives

Successful projects will enable substantial advancements in solar energy technologies by fully leveraging data sets generated by sensors, sensor networks, and stakeholders. The goal is to develop disruptive solutions across the solar industry value chain.

Selectees

— Award and cost share amounts are subject to change pending negotiations —

Arizona State University (1)

Project Name: Photovoltaic Plant Predictive Maintenance Optimization under Uncertainties Using Probabilistic Information Fusion
Location: Tempe, AZ
DOE Award Amount: $750,000
Awardee Cost Share: $380,000
Principal Investigator: Hao Yan
Project Summary: This project uses artificial intelligence and machine learning methods to develop algorithms that will optimize the operation and maintenance of photovoltaic (PV) power plants by detecting and classifying anomalies, predicting failures, and scheduling maintenance activities. Predictive maintenance is important for maintaining the long-term financial performance of solar PV plants and reducing downtime. Real-time monitoring data such as power output, temperature, and weather information can be used to identify common fault-class patterns using a hierarchical generative model and a probabilistic information fusion framework at both the sensor and system levels. The project will use power plants operated by Arizona State University and Arizona Public Service as case studies to demonstrate the proposed predictive maintenance technology.
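The anomaly-detection idea at the heart of predictive maintenance can be illustrated with a minimal residual test. This is a toy sketch, not the project's hierarchical generative model; `flag_anomalies` and the sample data are hypothetical:

```python
from statistics import mean, stdev

def flag_anomalies(expected, observed, z_threshold=2.5):
    """Flag time steps where observed PV output deviates abnormally
    from a physics-based expectation (e.g., irradiance-derived power).
    Returns indices of suspected faults."""
    residuals = [o - e for e, o in zip(expected, observed)]
    mu, sigma = mean(residuals), stdev(residuals)
    return [i for i, r in enumerate(residuals)
            if sigma > 0 and abs(r - mu) / sigma > z_threshold]

# Example: output collapses at step 5 (a string outage, say)
expected = [0, 20, 45, 60, 70, 72, 70, 60, 45, 20]
observed = [0, 19, 44, 61, 69, 35, 69, 59, 46, 21]
print(flag_anomalies(expected, observed))  # → [5]
```

A production system would replace this single global z-score with the fusion of sensor-level and system-level evidence the summary describes.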

Arizona State University (2)

Project Name: Artificial Intelligence for Robust Integration of AMI and PMU Data to Significantly Boost Renewable Penetration
Location: Tempe, AZ
DOE Award Amount: $750,000
Awardee Cost Share: $190,000
Principal Investigator: Yang Weng
Project Summary: This project uses artificial intelligence and machine learning techniques to combine, synchronize, clean up, and interpolate electric data from numerous sources in order to more accurately estimate the state of the electric grid. This will ultimately allow the interconnection and/or operation of more photovoltaic (PV) systems and other distributed energy resources (DER) in power systems while simultaneously enhancing reliability, resiliency, and power quality. The research team will develop innovative measurement synchronization, data mining for bad-data detection and identification, and robust machine learning algorithms for unobservable areas.
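As a minimal stand-in for the bad-data detection step (the project's actual estimators will be far more sophisticated), a robust median/MAD consistency test over redundant measurements can be sketched as follows; all names and data are illustrative:

```python
from statistics import median

def detect_bad_data(measurements, k=5.0):
    """Flag measurements inconsistent with the redundant majority,
    using a robust median/MAD test (a stand-in for the normalized-
    residual tests used in power-system state estimation)."""
    med = median(measurements)
    mad = median(abs(m - med) for m in measurements) or 1e-9
    return [i for i, m in enumerate(measurements)
            if abs(m - med) / mad > k]

# Five redundant voltage readings (p.u.); meter 3 is faulty
readings = [1.02, 1.01, 1.03, 1.45, 1.02]
print(detect_bad_data(readings))  # → [3]
```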

Camus Energy

Project Name: Improving Grid Awareness by Empowering Utilities with Machine Learning and Artificial Intelligence
Location: San Francisco, CA
DOE Award Amount: $750,000
Awardee Cost Share: $750,000
Principal Investigator: Cody Smith
Project Summary: This project uses artificial intelligence and machine learning methods to provide grid operators and engineers with real-time analysis and visualization capabilities for the electric power system. Cloud computing approaches to system monitoring and real-time analytics provide a model for leveraging multiple data sources to correlate, verify, and interpret system telemetry in environments with high scale and low data fidelity. Experience from systems design in related fields shows that in sufficiently complex systems, no single data source can be entirely accurate or trustworthy, but an approach that leverages multiple sources and applies intelligent data interpretation can provide an extremely reliable, high-fidelity view of the system. This project leverages the team's experience with cloud systems monitoring and abundant data for artificial intelligence model training, along with capabilities in integrated power system simulation and machine learning and deep learning analysis of monitoring data, to provide advanced, integrated situational awareness for the distribution grid and contribute to area-wide flexibility.
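The multi-source principle (no single feed is fully trusted, but many feeds fused intelligently are) can be sketched with simple inverse-variance weighting. This is an assumption chosen for illustration, not Camus Energy's implementation:

```python
def fuse(estimates):
    """Combine independent noisy estimates of the same grid quantity
    (e.g., feeder load from SCADA, AMI rollups, and a model) by
    inverse-variance weighting: noisier sources get less weight.
    estimates: list of (value, variance) pairs."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused value and its variance

# Three sources agree on roughly 5 MW; the fused variance
# is lower than that of any single source
val, var = fuse([(5.1, 0.04), (4.9, 0.09), (5.3, 0.25)])
print(round(val, 3), round(var, 4))  # → 5.065 0.0249
```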

Case Western Reserve University

Project Name: Robust PV Performance Loss Rate Determination and Power Forecasting: Using Spatiotemporal Graph Neural Network Models in a Reliable System-Topology-Aware Learning Framework
Location: Cleveland, OH
DOE Award Amount: $750,000
Awardee Cost Share: $200,000
Principal Investigator: Roger French
Project Summary: This project uses artificial intelligence and machine learning techniques to analyze data from a large number of neighboring photovoltaic (PV) systems in order to extract information about their short- and long-term performance. Machine learning methods will be used to overcome data quality issues affecting individual plants. The development of spatiotemporal graph neural network models will address critical questions of long- and short-term performance for fleets of PV plants. The proposed approach advances both analytical techniques for assessing long-term PV power plant performance and deep learning methods, and can mitigate the negative impact of plant or sensor failures and unreliable input data.
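A baseline performance-loss-rate calculation, of the kind the graph-based models would refine with data from neighboring plants, might look like the following ordinary-least-squares sketch (names and data are illustrative):

```python
def loss_rate(years, perf_index):
    """Ordinary least-squares slope of a PV performance index over
    time, expressed as percent per year of year-0 performance."""
    n = len(years)
    mx = sum(years) / n
    my = sum(perf_index) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, perf_index))
             / sum((x - mx) ** 2 for x in years))
    return 100.0 * slope / perf_index[0]

# A plant degrading roughly half a percent per year
years = [0, 1, 2, 3, 4, 5]
index = [1.000, 0.995, 0.990, 0.985, 0.980, 0.975]
print(round(loss_rate(years, index), 2))  # → -0.5
```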

National Renewable Energy Laboratory

Project Name: Deep-Learning-Powered Probabilistic Net-Load Forecasting with Enhanced Behind-the-Meter PV Visibility
Location: Golden, CO
DOE Award Amount: $750,000
Awardee Cost Share: $230,000
Principal Investigator: Rui Yang
Project Summary: This project uses artificial intelligence and machine learning techniques to predict the electric load one day in advance in areas that have large amounts of behind-the-meter solar. That information will allow operators to manage the electric grid more efficiently and cost-effectively. The deep-learning-powered probabilistic forecasting framework for day-ahead net-load at substations will separate behind-the-meter photovoltaic (PV) generation from net-load measurements and quantify its impact on net-load patterns. A novel transfer learning method will be developed to transfer the knowledge learned from geographic locations with rich sensor data to diverse locations where only the substation measurements are available. The framework will be validated using measurement data from Hawaiian Electric Company and on the Solar Forecast Arbiter platform.
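The behind-the-meter separation step can be illustrated with a toy regression against a clear-sky PV shape. This simplification assumes load is uncorrelated with the solar profile, unlike the project's deep-learning framework; all names are hypothetical:

```python
def split_net_load(net_load, pv_proxy):
    """Toy disaggregation: regress net load on a normalized clear-sky
    PV shape; the negated slope estimates installed BTM PV capacity.
    Returns (estimated_pv, estimated_load)."""
    n = len(net_load)
    mx = sum(pv_proxy) / n
    my = sum(net_load) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(pv_proxy, net_load))
             / sum((x - mx) ** 2 for x in pv_proxy))
    capacity = -slope
    pv = [capacity * x for x in pv_proxy]
    load = [nl + p for nl, p in zip(net_load, pv)]
    return pv, load

# Synthetic day: flat 10 MW load, 4 MW of BTM PV at solar noon
proxy = [0.0, 0.5, 1.0, 0.5, 0.0]
true_load = [10.0] * 5
net = [l - 4.0 * x for l, x in zip(true_load, proxy)]
pv, load = split_net_load(net, proxy)
print(round(pv[2], 2), round(load[2], 2))  # → 4.0 10.0
```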

North Carolina State University

Project Name: Day-Ahead Probabilistic Forecasting of Net-Load and Demand Response Potentials with High Penetration of Behind-the-Meter Solar-plus-Storage
Location: Raleigh, NC
DOE Award Amount: $750,000
Awardee Cost Share: $190,000
Principal Investigator: Wenyuan Tang
Project Summary: This project leverages artificial intelligence and machine learning techniques to predict the electric load in areas with large amounts of solar energy and enable more efficient grid operation. The technology will also be able to forecast the capacity available to the grid from electric loads that can be turned on or off depending on the balance between electric demand and generation. Recent advances in artificial intelligence can enhance the accuracy of net-load forecasting, the observability of net-load variability, and the understanding of the coupling between net-load and demand response potentials. The two hybrid probabilistic forecasting models under development will provide better spatiotemporal information.
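A generic way a deterministic forecast becomes a probabilistic one is to wrap it in empirical quantiles of its past errors. This is a common technique shown for illustration, not the team's two models; the names and numbers are hypothetical:

```python
def quantile(xs, q):
    """Empirical quantile by linear interpolation."""
    s = sorted(xs)
    pos = q * (len(s) - 1)
    lo, hi = int(pos), min(int(pos) + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

def probabilistic_forecast(point_forecast, past_errors, qs=(0.1, 0.5, 0.9)):
    """Wrap a deterministic net-load forecast in empirical quantile
    bands derived from its historical errors."""
    return {q: point_forecast + quantile(past_errors, q) for q in qs}

errors = [-3, -2, -1, 0, 0, 1, 1, 2, 3, 4]  # MW, from past days
bands = probabilistic_forecast(50.0, errors)
print(round(bands[0.1], 1), round(bands[0.5], 1), round(bands[0.9], 1))
```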

Northeastern University

Project Name: Graph-Learning-Assisted State and Event Tracking for Solar-Penetrated Power Grids with Heterogeneous Data Sources
Location: Boston, MA
DOE Award Amount: $750,000
Awardee Cost Share: $420,000
Principal Investigator: Ali Abur
Project Summary: This project uses artificial intelligence and machine learning techniques to integrate electric data and use it to calculate the state of the electric network. The resulting tool will be able to detect connectivity changes and faults in the grid and update grid models accordingly, which will improve the situational awareness of power grids with large amounts of solar energy by exploiting a large volume of data and measurements available from a highly diverse set of sources. The project will also provide tools to detect and identify network topology changes due to unexpected disturbances or switching events by exploiting the recently developed sparse estimation methods in the data analytics area.
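The sparse-estimation idea for localizing topology changes can be reduced to soft-thresholding the residual between model-predicted and measured line flows. This is a toy sketch; the project's methods operate on full network models, and every name here is illustrative:

```python
def soft_threshold(x, lam):
    """Proximal operator of the L1 norm, the core step in the
    sparse (LASSO-style) estimators alluded to above."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def localize_change(predicted, measured, lam=0.5):
    """Sparse residual between model-predicted and measured line
    flows; nonzero entries point at lines whose status likely
    changed (e.g., an unreported breaker operation)."""
    return [i for i, (p, m) in enumerate(zip(predicted, measured))
            if soft_threshold(m - p, lam) != 0.0]

# Line 2 tripped: its measured flow drops to zero while the
# assumed-topology model still predicts 3.0 MW on it
predicted = [1.0, 2.0, 3.0, 1.5]
measured  = [1.1, 1.9, 0.0, 1.4]
print(localize_change(predicted, measured))  # → [2]
```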

Pacific Northwest National Laboratory

Project Name: VRN3P: Variational Recurrent Neural Network Based Net-Load Prediction under High Solar Penetration
Location: Richland, WA
DOE Award Amount: $750,000
Awardee Cost Share: $220,000
Principal Investigator: Soumya Kundu
Project Summary: This project is using artificial intelligence and machine learning techniques to create an open-source tool that can predict the day-ahead electric load in areas with large amounts of behind-the-meter solar and deliver savings in the operation of the electric network. The project team will develop and validate a variational recurrent model-based algorithm for time-series forecasting of net-load under high solar penetration scenarios. Considering the uncertainty of cloud cover, solar irradiance, geographic information, and end-use load, the algorithm will deliver theoretically guaranteed tight bounds on the net-load prediction. Comprehensive validation of the proposed variational recurrent model-based net-load prediction algorithm will be performed using real-world industry and utility data.
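One generic way to obtain distribution-free bounds of the kind the summary mentions is split-conformal prediction, shown here purely as an illustration of guaranteed-coverage intervals, not the project's variational recurrent approach; all names and data are hypothetical:

```python
import math

def conformal_bounds(point_forecasts, calib_errors, alpha=0.2):
    """Split-conformal prediction: symmetric bounds with guaranteed
    (1 - alpha) coverage under exchangeability, illustrating a
    distribution-free guarantee on forecast intervals."""
    s = sorted(abs(e) for e in calib_errors)
    n = len(s)
    # conformal quantile index: ceil((n+1)(1-alpha)) - 1, clipped
    k = min(math.ceil((n + 1) * (1 - alpha)) - 1, n - 1)
    width = s[k]
    return [(f - width, f + width) for f in point_forecasts]

# Held-out calibration errors (MW) from past forecasts
calib = [0.2, -0.5, 0.1, 0.8, -0.3, 0.4, -0.6, 0.7, 0.2, -0.1]
lo, hi = conformal_bounds([42.0], calib, alpha=0.2)[0]
print(round(lo, 1), round(hi, 1))  # → 41.3 42.7
```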

Stanford University

Project Name: Machine-Learning-Based Mapping and Modeling of Solar Energy with Ultra-High Spatiotemporal Granularity
Location: Redwood City, CA
DOE Award Amount: $500,000
Awardee Cost Share: $250,000
Principal Investigator: Ram Rajagopal
Project Summary: The widespread adoption of rooftop photovoltaic systems and grid-scale solar systems requires a significant change in how system operators, utilities, and solar system providers map system adoption, track its impact, and plan new deployments. Currently available information often lacks crucial details about time and location. The availability of such information would change how the system is planned and managed. This proposal plans to use artificial intelligence (AI) and machine learning (ML) methods to map the deployment of photovoltaic (PV) systems and the distribution network across the country with high accuracy and detail. The final product will be an up-to-date country-wide database that can be made available to researchers or utilities, while protecting private data. The team will utilize advances in AI to combine information available at a large scale – such as satellite imagery, Google Street View images, and high-resolution irradiance data from weather stations – to generate the historical location and size of all solar deployments in any given country and reconstruct the transmission or distribution grids as necessary. The project will develop novel machine learning approaches to use this data and address a variety of applications such as identifying bottlenecks, estimating the hosting capacity of distribution systems, locating electric storage, improving wholesale price predictions, and creating more accurate models of consumer adoption.
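As an intentionally crude stand-in for the imagery-classification step (real mapping pipelines use convolutional neural networks trained on labeled tiles), a threshold-based tile classifier conveys the basic idea; every name here is hypothetical:

```python
def has_pv(tile, dark_threshold=0.3, min_fraction=0.2):
    """Toy classifier: flag a satellite tile as containing rooftop
    PV if enough pixels are darker than a reflectance threshold.
    tile is a 2D list of normalized pixel reflectance values."""
    pixels = [p for row in tile for p in row]
    dark = sum(1 for p in pixels if p < dark_threshold)
    return dark / len(pixels) >= min_fraction

roof_with_pv = [[0.9, 0.1, 0.1], [0.9, 0.1, 0.1], [0.9, 0.9, 0.9]]
bare_roof    = [[0.9, 0.8, 0.9], [0.8, 0.9, 0.8], [0.9, 0.8, 0.9]]
print(has_pv(roof_with_pv), has_pv(bare_roof))  # → True False
```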

University of Wisconsin-Madison

Project Name: Learned Productivity Under Variable Solar Conditions
Location: Madison, WI
DOE Award Amount: $750,000
Awardee Cost Share: $190,000
Principal Investigator: Michael Wagner
Project Summary: This project leverages artificial intelligence and machine learning techniques to model a number of concentrating solar thermal power (CSP) plant operations in order to assist human operators in their decisions, especially during variable cloudiness conditions. The machine learning techniques will be applied to extensive, high-resolution, inferred direct normal irradiance (DNI) data, cloud profile and vector data, and related solar field thermal collection data in order to develop prescriptive models that optimize solar field collection under variable conditions while minimizing long-term receiver damage and other negative effects. The project will validate the method at an operating CSP facility and publish methodological details for broader use.

Learn more about the SETO 2020 funding program and the project selections in the other topics.

Learn more about SETO’s other competitive awards.

 

Original post: https://www.energy.gov/eere/solar/seto-2020-artificial-intelligence-applications-solar-energy
