Transforming Weather Forecasting Using AI/ML-Powered Analytics


The recent energy debacle in Texas left customers facing astronomical bills while nearly freezing in their homes. At the same time, the Biden administration's climate policy has generated a storm of controversy over how to measure success, underscoring the need for accurate data. Against this backdrop, Chris Kalima, VP of product management at Intertrust Technologies Corporation, shares his insights on data interoperability during weather emergencies.

Weather control is surely the stuff of superhero comic books and conspiracy theories. But there is something well within our control: data.

The recent “Texas Freeze” natural disaster is a stark illustration of the importance of energy data and the need for reliable, secure methods of data collaboration. Millions of people faced power outages and endured near-freezing, life-threatening indoor conditions during the “Freeze.” While the structure of the Texas energy market, combined with natural gas supply and generation problems, contributed to the fiasco, a lack of data and imperfect data flows also played a part. The impact of this natural disaster could have been reduced through more efficient energy data management.

Data, Deregulation, and Isolation

Energy data comes from diverse sources, including IoT sensors, weather and climate datasets, and traditional databases. The data is housed in remote systems that typically don’t interoperate. These systems might be owned by separate organizations and subject to different data access rules. Many of these rules are driven by concerns of data loss or theft while moving data for analysis or reporting. Energy data can also contain a lot of sensitive information, requiring adherence to complex regulations.

In Texas’ largely deregulated energy market, lack of centralized access to data creates obvious challenges. Unlike other electricity markets, Texas’ grid operator, ERCOT, runs an “energy-only” market that does not feature capacity markets or other capacity mechanisms. Texas also has one of the lowest energy reserve margins in the country. On top of that, most of Texas is largely isolated from neighboring power systems. Useful historical data points that could have led to better processes, including recommendations stemming from the rolling outages that threatened Texas during Super Bowl XLV in 2011, appear to have been ignored.

This set the stage for the ‘perfect storm’ of events that led to the major outages in February. As the cold front continued, electricity demand spiked. Natural gas is the primary electricity source in the state, but as much as 40% of natural gas capacity was suddenly unavailable because of the cold. Many power plants suffered outages, and ERCOT initiated load shedding to protect the grid from complete collapse.

Before the major outages, peak demand for the state was projected to be as high as 76 GW, almost the same as Texas’ annual peak demand, which is usually driven by heavy air conditioning usage in the summer. With reserve margins declining to a low near 1 GW, natural gas and electricity prices rose rapidly from the normal rate of $25 per MWh, stopping only at the state-imposed cap of $9,000 per MWh. Unfortunately, the power companies were flying blind as their infrastructure began to collapse and could not keep pace with demand.
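To put those numbers in perspective, a back-of-the-envelope calculation using the approximate figures above (actual ERCOT values varied hour to hour, so this is illustrative only) shows how little headroom the grid had left:

```python
# Rough reserve-margin arithmetic using the approximate figures above.
# Actual ERCOT values varied hour to hour; this is illustrative only.
projected_peak_gw = 76.0   # projected winter peak demand
reserve_gw = 1.0           # capacity headroom above demand at the low point

reserve_margin_pct = reserve_gw / projected_peak_gw * 100
print(f"Reserve margin: {reserve_margin_pct:.1f}% of peak demand")
# Roughly 1.3% -- a single large plant tripping offline erases it.

price_normal = 25       # $/MWh, typical wholesale rate
price_cap = 9_000       # $/MWh, state-imposed cap reached during the event
print(f"Price increase: {price_cap / price_normal:.0f}x the normal rate")
```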


The Need for Data-Driven Decision-Making

The storm’s fallout could have been mitigated by data-driven decision-making, specifically through better use of industrial IoT and AI/ML operations, collectively called Asset Performance Management (APM).

Renewable energy companies rely on APM and predictive algorithms to detect equipment failures based on past performance, maintenance history, and correlation studies with similar equipment. However, a report by Grid Strategies, LLC found that these predictive solutions are not applied to conventional energy generators. Despite nearly a dozen correlated outage events over the last decade, resource planners still consider conventional generator outages to be fully independent of one another.
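As a rough illustration of what such a predictive layer looks like in practice, the sketch below trains a simple failure-probability model on historical operating and maintenance features. The feature names, synthetic data, and model choice are assumptions made for illustration, not a description of any particular vendor's APM product.

```python
# Minimal sketch of an APM-style failure-prediction model.
# All data, feature names, and coefficients are synthetic/illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical features: past performance, maintenance history,
# and ambient conditions shared with similar units.
X = np.column_stack([
    rng.normal(500, 120, n),   # operating hours since last service
    rng.integers(0, 6, n),     # number of past failures
    rng.normal(10, 12, n),     # ambient temperature (C)
    rng.normal(0.7, 0.15, n),  # load factor
])

# Synthetic labels: failures become more likely with cold weather,
# deferred maintenance, and prior failure history.
logit = 0.004 * X[:, 0] + 0.5 * X[:, 1] - 0.08 * X[:, 2] + 1.5 * X[:, 3] - 5.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The same modeling approach could, in principle, be extended to conventional generators by adding shared weather exposure as a feature, which is exactly what treating outages as fully independent fails to capture.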

A reliable solution for exchanging energy data would have had a tremendous positive impact within the Texas energy ecosystem. Effective data communication technologies would have ensured timely messaging and reporting. Sensors attached to natural gas wells could have fired off proactive alerts indicating falling temperatures. Based on these alerts, stakeholders could have taken immediate measures to prevent a shutdown. Data systems could also have been set up to automatically collect data from internal and external sources, both historical and real-time. Run through AI/ML-powered analytics, this data would have provided powerful insights into potential problems.
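A minimal sketch of the alerting step described above: wellhead temperature readings are checked against a freeze-risk threshold, and an alert is raised for any asset trending toward it. The threshold value, asset IDs, and the notify() stub are hypothetical placeholders, not part of any real system.

```python
# Sketch of a freeze-risk alert on streaming wellhead temperatures.
# Threshold, asset IDs, and the notify() stub are hypothetical.
from dataclasses import dataclass

FREEZE_RISK_C = 0.0  # assumed alerting threshold, in Celsius

@dataclass
class Reading:
    asset_id: str
    temp_c: float
    trend_c_per_hr: float  # simple slope over the last few readings

def notify(message: str) -> None:
    # Placeholder for an SMS, e-mail, or SCADA integration.
    print("ALERT:", message)

def check_freeze_risk(readings: list[Reading], hours_ahead: float = 6.0) -> None:
    for r in readings:
        projected = r.temp_c + r.trend_c_per_hr * hours_ahead
        if projected <= FREEZE_RISK_C:
            notify(f"{r.asset_id}: projected {projected:.1f} C within "
                   f"{hours_ahead:.0f} h, dispatch winterization crew")

check_freeze_risk([
    Reading("well-042", temp_c=4.2, trend_c_per_hr=-1.1),
    Reading("well-117", temp_c=9.5, trend_c_per_hr=-0.3),
])
```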

The Data Is Out There

Grid reforms are a great step toward ensuring that the next weather emergency is not an unmitigated disaster. But beyond that, there is a critical need to modernize communications and implement upgraded data management processes and tools. Fortunately, trusted data management and collaboration solutions already exist for utilities, grid operators, and other stakeholders involved with energy-related infrastructure. Similarly, scientists and environmental engineers can already securely integrate climate and weather data into data-driven workflows and use it for weather modeling, risk forecasting, predictive maintenance, and more. The data is out there. Let’s use it.
