Meteorology has always grappled with the problem of big data. I would even propose that the science was the epitome of big data before the term became mainstream. Due to the multivariate and chaotic nature of weather, meteorologists have spent more than half a century wrangling terabytes of data and modeling variables to produce an accurate forecast. Today, we are still processing data – now on the scale of petabytes – thanks to the Internet of Things, more sensors, and ensemble modeling. Writer Ted Alcorn estimates that, “Today’s (weather) models incorporate about 100 million pieces of data each day, a level of complexity comparable to simulations of the human brain or the birth of the universe.”
But computing power and the advancement of technologies such as AI have allowed us not only to analyze the data more quickly and easily, but also to “learn” from historical data for better situational awareness and decision-making. Within the weather community, AI is being applied to several different challenges. One focus is making a better weather forecast.
Forecasting is becoming increasingly accurate. Today a five-day forecast has 90% accuracy, the same as a three-day forecast 25 years ago. Short-term prediction, or nowcasting on hourly time spans, is more challenging, particularly due to micro-scale changes at the surface. Scientists at DeepMind and the University of Exeter have partnered with the U.K. Met Office to build an AI-based nowcasting system that would overcome these challenges and make more accurate short-term predictions, including for critical storms and floods. Another research study is looking at the efficiency of modeling and how AI can analyze past weather patterns to predict future events more efficiently and more accurately.
My focus of work – and the area of AI that I am particularly interested in – is its application to predicting the potential impact of weather events: the outcomes of weather, as opposed to the weather itself.
For example, AI can be used in the utility sector to predict potential outages. Historical outage data is collected for a specific utility location, or region, and allows a computer to generate predictions for future needs based on forecasted weather conditions. The model learns how infrastructure has responded to past storms, accounting for differences in network hardening, the age of individual infrastructure components, and maintenance practices. These datasets yield a baseline of potential outages from upcoming storms. We can apply the same approach to municipalities. By combining variables such as a city’s infrastructure, topography, and evacuation routes with historical weather data, we can help cities gain better insight into potential areas of impact and risk to public and infrastructure safety.
And, while we talk about advanced technology and insights, I think it is important to note that the human element is still crucial to the process. A recent Wired article cited studies that found human forecasters produced more accurate forecasts than AI.
Another area that requires human intervention is the increasing need for risk communicators. These are meteorologists who take the forecast further and convey the risk or impact to a business, municipality, or the public. I have heard several comments that once AI is more trustworthy, it will be as simple as toggling weather preferences to have accurate, meaningful weather data on demand. While I agree that we will have progressively better data and forecasts, I believe this will also increase the need for human experts to evaluate, interpret, and communicate the data – and the risk and impact – in a way that makes sense to those who must make nimble, informed decisions to protect people, infrastructure, and business assets. The bigger question shouldn’t be human versus AI forecasts, but rather how meteorologists can use improved AI to help decision makers make the best decisions for their stakeholders.