How best to deal with climate change? Data analytics – here’s why – WRAL TechWire

Editor’s note: Tom Snyder, executive director of rapidly growing Raleigh-based RIoT and a thought leader in the emerging Internet of Things, is the newest columnist to join WRAL TechWire’s list of top drawer contributors. “Datafication Nation” premieres today. His columns are part of WRAL TechWire’s Startup Monday package.

+++

RALEIGH – Multiple news sources reported last week that the US has surpassed its previous record for $1B+ environmental incidents in a single year. Twenty-three hurricanes, floods, wildfires, hailstorms, heat waves and other weather events have each cost more than $1B, and collectively they have cost $57.6B in the US alone.

This figure does not include non-weather related natural disasters like earthquakes or volcanic eruptions.

And we still have 109 days left in 2023.

Numerous geopolitical, corporate-lobbying and partisan-political factors have made regulatory policy slow to develop and often ineffective in combating climate change. So where else do we look for solutions?

Data Analytics.

The continued rise in computing power, expanding environmental sensor deployment and advances in wireless connectivity give us hope. The cost to solve massive computational problems, like sequencing the human genome or modeling a hurricane, has fallen by seven to eight orders of magnitude in the past 20 years. That’s a reduction from hundreds of millions of dollars to a few hundred dollars for many of the world’s most complex analytical challenges.


‘Digital Twins’ at SAS

We finally have the source data (through sensors) in real-time (through wireless networks) and computing capability (in data centers) to create a digital twin of the weather at a global scale.  Here in RTP, SAS Institute is creating local and planet-scale digital twins of environmental systems for the exact purpose of predicting and eventually preventing climate-related incidents.

Flooding is an area with early traction. I had the pleasure of speaking with Tyson Echentile, who leads IoT and Streaming Analytics Business Development at SAS. SAS analytics are showing that historical models of 1,000-year storms are no longer relevant, with such severe weather events now trending toward 10-year frequency. Future preparedness will depend largely on the deployment of dynamic digital twins, fed with ground-truth data from sensors deployed in our real environment and coupled with simulations run analytically in the cloud.

Here in the Triangle, SAS runs approximately 8,000 simultaneous models of the Cary watershed every 5-6 minutes. By continuously feeding real-time, factual data into these models as they simulate thousands of “possible” future events, the models are persistently trained and improved. Over time, the accuracy of future weather prediction improves and we gain longer advance notice.
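To make the idea concrete, here is a minimal sketch of that ensemble approach: many candidate models run in parallel, each new sensor reading scores every model against ground truth, and the best-scoring model drives the forecast. This is purely illustrative; the class, names and numbers below are assumptions for the sketch, not SAS’s actual implementation.

```python
class WatershedModel:
    """One candidate model: predicts runoff level from rainfall.

    Illustrative stand-in for one of the thousands of simultaneous
    simulations described in the article.
    """
    def __init__(self, runoff_coeff):
        self.runoff_coeff = runoff_coeff  # fraction of rain that becomes runoff
        self.error = 0.0                  # cumulative prediction error

    def predict(self, rainfall_mm):
        return self.runoff_coeff * rainfall_mm


def update_ensemble(models, rainfall_mm, observed_level):
    """Score every model against a new ground-truth sensor reading
    and return the best-performing model so far."""
    for m in models:
        m.error += abs(m.predict(rainfall_mm) - observed_level)
    return min(models, key=lambda m: m.error)


# Build a small ensemble spanning a range of runoff assumptions.
models = [WatershedModel(c / 10) for c in range(1, 11)]

# Feed in simulated sensor readings; the "true" coefficient here is 0.6.
best = None
for rainfall in [10, 25, 5, 40]:
    observed = 0.6 * rainfall
    best = update_ensemble(models, rainfall, observed)

print(round(best.runoff_coeff, 1))  # the ensemble converges on 0.6
```

As more readings arrive, models that match reality accumulate less error, which is the same mechanism, at toy scale, by which a continuously fed digital twin gets more accurate over time.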


IBM-NASA partner up

Two weeks ago, IBM and NASA jointly announced the open-sourcing of more than 250,000 terabytes of geospatial data (satellite imagery) on the open AI platform Hugging Face. Their goal is to crowdsource environmental research as a means to accelerate climate change mitigation. This benefits both large companies and aspiring entrepreneurs.

Corey White is the founder of Open Plains, a startup developing environmental modeling tools that spun out of an NC State research lab. Like SAS, Open Plains is working on watershed modeling, with a goal of providing stronger analytics tools to the public sector. His tool allows municipalities to model what-if scenarios without needing specialized data science or software coding skills. Like SAS’s, his tools are based on real data, increasingly available from IoT and sensor networks.

There are dozens of applications for this kind of predictive analytics.  Vehicles can be automatically re-routed before intersections flood, via integrations with Google Maps and Waze. First responders can be pre-positioned to areas that will be hit hardest. We get more time to evacuate people and protect assets from flash floods. And in the long run, we better inform future land development and infrastructure deployment.

Flooding is just one use case for these environmental digital twins. Coastal erosion. Wildfire suppression. Tornado prediction. We can make significant advances in all of these areas with the power of continuous analytics applied to digital twins continuously updated with real-time sensor data. Here’s hoping these methods are non-partisan enough to accelerate solutions to our policy challenges as quickly as they solve climate challenges.
