Explainable Climate Science


2022-01-31


Zack Labe, a Post-Doctoral Researcher at Colorado State University, joins us today to discuss his work "Detecting Climate Signals using Explainable AI with Single Forcing Large Ensembles."

Deep learning approaches have been broadly successful across a wide variety of fields and applications. It’s no surprise that climate science has leveraged these approaches to create and study models for predicting climate change. If the predictions of such models are useful, they should align with the intuitions of climatologists who know the Earth system well.

Independent of climate science, layer-wise relevance propagation is a popular technique that uses a neural network's weights and activations to redistribute its prediction back onto the input, producing a heatmap over an image and giving some insight into which regions most influenced the network's decision.
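To make the idea concrete, here is a minimal sketch of the LRP epsilon rule on a toy fully-connected ReLU network in NumPy. The architecture, weights, and input below are placeholders invented for illustration; they are not the network or data from Zack's paper, where the inputs are gridded climate-model fields rather than a small feature vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy network: 8 inputs -> 6 hidden units -> 3 outputs, ReLU activations.
# Weights and input are random placeholders, not values from any real model.
weights = [rng.normal(size=(8, 6)), rng.normal(size=(6, 3))]
biases = [np.zeros(6), np.zeros(3)]

def forward(x):
    """Run the network, keeping every layer's activations for later reuse."""
    activations = [x]
    for W, b in zip(weights, biases):
        x = np.maximum(0.0, x @ W + b)  # ReLU layer
        activations.append(x)
    return activations

def lrp_epsilon(x, target_class, eps=1e-6):
    """Redistribute the target class score back onto the input features."""
    activations = forward(x)
    # Start with all relevance placed on the chosen output neuron.
    relevance = np.zeros_like(activations[-1])
    relevance[target_class] = activations[-1][target_class]
    # Walk backward through the layers, applying the epsilon rule.
    for layer in reversed(range(len(weights))):
        a = activations[layer]                     # inputs to this layer
        W, b = weights[layer], biases[layer]
        z = a @ W + b                              # pre-activations
        s = relevance / (z + eps * np.where(z >= 0, 1.0, -1.0))
        relevance = a * (W @ s)                    # relevance for lower layer
    return relevance                               # one value per input feature

x = rng.normal(size=8)
print(lrp_epsilon(x, target_class=0))  # relevance "heatmap" over the 8 inputs
```

For image-like or map-like inputs, the per-feature relevances are simply reshaped back onto the spatial grid, which yields the kind of heatmap discussed in the episode.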

In this interview, we hear about Zack’s experience generating and inspecting these interpretability results. A sign of a good model is that it relies on features that are intuitive to a domain expert. Listen to hear how Zack’s expectations align with the generated results.

Zack Labe


Thanks to our sponsors for their support

Astrato is a modern BI and analytics platform built for the Snowflake Data Cloud: a next-generation live-query data visualization and analytics solution that empowers everyone to make decisions with live data.
https://astrato.io