Explainable Climate Science



2022-01-31


Zack Labe, a Post-Doctoral Researcher at Colorado State University, joins us today to discuss his work Detecting Climate Signals using Explainable AI with Single Forcing Large Ensembles.

Deep learning approaches have been broadly successful in a wide variety of fields and applications. It's no surprise that climate science has leveraged these approaches to build and study models for predicting climate change. If the predictions of such models are trustworthy, the features they rely on should align with the intuitions of climatologists who know the Earth system well.

Independent of climate science, layer-wise relevance propagation is a popular technique that propagates a network's output backward through its weights to produce a heatmap over the input, giving some insight into which regions of an image most influenced the network's decision.
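To make the idea concrete, here is a minimal sketch of layer-wise relevance propagation (the epsilon rule) on a tiny fully-connected ReLU network. The weights and inputs are random placeholders, not the actual model from the paper; the point is only to show how the output score gets redistributed backward, layer by layer, into a per-input relevance map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: 4 inputs -> 3 hidden (ReLU) -> 2 outputs.
# Random weights stand in for a trained model.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(3, 2))

x = rng.normal(size=4)
h = np.maximum(0, x @ W1)   # forward pass, hidden activations
y = h @ W2                  # output scores

def lrp_dense(a, W, R, eps=1e-6):
    """Redistribute relevance R from a dense layer's outputs to its
    inputs a, in proportion to each connection's contribution to the
    pre-activation (the LRP epsilon rule)."""
    z = a @ W                              # pre-activations
    s = R / (z + eps * np.where(z >= 0, 1.0, -1.0))  # stabilized ratio
    return a * (W @ s)                     # relevance per input unit

# Start with all relevance on the predicted class's score.
R_out = np.zeros_like(y)
k = np.argmax(y)
R_out[k] = y[k]

# Propagate backward; ReLU passes relevance through unchanged.
R_hidden = lrp_dense(h, W2, R_out)
R_input = lrp_dense(x, W1, R_hidden)

print(R_input)  # a 1-D "heatmap": how much each input contributed
```

On a 2-D input such as a climate field, the same per-pixel relevance values are what get rendered as the heatmaps discussed in the episode. A useful sanity check is conservation: the input relevances sum (approximately) to the output score being explained.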

In this interview, we hear about Zack's experience applying these interpretability techniques and inspecting the results. The sign of a good model is that it relies on features a domain expert finds intuitive. Listen to hear how Zack's expectations align with the generated heatmaps.

Zack Labe

I am a postdoctoral researcher in the Department of Atmospheric Science at Colorado State University. I received my Ph.D. from the Department of Earth System Science at the University of California, Irvine in May 2020 and a B.Sc. in Atmospheric Science from Cornell University in May 2015.