Are you a researcher or data scientist / analyst / ninja? Do you want to learn Bayesian inference, stay up to date, or simply understand what Bayesian inference is?
Then this podcast is for you! You’ll hear from researchers and practitioners of all fields about how they use Bayesian statistics, and how in turn YOU can apply these methods in your modeling workflow.
When I started learning Bayesian methods, I really wished there were a podcast out there that could introduce me to the methods, the projects and the people who make all that possible.
So I created “Learning Bayesian Statistics”, where you’ll get to hear how Bayesian statistics are used to detect dark matter in outer space, forecast elections or understand how diseases spread and can ultimately be stopped.
But this show is not only about successes — it’s also about failures, because that’s how we learn best. So you’ll often hear the guests talking about what *didn’t* work in their projects, why, and how they overcame these challenges. Because, in the end, we’re all lifelong learners!
My name is Alex Andorra by the way, and I live in Estonia. By day, I’m a data scientist and modeler at the PyMC Labs consultancy. By night, I don’t (yet) fight crime, but I’m an open-source enthusiast and core contributor to the Python packages PyMC and ArviZ. I also love election forecasting and, most importantly, Nutella. But I don’t like talking about it – I prefer eating it.
So, whether you want to learn Bayesian statistics or hear about the latest libraries, books and applications, this podcast is for you — just subscribe! You can also support the show and unlock exclusive Bayesian swag on Patreon!
The podcast Learning Bayesian Statistics is created by Alexandre Andorra. The podcast and its artwork are embedded on this page using the public podcast feed (RSS).
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
10:11 Understanding Structural Equation Modeling (SEM) and Confirmatory Factor Analysis (CFA)
20:11 Application of SEM and CFA in HR Analytics
30:10 Challenges and Advantages of Bayesian Approaches in SEM and CFA
33:58 Evaluating Bayesian Models
39:50 Challenges in Model Building
44:15 Causal Relationships in SEM and CFA
49:01 Practical Applications of SEM and CFA
51:47 Influence of Philosophy on Data Science
54:51 Designing Models with Confounding in Mind
57:39 Future Trends in Causal Inference
01:00:03 Advice for Aspiring Data Scientists
01:02:48 Future Research Directions
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström and Stefan.
Links from the show:
Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Bayesian Statistics and Epidemiology
03:35 Guest Backgrounds and Their Journey
10:04 Understanding Computational Biology vs. Epidemiology
16:11 The Role of Bayesian Statistics in Epidemiology
21:40 Recent Projects and Applications in Epidemiology
31:30 Sampling Challenges in Health Surveys
34:22 Model Development and Computational Challenges
36:43 Navigating Different Jargons in Survey Design
39:35 Post-COVID Trends in Epidemiology
42:49 Funding and Data Availability in Epidemiology
45:05 Collaboration Across Disciplines
48:21 Using Neural Networks in Bayesian Modeling
51:42 Model Diagnostics in Epidemiology
55:38 Parameter Estimation in Compartmental Models
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström and Stefan.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Bayesian Statistics and Bob Kubinec
06:01 Bob's Academic Journey and Research Focus
12:40 Measuring Corruption: Challenges and Methods
18:54 Transition from Government to Academia
26:41 The Influence of Non-Traditional Backgrounds in Statistics
34:51 Bayesian Methods in Political Science Research
42:08 Bayesian Methods in COVID Measurement
51:12 The Journey of Writing a Novel
01:00:24 The Intersection of Fiction and Academia
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke, Robert Flannery, Rasmus Hindström and Stefan.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to the Live Episode
02:55 Meet the Stan Core Developers
05:47 Brian Ward's Journey into Bayesian Statistics
09:10 Charles Margossian's Contributions to Stan
11:49 Recent Projects and Innovations in Stan
15:07 User-Friendly Features and Enhancements
18:11 Understanding Tuples and Their Importance
21:06 Challenges for Beginners in Stan
24:08 Pedagogical Approaches to Bayesian Statistics
30:54 Optimizing Monte Carlo Estimators
32:24 Reimagining Stan's Structure
34:21 The Promise of Automatic Reparameterization
35:49 Exploring BridgeStan
40:29 The Future of Samplers in Stan
43:45 Evaluating New Algorithms
47:01 Specific Algorithms for Unique Problems
50:00 Understanding Model Performance
54:21 The Impact of Stan on Bayesian Research
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang, Gary Clarke and Robert Flannery.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Bayesian Experimental Design
07:51 Understanding Bayesian Experimental Design
19:58 Computational Challenges in Bayesian Experimental Design
28:47 Innovations in Bayesian Experimental Design
40:43 Practical Applications of Bayesian Experimental Design
52:12 Future of Bayesian Experimental Design
01:01:17 Real-World Applications and Impact
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang and Gary Clarke.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Ravi and His Role at Seattle Sounders
06:30 Building an Analytics Department
15:00 The Impact of Analytics on Player Recruitment and Performance
28:00 Challenges and Innovations in Soccer Analytics
42:00 Player Health, Injury Prevention, and Training
55:00 The Evolution of Data-Driven Strategies
01:10:00 Future of Analytics in Sports
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan, Francesco Madrisotti, Ivy Huang and Gary Clarke.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Bayesian Modeling in Insurance
13:00 Time Series Models and Their Applications
30:51 Bayesian Model Averaging Explained
56:20 Impact of External Factors on Forecasting
01:25:03 Future of Bayesian Modeling and AI
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 The Role of Nutrition and Conditioning
05:46 Analyzing Player Performance and Managing Injury Risks
12:13 Educating Athletes on Dietary Choices
18:02 Emerging Trends in Baseball Science
29:49 Hierarchical Models and Player Analysis
36:03 Challenges of Working with Limited Data
39:49 Effective Communication of Statistical Concepts
47:59 Future Trends: Biomechanical Data Analysis and Computer Vision Algorithms
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Bayesian Statistics
07:32 Advantages of Bayesian Methods
16:22 Incorporating Priors in Models
23:26 Modeling Causal Relationships
30:03 Introduction to PyMC, Stan, and Bambi
34:30 Choosing the Right Bayesian Framework
39:20 Getting Started with Bayesian Statistics
44:39 Understanding Bayesian Statistics and PyMC
49:01 Leveraging PyTensor for Improved Performance and Scalability
01:02:37 Exploring Post-Modeling Workflows with ArviZ
01:08:30 The Power of Gaussian Processes in Bayesian Modeling
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
05:36 Tomi's Work and Teaching
10:28 Teaching Complex Statistical Concepts with Practical Exercises
23:17 Making Bayesian Modeling Accessible in Python
38:46 Advanced Regression with Bambi
41:14 The Power of Linear Regression
42:45 Exploring Advanced Regression Techniques
44:11 Regression Models and Dot Products
45:37 Advanced Concepts in Regression
46:36 Diagnosing and Handling Overdispersion
47:35 Parameter Identifiability and Overparameterization
50:29 Visualizations and Course Highlights
51:30 Exploring Niche and Advanced Concepts
56:56 The Power of Zero-Sum Normal
59:59 The Value of Exercises and Community
01:01:56 Optimizing Computation with Sparse Matrices
01:13:37 Avoiding MCMC and Exploring Alternatives
01:18:27 Making Connections Between Different Models
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Bayesian Statistics in Sports Analytics
18:29 Applying Bayesian Stats in Analyzing Player Performance and Injury Risk
36:21 Challenges in Communicating Bayesian Concepts to Non-Statistical Decision-Makers
41:04 Understanding Model Behavior and Validation through Simulations
43:09 Applying Bayesian Methods in Sports Analytics
48:03 Clarifying Questions and Utilizing Frameworks
53:41 Effective Communication of Statistical Concepts
57:50 Integrating Domain Expertise with Statistical Models
01:13:43 The Importance of Good Data
01:18:11 The Future of Sports Analytics
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Large-Scale Machine Learning
11:26 Scalable and Flexible Bayesian Inference with Posteriors
25:56 The Role of Temperature in Bayesian Models
32:30 Stochastic Gradient MCMC for Large Datasets
36:12 Introducing Posteriors: Bayesian Inference in Machine Learning
41:22 Uncertainty Quantification and Improved Predictions
52:05 Supporting New Algorithms and Arbitrary Likelihoods
59:16 Thermodynamic Computing
01:06:22 Decoupling Model Specification, Data Generation, and Inference
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 The Power and Importance of Priors
09:29 Updating Beliefs and Choosing Reasonable Priors
16:08 Assessing Robustness with Prior Sensitivity Analysis
34:53 Aligning Bayesian Methods with Researchers' Thinking
37:10 Detecting Overfitting in SEM
43:48 Evaluating Model Fit with Posterior Predictive Checks
47:44 Teaching Bayesian Methods
54:07 Future Developments in Bayesian Statistics
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction and Overview
09:27 The Power of Bayesian Analysis in Sports Modeling
16:28 The Revolution of Massive Data Sets in Sports Analytics
31:03 The Impact of Budget in Sports Analytics
39:35 Introduction to Sports Analytics
52:22 Plus-Minus Models in American Football
01:04:11 The Future of Sports Analytics
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, Marvin Schmitt introduces the concept of amortized Bayesian inference, where the upfront training phase of a neural network is followed by fast posterior inference.
Marvin will guide us through this new concept, discussing his work in probabilistic machine learning and uncertainty quantification, using Bayesian inference with deep neural networks.
He also introduces BayesFlow, a Python library for amortized Bayesian workflows, and discusses its use cases in various fields, while also touching on the concept of deep fusion and its relation to multimodal simulation-based inference.
A PhD student in computer science at the University of Stuttgart, Marvin is supervised by two LBS guests you surely know — Paul Bürkner and Aki Vehtari. Marvin’s research combines deep learning and statistics to make Bayesian inference fast and trustworthy.
In his free time, Marvin enjoys board games and is a passionate guitar player.
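For readers who want to see the “train once, infer instantly” pattern in code, here is a minimal, self-contained sketch of amortized posterior inference on a toy Gaussian model. It is deliberately not BayesFlow’s API (listen to the episode for that); the toy simulator, the summary statistics, and the Gaussian posterior family are all illustrative assumptions, but the pattern is the one Marvin describes: simulate (parameter, data) pairs up front, train a small network to map data summaries to an approximate posterior, then reuse it for near-instant inference on new data.

```python
# Illustrative sketch of amortized Bayesian inference (not BayesFlow's API).
# Upfront: train a network on simulated (theta, data) pairs.
# Afterwards: the posterior approximation for any new dataset is one forward pass.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(42)

def simulate(n_sims, n_obs=10):
    """Toy simulator: theta ~ N(0, 1), data = n_obs noisy observations of theta."""
    theta = rng.normal(0.0, 1.0, size=n_sims)
    x = rng.normal(theta[:, None], 1.0, size=(n_sims, n_obs))
    return theta.astype(np.float32), x.astype(np.float32)

def summaries(x):
    """Hand-picked summary statistics of each simulated dataset."""
    return np.stack([x.mean(axis=1), x.std(axis=1)], axis=1)

theta_train, x_train = simulate(20_000)
net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(),
                    nn.Linear(64, 64), nn.ReLU(),
                    nn.Linear(64, 2))                     # outputs (mean, log_std)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
s, t = torch.tensor(summaries(x_train)), torch.tensor(theta_train)

for _ in range(2_000):                                    # the expensive, one-off training phase
    opt.zero_grad()
    mu, log_sigma = net(s).unbind(dim=1)
    # Negative log-likelihood of the true theta under the predicted Gaussian posterior.
    loss = (log_sigma + 0.5 * ((t - mu) / log_sigma.exp()) ** 2).mean()
    loss.backward()
    opt.step()

theta_new, x_new = simulate(1)                            # a brand-new dataset
with torch.no_grad():                                     # fast inference: no MCMC, one forward pass
    mu, log_sigma = net(torch.tensor(summaries(x_new)))[0]
print(f"true theta = {theta_new[0]:.2f}, "
      f"approximate posterior = N({mu.item():.2f}, {log_sigma.exp().item():.2f})")
```

For this conjugate toy model the exact posterior standard deviation is about 0.30 (with 10 observations, unit noise and a unit-variance prior), which gives a quick sanity check that the amortized approximation is behaving sensibly.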
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
Chapters:
00:00 Introduction to Amortized Bayesian Inference
07:39 Bayesian Neural Networks
11:47 Amortized Bayesian Inference and Posterior Inference
23:20 BayesFlow: A Python Library for Amortized Bayesian Workflows
38:15 Self-consistency loss: Bridging Simulation-Based Inference and Likelihood-Based Bayesian Inference
41:35 Amortized Bayesian Inference
43:53 Fusing Multiple Sources of Information
45:19 Compensating for Missing Data
56:17 Emerging Topics: Expressive Generative Models and Foundation Models
01:06:18 The Future of Deep Learning and Probabilistic Machine Learning
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
If there is one guest I don’t need to introduce, it’s mister Andrew Gelman. So… I won’t! I will refer you back to his two previous appearances on the show though, because learning from Andrew is always a pleasure. So go ahead and listen to episodes 20 and 27.
In this episode, Andrew and I discuss his new book, Active Statistics, which focuses on teaching and learning statistics through active student participation. Like this episode, the book is divided into three parts: 1) The ideas of statistics, regression, and causal inference; 2) The value of storytelling to make statistical concepts more relatable and interesting; 3) The importance of teaching statistics in an active learning environment, where students are engaged in problem-solving and discussion.
And Andrew is so active and knowledgeable that we of course touched on a variety of other topics — but for that, you’ll have to listen ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Active learning is essential for teaching and learning statistics.
- Storytelling can make statistical concepts more relatable and interesting.
- Teaching statistics in an active learning environment engages students in problem-solving and discussion.
- The book Active Statistics includes 52 stories, class participation activities, computer demonstrations, and homework assignments to facilitate active learning.
- Active learning, where students actively engage with the material through activities and discussions, is an effective approach to teaching statistics.
- The flipped classroom model, where students read and prepare before class and engage in problem-solving activities during class, can enhance learning and understanding.
- Clear organization and fluency in teaching statistics are important for student comprehension and engagement.
- Visualization plays a crucial role in understanding statistical concepts and aids in comprehension.
- The future of statistical education may involve new approaches and technologies, but the challenge lies in finding effective ways to teach basic concepts and make them relevant to real-world problems.
Chapters:
00:00 Introduction and Background
08:09 The Importance of Stories in Statistics Education
30:28 Using 'Two Truths and a Lie' to Teach Logistic Regression
38:08 The Power of Storytelling in Teaching Statistics
57:26 The Importance of Visualization in Understanding Statistics
01:07:03 The Future of Statistical Education
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, Andy Aschwanden and Doug Brinkerhoff tell us about their work in glaciology and the application of Bayesian statistics in studying glaciers. They discuss the use of computer models and data analysis in understanding glacier behavior and predicting sea level rise, and a lot of other fascinating topics.
Andy grew up in the Swiss Alps, and studied Earth Sciences, with a focus on atmospheric and climate science and glaciology. After his PhD, Andy moved to Fairbanks, Alaska, and became involved with the Parallel Ice Sheet Model, the first open-source and openly-developed ice sheet model.
His first PhD student was none other than… Doug Brinkerhoff! Doug did an MS in computer science at the University of Montana, focusing on numerical methods for ice sheet modeling, and then moved to Fairbanks to complete his PhD. While in Fairbanks, he became an ardent Bayesian after “seeing that uncertainty needs to be embraced rather than ignored”. Doug has since moved back to Montana, becoming faculty in the University of Montana’s computer science department.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero and Will Geary.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Computer models and data analysis play a crucial role in understanding glacier behavior and predicting sea level rise.
- Reliable data, especially on ice thickness and climate forcing, are essential for accurate modeling.
- The collaboration between glaciology and Bayesian statistics has led to breakthroughs in understanding glacier evolution forecasts.
- There is a need for open-source packages and tools to make glaciological models more accessible.
- Glaciology and ice sheet modeling are complex fields that require collaboration between domain experts and data scientists.
- The use of Bayesian statistics in glaciology allows for a probabilistic framework to understand and communicate uncertainty in predictions.
- Real-time forecasting of glacier behavior is an exciting area of research that could provide valuable information for communities living near glaciers.
- There is a need for further research in understanding existing data sets and developing simpler methods to analyze them.
- The future of glaciology research lies in studying Alaskan glaciers and understanding the challenges posed by the changing Arctic environment.
Chapters:
00:00 Introduction and Background
08:54 The Role of Statistics in Glaciology
31:46 Open-Source Packages and Tools
52:06 The Power of Bayesian Statistics in Glaciology
01:06:34 Understanding Existing Data Sets and Developing Simpler Methods
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
GPs are extremely powerful… but hard to handle. One of the bottlenecks is learning the appropriate kernel. What if you could learn the structure of GP kernels automatically? Sounds really cool, but also a bit futuristic, doesn’t it?
Well, think again, because in this episode, Feras Saad will teach us how to do just that! Feras is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. He received his PhD in Computer Science from MIT, and, most importantly for our conversation, he’s the creator of AutoGP.jl, a Julia package for automatic Gaussian process modeling.
Feras discusses the implementation of AutoGP, how it scales, what you can do with it, and how you can integrate its outputs in your models.
Finally, Feras provides an overview of Sequential Monte Carlo and its usefulness in AutoGP, highlighting the ability of SMC to incorporate new data in a streaming fashion and explore multiple modes efficiently.
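To give a flavour of what “learning the kernel structure” means, here is a deliberately crude sketch in Python: it scores a handful of candidate kernel compositions by their optimized log marginal likelihood and keeps the best one. This is not AutoGP.jl (which is written in Julia and searches a recursive kernel grammar with sequential Monte Carlo rather than brute force); the scikit-learn calls and the toy data are assumptions made for illustration.

```python
# Crude stand-in for kernel structure discovery (illustration only, not AutoGP.jl):
# enumerate a few kernel compositions and rank them by log marginal likelihood.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 120)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * X[:, 0] ** 2 + rng.normal(0, 0.1, 120)

# A tiny "grammar" of candidate structures: base kernels plus simple sums and products.
candidates = {
    "RBF": RBF(),
    "Periodic": ExpSineSquared(),
    "RBF + Periodic": RBF() + ExpSineSquared(),
    "RBF * Periodic": RBF() * ExpSineSquared(),
}

scores = {}
for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel + WhiteKernel(), normalize_y=True)
    gp.fit(X, y)   # optimizes each structure's hyperparameters by maximizing marginal likelihood
    scores[name] = gp.log_marginal_likelihood_value_

best = max(scores, key=scores.get)
print({k: round(v, 1) for k, v in scores.items()}, "-> best structure:", best)
```

AutoGP replaces this brute-force loop with Bayesian inference over the space of kernel expressions itself, which is exactly what Feras walks us through in the episode.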
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell and Gal Kampel.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- AutoGP is a Julia package for automatic Gaussian process modeling that learns the structure of GP kernels automatically.
- It addresses the challenge of making structural choices for covariance functions by using a symbolic language and a recursive grammar to infer the expression of the covariance function given the observed data.
- AutoGP incorporates sequential Monte Carlo inference to handle scalability and uncertainty in structure learning.
- The package is implemented in Julia using the Gen probabilistic programming language, which provides support for sequential Monte Carlo and involutive MCMC.
- Sequential Monte Carlo (SMC) and involutive MCMC are used in AutoGP to infer the structure of the model.
- Integrating probabilistic models with language models can improve interpretability and trustworthiness in data-driven inferences.
- Challenges in Bayesian workflows include the need for automated model discovery and scalability of inference algorithms.
- Future developments in probabilistic reasoning systems include unifying people around data-driven inferences and improving the scalability and configurability of inference algorithms.
Chapters:
00:00 Introduction to AutoGP
26:28 Automatic Gaussian Process Modeling
45:05 AutoGP: Automatic Discovery of Gaussian Process Model Structure
53:39 Applying AutoGP to New Settings
01:09:27 The Biggest Hurdle in the Bayesian Workflow
01:19:14 Unifying People Around Data-Driven Inferences
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Changing perspective is often a great way to solve burning research problems. Riemannian spaces are such a perspective change, as Arto Klami, an Associate Professor of computer science at the University of Helsinki and member of the Finnish Center for Artificial Intelligence, will tell us in this episode.
He explains the concept of Riemannian spaces, their application in inference algorithms, how they can help sampling Bayesian models, and their similarity with normalizing flows, which we discussed in episode 98.
Arto also introduces PreliZ, a tool for prior elicitation, and highlights its benefits in simplifying the process of setting priors, thus improving the accuracy of our models.
When Arto is not solving mathematical equations, you’ll find him cycling, or around a good board game.
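As a taste of what prior elicitation means in practice, here is a small hand-rolled example: given an expert statement like “I’m about 90% sure the value lies between 5 and 15”, solve for the Normal prior that matches it. The numbers and the scipy-based solver are assumptions for illustration; PreliZ packages this kind of workflow (plus visualizations and many more distribution families) so you don’t have to write it yourself.

```python
# Hand-rolled prior elicitation (illustration only; PreliZ automates this kind of task).
# Goal: find Normal(mu, sigma) whose central 90% interval is roughly [5, 15].
import numpy as np
from scipy import stats, optimize

lower, upper, mass = 5.0, 15.0, 0.90
tail = (1 - mass) / 2

def gap(params):
    mu, log_sigma = params
    d = stats.norm(mu, np.exp(log_sigma))
    # Each tail outside [lower, upper] should hold (1 - mass) / 2 of the probability.
    return [d.cdf(lower) - tail, d.cdf(upper) - (1 - tail)]

mu, log_sigma = optimize.fsolve(gap, x0=[10.0, 0.0])
sigma = np.exp(log_sigma)
print(f"elicited prior: Normal(mu={mu:.2f}, sigma={sigma:.2f})")
print("central 90% interval:", stats.norm(mu, sigma).interval(mass))   # sanity check, ~ (5, 15)
```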
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Riemannian spaces offer a way to improve computational efficiency and accuracy in Bayesian inference by considering the curvature of the posterior distribution.
- Riemannian spaces can be used in Laplace approximation and Markov chain Monte Carlo algorithms to better model the posterior distribution and explore challenging areas of the parameter space.
- Normalizing flows are a complementary approach to Riemannian spaces, using non-linear transformations to warp the parameter space and improve sampling efficiency.
- Evaluating the performance of Bayesian inference algorithms in challenging cases is a current research challenge, and more work is needed to establish benchmarks and compare different methods.
- PreliZ is a package for prior elicitation in Bayesian modeling that facilitates communication with users through visualizations of predictive and parameter distributions.
- Careful prior specification is important, and tools like PreliZ make the process easier and more reproducible (see the sketch just after this list).
- Teaching Bayesian machine learning is challenging due to the combination of statistical and programming concepts, but it is possible to teach the basic reasoning behind Bayesian methods to a diverse group of students.
- The integration of Bayesian approaches in data science workflows is becoming more accepted, especially in industries that already use deep learning techniques.
- The future of Bayesian methods in AI research may involve the development of AI assistants for Bayesian modeling and probabilistic reasoning.
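For readers who want to try the prior-elicitation workflow mentioned above, here is a minimal sketch using PreliZ. It assumes the `preliz` package is installed; exact return values and plotting behavior may vary across versions, and the numbers are purely illustrative.

```python
# Minimal prior-elicitation sketch with PreliZ (illustrative only).
import preliz as pz

# Maximum-entropy elicitation: ask for the Gamma prior that places roughly
# 90% of its mass between 1 and 10 -- a typical expert statement such as
# "the value is a few, almost surely below ten".
pz.maxent(pz.Gamma(), 1, 10, 0.9)

# Distributions can also be specified directly and inspected visually,
# which supports the user-facing feedback loop discussed in the episode.
pz.Normal(mu=0, sigma=2).plot_pdf()
```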
Chapters:
00:00 Introduction and Background
02:05 Arto's Work and Background
06:05 Introduction to Bayesian Inference
12:46 Riemannian Spaces in Bayesian Inference
27:24 Availability of Riemannian-based Algorithms
30:20 Practical Applications and Evaluation
37:33 Introduction to PreliZ
38:03 Prior Elicitation
39:01 Predictive Elicitation Techniques
39:30 PreliZ: Interface with Users
40:27 PreliZ: General Purpose Tool
41:55 Getting Started with PreliZ
42:45 Challenges of Setting Priors
45:10 Reproducibility and Transparency in Priors
46:07 Integration of Bayesian Approaches in Data Science Workflows
55:11 Teaching Bayesian Machine Learning
01:06:13 The Future of Bayesian Methods with AI Research
01:10:16 Solving the Prior Elicitation Problem
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Structural Equation Modeling (SEM) is a key framework in causal inference. As I’m diving deeper and deeper into these topics to teach them and, well, finally understand them, I was delighted to host Ed Merkle on the show.
A professor of psychological sciences at the University of Missouri, Ed discusses his work on Bayesian applications to psychometric models and model estimation, particularly in the context of Bayesian SEM. He explains the importance of BSEM in psychometrics and the challenges encountered in its estimation.
Ed also introduces his blavaan package in R, which enhances researchers' capabilities in BSEM and has been instrumental in the dissemination of these methods. Additionally, he explores the role of Bayesian methods in forecasting and crowdsourcing wisdom.
When he’s not thinking about stats and psychology, Ed can be found running, playing the piano, or playing 8-bit video games.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Bayesian SEM is a powerful framework in psychometrics that allows for the estimation of complex models involving multiple variables and causal relationships.
- Understanding the principles of Bayesian inference is crucial for effectively applying Bayesian SEM in psychological research.
- Informative priors play a key role in Bayesian modeling, providing valuable information and improving the accuracy of model estimates.
- Challenges in BSEM estimation include specifying appropriate prior distributions, dealing with unidentified parameters, and ensuring convergence of the model.
- Incorporating prior information is crucial in Bayesian modeling, especially when dealing with large models and imperfect data.
- The blavaan package enhances researchers' capabilities in Bayesian structural equation modeling, providing a user-friendly interface and compatibility with existing frequentist models.
- Bayesian methods offer advantages in forecasting and subjective probability by allowing for the characterization of uncertainty and providing a range of predictions.
- Interpreting Bayesian model results requires careful consideration of the entire posterior distribution, rather than focusing solely on point estimates.
- Latent variable models, also known as structural equation models, play a crucial role in psychometrics, allowing for the estimation of unobserved variables and their influence on observed variables (a minimal sketch of such a model follows this list).
- The slow speed of MCMC estimation, and the slower, more thoughtful workflow it requires, are common challenges in Bayesian modeling.
- The future of Bayesian psychometrics may involve advancements in parallel computing and GPU-accelerated MCMC algorithms.
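To make the latent-variable takeaway above concrete, here is a minimal, hypothetical sketch of a single-factor Bayesian CFA. Note that blavaan, discussed in the episode, is an R package; this Python/PyMC sketch only illustrates the general model structure, including one way to handle identification (fixing the latent variance and centering the loading priors on positive values), and is not Ed’s code.

```python
import numpy as np
import pymc as pm

# Simulated data: 200 respondents, 4 observed indicators of one latent factor.
rng = np.random.default_rng(1)
n, J = 200, 4
eta_true = rng.normal(size=n)                      # latent factor scores
lam_true = np.array([1.0, 0.8, 1.2, 0.6])          # true loadings
Y = eta_true[:, None] * lam_true + rng.normal(0, 0.5, size=(n, J))

with pm.Model() as cfa:
    # Identification: latent variance fixed to 1, loadings given informative
    # priors centered on positive values (one of the challenges discussed).
    lam = pm.Normal("lam", mu=1.0, sigma=0.5, shape=J)   # factor loadings
    sigma = pm.HalfNormal("sigma", 1.0, shape=J)         # residual SDs
    eta = pm.Normal("eta", 0.0, 1.0, shape=n)            # latent scores
    pm.Normal("Y", mu=eta[:, None] * lam, sigma=sigma, observed=Y)
    idata = pm.sample(1000, tune=1000, target_accept=0.9, random_seed=1)
```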
Chapters:
00:00 Introduction to the Conversation
02:17 Background and Work on Bayesian SEM
04:12 Topics of Focus: Structural Equation Models
05:16 Introduction to Bayesian Inference
09:30 Importance of Bayesian SEM in Psychometrics
10:28 Overview of Bayesian Structural Equation Modeling (BSEM)
12:22 Relationship between BSEM and Causal Inference
15:41 Advice for Learning BSEM
21:57 Challenges in BSEM Estimation
34:40 The Impact of Model Size and Data Quality
37:07 The Development of the Blavaan Package
42:16 Bayesian Methods in Forecasting and Subjective Probability
46:27 Interpreting Bayesian Model Results
51:13 Latent Variable Models in Psychometrics
56:23 Challenges in the Bayesian Workflow
01:01:13 The Future of Bayesian Psychometrics
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Listen to the full episode: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
Watch the interview: https://www.youtube.com/watch?v=ZaZwCcrJlik
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, we dive deep into gravitational wave astronomy with Christopher Berry and John Veitch, two senior lecturers at the University of Glasgow and experts from the LIGO-Virgo collaboration. They explain the significance of detecting gravitational waves, which are essential for understanding black hole and neutron star collisions. This research not only sheds light on these distant events but also helps us grasp the fundamental workings of the universe.
Our discussion focuses on the integral role of Bayesian statistics, detailing how they use nested sampling for extracting crucial information from the subtle signals of gravitational waves. This approach is vital for parameter estimation and understanding the distribution of cosmic sources through population inferences.
Concluding the episode, Christopher and John highlight the latest advancements in black hole astrophysics and tests of general relativity, and touch upon the exciting prospects and challenges of the upcoming space-based LISA mission.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Gravitational wave analysis involves using Bayesian statistics for parameter estimation and population inference.
- Nested sampling is a powerful algorithm used in gravitational wave analysis to explore the parameter space and calculate the evidence for model selection (see the toy sketch after this list).
- Machine learning techniques, such as normalizing flows, can be integrated with nested sampling to improve efficiency and explore complex distributions.
- The LIGO-Virgo collaboration operates gravitational wave detectors that measure distortions in space and time caused by black hole and neutron star collisions.
- Sources of noise in gravitational wave detection include laser noise, thermal noise, seismic motion, and gravitational coupling.
- The LISA mission is a space-based gravitational wave detector that aims to observe lower-frequency gravitational waves and unlock new astrophysical phenomena.
- Space-based detectors like LISA can avoid ground-based noise and observe a different part of the gravitational wave spectrum, providing new insights into the universe.
- The data analysis challenges for space-based detectors are complex, as they require fitting multiple sources simultaneously and dealing with overlapping signals.
- Gravitational wave observations have the potential to test general relativity, study the astrophysics of black holes and neutron stars, and provide insights into cosmology.
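As a rough illustration of the nested-sampling idea mentioned above (shrink the prior volume, accumulate evidence, replace the worst live point), here is a self-contained toy implementation on a two-dimensional Gaussian. It is a didactic sketch, not the LIGO-Virgo pipeline, and the rejection step is far simpler than what production samplers use.

```python
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(42)

# Toy problem (purely illustrative): 2-D standard-normal likelihood with a
# uniform prior on [-5, 5]^2, so the true evidence is Z ~ 1 / 10^2 = 0.01.
def log_likelihood(theta):
    return -0.5 * theta @ theta - np.log(2.0 * np.pi)

def sample_prior(n):
    return rng.uniform(-5.0, 5.0, size=(n, 2))

n_live, n_iter = 400, 2000
live = sample_prior(n_live)
live_logl = np.array([log_likelihood(t) for t in live])

log_z = -np.inf          # running log-evidence estimate
log_x_prev = 0.0         # log of the prior volume still enclosed

for i in range(n_iter):
    worst = np.argmin(live_logl)                 # lowest-likelihood live point
    log_x = -(i + 1) / n_live                    # expected volume after shrinkage
    log_w = np.log(np.exp(log_x_prev) - np.exp(log_x))  # volume of the shell
    log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
    log_x_prev = log_x

    # Replace the worst point with a prior draw above the likelihood bound.
    # Plain rejection is fine for a toy; real samplers use smarter proposals.
    threshold = live_logl[worst]
    while True:
        cand = sample_prior(1)[0]
        cand_logl = log_likelihood(cand)
        if cand_logl > threshold:
            live[worst], live_logl[worst] = cand, cand_logl
            break

# Add the contribution of the points still alive at termination.
log_z = np.logaddexp(log_z, logsumexp(live_logl) + log_x_prev - np.log(n_live))
print(f"estimated log Z = {log_z:.2f}  (analytic value is about {np.log(0.01):.2f})")
```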
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Listen to the full episode: https://learnbayesstats.com/episode/100-reactive-message-passing-automated-inference-in-julia-dmitry-bagaev/
Watch the interview: https://www.youtube.com/watch?v=ZG3H0xxCXTQ
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, Dmitry Bagaev discusses his work in Bayesian statistics and the development of RxInfer.jl, a reactive message passing toolbox for Bayesian inference.
Dmitry explains the concept of reactive message passing and its applications in real-time signal processing and autonomous systems. He discusses the challenges and benefits of using RxInfer.jl, including its scalability and efficiency in large probabilistic models.
Dmitry also shares insights into the trade-offs involved in Bayesian inference architecture and the role of variational inference in RxInfer.jl. Additionally, he discusses his startup Lazy Dynamics and its goal of commercializing research in Bayesian inference.
Finally, we also discuss the user-friendliness and trade-offs of different inference methods, the future developments of RxInfer, and the future of automated Bayesian inference.
Coming from a very small town in Russia called Nizhnekamsk, Dmitry currently lives in the Netherlands, where he did his PhD. Before that, he graduated from the Computational Science and Modeling department of Moscow State University.
Beyond that, Dmitry is also a drummer (you’ll see his cool drums if you’re watching on YouTube), and an adept of extreme sports, like skydiving, wakeboarding and skiing!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
- Reactive message passing is a powerful approach to Bayesian inference that allows for real-time updates and adaptivity in probabilistic models (a language-agnostic sketch of the streaming-update idea follows this list).
- RxInfer.jl is a toolbox for reactive message passing in Bayesian inference, designed to be scalable, efficient, and adaptable.
- Julia is a preferred language for RxInfer.jl due to its speed, macros, and multiple dispatch, which enable efficient and flexible implementation.
- Variational inference plays a crucial role in RxInfer.jl, allowing for trade-offs between computational complexity and accuracy in Bayesian inference.
- Lazy Dynamics is a startup focused on commercializing research in Bayesian inference, with the goal of making RxInfer.jl accessible and robust for industry applications.
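RxInfer.jl itself is a Julia package, so the snippet below is only a language-agnostic illustration of the streaming-update idea behind reactive message passing: a tiny Python filter for a random-walk state, where each new observation updates the posterior with a closed-form (conjugate) step instead of re-running inference on the whole dataset. Nothing here reflects RxInfer’s actual API.

```python
import numpy as np

# Toy model: latent state x_t follows a random walk, observations are
# y_t = x_t + noise. The posterior over x_t is updated one observation
# at a time -- the kind of online updating reactive message passing targets.
rng = np.random.default_rng(0)
T, q, r = 100, 0.1, 0.5                        # steps, process var, obs var
x = np.cumsum(rng.normal(0.0, np.sqrt(q), T))  # latent trajectory
y = x + rng.normal(0.0, np.sqrt(r), T)         # noisy observations

mean, var = 0.0, 10.0                          # prior over x_0
for obs in y:
    var += q                                   # predict: diffuse through the walk
    gain = var / (var + r)                     # update: conjugate Gaussian step
    mean += gain * (obs - mean)
    var *= 1.0 - gain

print(f"final posterior: N({mean:.2f}, {var:.3f}); true x_T = {x[-1]:.2f}")
```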
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Listen to the full episode: https://learnbayesstats.com/episode/99-exploring-quantum-physics-bayesian-stats-chris-ferrie/
Watch the interview: https://www.youtube.com/watch?v=pRaT6FLF7A8
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
You know I’m a big fan of everything physics. So when I heard that Bayesian stats was especially useful in quantum physics, I had to make an episode about it!
You’ll hear from Chris Ferrie, an Associate Professor at the Centre for Quantum Software and Information of the University of Technology Sydney. Chris also has a foot in industry, as a co-founder of Eigensystems, an Australian start-up with a mission to democratize access to quantum computing.
Of course, we talked about why Bayesian stats are helpful in quantum physics research, and about the burning challenges in this line of research.
But Chris is also a renowned author — in addition to writing Bayesian Probability for Babies, he is the author of Quantum Physics for Babies and Quantum Bullsh*t: How to Ruin Your Life With Advice from Quantum Physics. So we ended up talking about science communication, science education, and a shocking revelation about Ant Man…
A big thank you to one of my best Patrons, Stefan Lorenz, for recommending an episode with Chris!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Listen to the full episode: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
Watch the interview: https://www.youtube.com/watch?v=vVqZ0WWXX7g
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
How does the world of statistical physics intertwine with machine learning, and what groundbreaking insights can this fusion bring to the field of artificial intelligence?
In this episode, we delve into these intriguing questions with Marylou Gabrié, an assistant professor at CMAP, Ecole Polytechnique, in Paris. Having completed her PhD in physics at École Normale Supérieure, Marylou ventured to New York City for a joint postdoctoral appointment at New York University’s Center for Data Science and the Flatiron Institute’s Center for Computational Mathematics.
As you’ll hear, her research is not just about theoretical exploration; it also extends to the practical adaptation of machine learning techniques in scientific contexts, particularly where data is scarce.
In this conversation, we’ll traverse the landscape of Marylou's research, discussing her recent publications and her innovative approaches to machine learning challenges, latest MCMC advances, and ML-assisted scientific computing.
Beyond that, get ready to discover the person behind the science – her inspirations, aspirations, and maybe even what she does when not decoding the complexities of machine learning algorithms!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Takeaways:
Links from the show:
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Listen to the full episode: https://learnbayesstats.com/episode/97-probably-overthinking-statistical-paradoxes-allen-downey/
Watch the interview: https://www.youtube.com/watch?v=KgesIe3hTe0
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, I had the pleasure of speaking with Allen Downey, a professor emeritus at Olin College and a curriculum designer at Brilliant.org. Allen is a renowned author in the fields of programming and data science, with books such as "Think Python" and "Think Bayes" to his credit. He also authors the blog "Probably Overthinking It" and has a new book by the same name, which he just released in December 2023.
In this conversation, we tried to help you differentiate between right and wrong ways of looking at statistical data, discussed the Overton paradox and the role of Bayesian thinking in it, and detailed a mysterious Bayesian killer app!
But that’s not all: we even addressed the claim that Bayesian and frequentist methods often yield the same results — and why it’s a false claim. If that doesn’t get you to listen, I don’t know what will!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie and Cory Kiser.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
We are happy to welcome Allen Downey back to our show, and he has great news for us: his new book, “Probably Overthinking It”, is available now.
You might know Allen from his blog by the same name or his previous work. Or maybe you watched some of his educational videos which he produces in his new position at brilliant.org.
We delve right into exciting topics like collider bias and how it can explain the “low birth weight paradox” and other situations that only seem paradoxical at first, until you apply causal thinking to them.
Another classic Allen can demystify for us is Simpson’s paradox. The problem is not the data, but your expectations of the data. We talk about some cases of Simpson’s paradox, for example from statistics on the Covid-19 pandemic, also featured in his book (a tiny numerical illustration follows this abstract).
We also cover the “Overton paradox” - which Allen named himself - on how people report their ideologies as liberal or conservative over time.
Beyond causal thinking and statistical paradoxes, we return to the common claim that frequentist statistics and Bayesian statistics often give the same results. Allen explains that they are fundamentally different, and that Bayesians should not shy away from pointing that out and emphasising the strengths of their methods.
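For the curious, here is a tiny, purely hypothetical numerical illustration of the Simpson’s paradox pattern mentioned above: treatment B has the higher recovery rate within each severity group, yet the lower rate overall, simply because it was mostly given to the severe cases. The numbers are made up for the arithmetic only.

```python
# A tiny, purely hypothetical illustration of Simpson's paradox: treatment B
# wins within both severity groups but loses in the aggregate.
groups = {
    "mild":   {"A": (100, 90), "B": (20, 19)},    # (patients, recoveries)
    "severe": {"A": (20, 6),   "B": (100, 40)},
}

totals = {"A": [0, 0], "B": [0, 0]}
for severity, arms in groups.items():
    for arm, (n, rec) in arms.items():
        totals[arm][0] += n
        totals[arm][1] += rec
        print(f"{severity:>7} | {arm}: {rec}/{n} = {rec / n:.0%}")

for arm, (n, rec) in totals.items():
    print(f"overall | {arm}: {rec}/{n} = {rec / n:.0%}")
```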
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Listen to the full episode: https://learnbayesstats.com/episode/96-pharma-models-sports-analytics-stan-news-daniel-lee/
Watch the interview: https://www.youtube.com/watch?v=lnq5ZPlup0E
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas and Luke Gorrie.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Getting Daniel Lee on the show is a real treat — with 20 years of experience in numeric computation; 10 years creating and working with Stan; 5 years working on pharma-related models, you can ask him virtually anything. And that I did…
From joint models for estimating oncology treatment efficacy to PK/PD models; from data fusion for U.S. Navy applications to baseball and football analytics, as well as common misconceptions or challenges in the Bayesian world — our conversation spans a wide range of topics that I’m sure you’ll appreciate!
Daniel studied Mathematics at MIT and Statistics at Cambridge University, and, when he’s not in front of his computer, is a savvy basketball player and… a hip hop DJ — you actually have his SoundCloud profile in the show notes if you’re curious!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas and Luke Gorrie.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Our guest this week, Daniel Lee, is a real Bayesian allrounder and will give us new insights into a lot of Bayesian applications.
Daniel got introduced to Bayesian stats when trying to estimate the failure rate of satellite dishes as an undergraduate student. He was lucky to be mentored by Bayesian greats like David Spiegelhalter, Andrew Gelman and Bob Carpenter. He also sat in on reading groups at universities, where he learned about cutting-edge developments - something he would recommend to anyone who wants to dive deep into the subject.
He used all this experience working on PK/PD (pharmacokinetics/pharmacodynamics) models. We talk about the challenges in understanding individual responses to drugs based on the speed with which they move through the body. Bayesian statistics allows for incorporating more complexity into those models for more accurate estimation.
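To give a flavor of the model class Daniel describes, here is a minimal, hypothetical one-compartment PK model with first-order absorption, fit to simulated concentrations with PyMC. It is only a sketch under stated assumptions, not Daniel’s actual models.

```python
import numpy as np
import pymc as pm

# One-compartment PK model with first-order absorption:
# C(t) = D * ka / (V * (ka - ke)) * (exp(-ke * t) - exp(-ka * t))
rng = np.random.default_rng(4)
D = 100.0                                              # dose
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 24.0])    # hours after dose
ka_t, ke_t, V_t = 1.0, 0.15, 20.0                      # "true" values for simulation
conc_true = D * ka_t / (V_t * (ka_t - ke_t)) * (np.exp(-ke_t * t) - np.exp(-ka_t * t))
y = conc_true + rng.normal(0.0, 0.3, size=t.size)      # simulated concentrations

with pm.Model() as pk:
    ka = pm.LogNormal("ka", mu=np.log(1.0), sigma=0.5)   # absorption rate
    ke = pm.LogNormal("ke", mu=np.log(0.2), sigma=0.5)   # elimination rate
    V = pm.LogNormal("V", mu=np.log(20.0), sigma=0.5)    # volume of distribution
    sigma = pm.HalfNormal("sigma", 1.0)
    conc = D * ka / (V * (ka - ke)) * (pm.math.exp(-ke * t) - pm.math.exp(-ka * t))
    pm.Normal("y", mu=conc, sigma=sigma, observed=y)
    idata = pm.sample(1000, tune=1000, target_accept=0.95, random_seed=4)
```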
Daniel also worked on decision making and information fusing problems for the military, such as identifying a plane as friend or foe through the radar of several ships.
And to add even more diversity to his repertoire, Daniel now also works in the world of sports analytics, another popular topic on our show. We talk about the state of this emerging field and its challenges.
Finally, we cover some Stan news, and discuss common problems and misconceptions around Bayesian statistics and how to resolve them.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Welcome to another installment of our LBS physics deep dive! After exploring the world of experimental physics at CERN in our first video documentary in episode 93, we’ll stay in Geneva for this one, but this time we’ll dive into theoretical physics.
We’ll explore mysterious components of the universe, like dark matter and dark energy. We’ll also see how the study of gravity intersects with the study of particle physics, especially when considering black holes and the early universe. Even crazier, we’ll see that there are actual experiments and observational projects going on to answer these fundamental questions!
Our guide for this episode is Valerie Domcke, permanent research staff member at CERN, who did her PhD in Hamburg, Germany, and postdocs in Trieste and Paris.
When she’s not trying to decipher the mysteries of the universe, Valerie can be found on boats, as she’s a big sailing fan.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls and Maksim Kuznecov.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Episode 95 is another instalment of our Deep Dive into Physics series. And this time we move away from the empirical side of this topic towards more theoretical questions.
There is no one better for this topic than Dr. Valerie Domcke. Valerie is the second researcher from CERN we have on our show; she is based in the Department of Theoretical Physics there.
We mainly focus on the Standard Model of particle physics: where it fails to explain observations, which proposals are being discussed to update or replace it, and what kind of evidence would be needed to decide between them.
Valerie is particularly interested in situations in which the Standard Model breaks down, such as the observed excess gravitational pull that cannot be accounted for by visible stars.
Of course, we cover fascinating topics like dark matter, dark energy, black holes and gravitational waves that are places to look for evidence against the Standard Model.
Looking more at the practical side of things, we discuss the challenges in disentangling signal from noise, especially in such complex fields as astro- and quantum-physics.
We also touch upon the challenges Valerie is currently tackling in working on a new observatory for gravitational waves, the Laser Interferometer Space Antenna, LISA.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, Jonathan Templin, Professor of Psychological and Quantitative Foundations at the University of Iowa, shares insights into his journey in the world of psychometrics.
Jonathan’s research focuses on diagnostic classification models — psychometric models that seek to provide multiple reliable scores from educational and psychological assessments. He also studies Bayesian statistics, as applied in psychometrics, broadly. So, naturally, we discuss the significance of psychometrics in psychological sciences, and how Bayesian methods are helpful in this field.
We also talk about challenges in choosing appropriate prior distributions, best practices for model comparison, and how you can use the Multivariate Normal distribution to infer the correlations between the predictors of your linear regressions.
This is a deep-reaching conversation that concludes with the future of Bayesian statistics in psychological, educational, and social sciences — hope you’ll enjoy it!
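To give a concrete picture of that last modeling trick, here is a minimal, hypothetical sketch in PyMC (my own toy example, not code from the episode): two predictors are modeled jointly as a Multivariate Normal whose correlation matrix gets an LKJ prior, so the correlation between them is inferred along with everything else.

import numpy as np
import pymc as pm

# Toy data: 200 observations of two predictors that are in fact correlated
rng = np.random.default_rng(42)
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.6], [0.6, 1.0]], size=200)

with pm.Model():
    # Cholesky factor of the covariance, with an LKJ prior on the correlation part
    chol, corr, stds = pm.LKJCholeskyCov(
        "chol", n=2, eta=2.0, sd_dist=pm.Exponential.dist(1.0), compute_corr=True
    )
    mu = pm.Normal("mu", 0.0, 1.0, shape=2)
    pm.MvNormal("predictors", mu=mu, chol=chol, observed=X)
    idata = pm.sample(random_seed=42)

# The posterior of the off-diagonal element of `corr` is the inferred correlation
# between the two predictors, with full uncertainty attached.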
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca and Dante Gates.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
You have probably unknowingly already been exposed to this episode’s topic - psychometric testing - when taking a test at school or university. Our guest, Professor Jonathan Templin, tries to increase the meaningfulness of these tests by improving the underlying psychometric models, the Bayesian way of course!
Jonathan explains that it is not easy to judge a student’s ability from exams, since test scores carry measurement error and are only a snapshot. Bayesian statistics helps by naturally propagating this uncertainty to the results.
In the field of psychometric testing, Marginal Maximum Likelihood is commonly used. This approach quickly becomes infeasible, though, when marginalising over multidimensional test scores. Luckily, Bayesian sampling methods do not suffer from this limitation.
A further reason to prefer Bayesian statistics is that the posterior carries a lot of information. Imagine taking a test at the end of high school that tells you which profession you should pursue. The best-fitting field is of course interesting, but so may be the second-best one. The posterior distribution provides exactly this kind of information.
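To make that uncertainty propagation concrete, here is a minimal Rasch-style sketch in PyMC on simulated data (my own toy example, not Jonathan’s models): each student ends up with a full posterior over their ability rather than a single point score.

import numpy as np
import pymc as pm

# Simulate right/wrong answers of 100 students on 20 items (toy data)
rng = np.random.default_rng(0)
n_students, n_items = 100, 20
true_ability = rng.normal(size=n_students)
true_difficulty = rng.normal(size=n_items)
p_true = 1.0 / (1.0 + np.exp(-(true_ability[:, None] - true_difficulty[None, :])))
answers = rng.binomial(1, p_true)

with pm.Model():
    ability = pm.Normal("ability", 0.0, 1.0, shape=n_students)      # student ability
    difficulty = pm.Normal("difficulty", 0.0, 1.0, shape=n_items)   # item difficulty
    p = pm.math.sigmoid(ability[:, None] - difficulty[None, :])
    pm.Bernoulli("answers", p=p, observed=answers)
    idata = pm.sample(random_seed=0)

# idata.posterior["ability"] holds a whole distribution per student, so the
# measurement error of the test is carried through to the reported scores.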
After becoming convinced that Bayes is the right choice for psychometrics, we also talk about practical challenges like choosing a prior for the covariance in a multivariate normal distribution, model selection procedures and more.
In the end we learn about a great Bayesian holiday destination, so make sure to listen till the end!
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
This is a very special episode. It is the first-ever LBS video episode, and it takes place in the heart of particle physics research -- CERN 🍾
I went onsite in Geneva, to visit Kevin Greif, a doctoral candidate in particle physics at UC Irvine, and we walked around the CERN campus, talking about particle physics, dark matter, dark energy, machine learning -- and a lot more!
I still released the audio version of this episode, but I really conceived and made it as a video-first episode, so I strongly recommend watching this one, as you’ll get a cool tour of the CERN campus and some of its experiments ;) I put the YouTube link in the show notes.
I hope you'll enjoy this deep dive into all things physics. If you have any recommendations for other cool scientific places I should do a documentary about, please get in touch on Twitter @LearnBayesStats, or by email.
This was literally a one-person endeavor — you may have noticed that I edited the video myself. So, if you liked it, please send this episode to your friends and colleagues -- and tell them to support the show on Patreon 😉
With enough support, I'll be able to continue making such in-depth content, and maybe, maybe, even pay for a professional video editor next time 🙈
Enjoy, my dear Bayesians, and best Bayesian wishes 🖖
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau and Luis Fonseca.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Some physics (and physics adjacent) books Kevin enjoys:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I love Bayesian modeling. Not only because it allows me to model interesting phenomena and learn about the world I live in. But because it’s part of a broader epistemological framework that confronts me with deep questions — how do you make decisions under uncertainty? How do you communicate risk and uncertainty? What does being rational even mean?
Thankfully, Gerd Gigerenzer is there to help us navigate these fascinating topics. Gerd is the Director of the Harding Center for Risk Literacy at the University of Potsdam, Germany.
Also Director emeritus at the Max Planck Institute for Human Development, he is a former Professor of Psychology at the University of Chicago and Distinguished Visiting Professor at the School of Law of the University of Virginia.
Gerd has written numerous award-winning articles and books, including Risk Savvy, Simple Heuristics That Make Us Smart, Rationality for Mortals, and How to Stay Smart in a Smart World.
As you’ll hear, Gerd has trained U.S. federal judges, German physicians, and top managers to make better decisions under uncertainty.
But Gerd is also a banjo player, has won a medal in Judo, and loves scuba diving, skiing, and, above all, reading.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau and Luis Fonseca.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In this episode, we have no other than Gerd Gigerenzer on the show, an expert in decision making, rationality and communicating risk and probabilities.
Gerd is a trained psychologist who has worked at a number of distinguished institutions, such as the Max Planck Institute for Human Development in Berlin and the University of Chicago. He is the director of the Harding Center for Risk Literacy in Potsdam.
One of his many topics of study is heuristics, a term often misunderstood, as he explains. We talk about the role of heuristics in a world of uncertainty, how they interact with analysis, and how they relate to intuition.
Another major topic of his work, and of this episode, is natural frequencies, and how they are a more natural way than conditional probabilities to express information such as the probability of having cancer after a positive screening test.
Gerd studied the usefulness of natural frequencies in practice and contributed to them being taught in high school in Bavaria, Germany, as an important tool to navigate the real world.
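To see the contrast, here is a worked toy example with made-up round numbers (illustrative only, not figures from the episode). Instead of saying “the prevalence is 1%, the test detects 80% of cancers and has a 10% false-positive rate”, you say: out of 1,000 people screened, about 10 have cancer and 8 of them test positive; of the 990 without cancer, about 99 also test positive. Among the roughly 107 positive tests, only 8 come from actual cancers, so

P(\text{cancer} \mid \text{positive}) = \frac{8}{8 + 99} \approx 7.5\%

The underlying Bayesian computation is the same, but expressed as counts it suddenly becomes easy to follow.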
In general, Gerd is passionate about not only researching these topics but also seeing them applied outside of academia. He has taught thousands of medical doctors how to understand and communicate statistics, and has also worked on a number of economic decision-making scenarios.
In the end, we discuss the benefits of simpler models for complex, uncertain situations, for example when predicting flu seasons.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
As you may know, I’m kind of a nerd. And I also love football — I've been a PSG fan since I was 5 years old, so I’ve lived it all with this club. And yet, I’ve never done a European-centered football analytics episode because, well, the US is much more advanced when it comes to sports analytics.
But today, I’m happy to say this day has come: a sports analytics episode where we can actually talk about European football. And that is thanks to Maximilian Göbel.
Max is a post-doctoral researcher in Economics and Finance at Bocconi University in Milan. Before that, he did his PhD in Economics at the Lisbon School of Economics and Management.
Max is a very passionate football fan and played for almost 25 years at his local football club. Unfortunately, he had to give it up when starting his PhD — don’t worry, he still goes to the gym, goes running, and sometimes cycles.
Max is also a great cook, inspired by all kinds of Italian food, and an avid podcast listener — from financial news, to health and fitness content, and even a mysterious and entertaining Bayesian podcast…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau and Luis Fonseca.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Max's paper using Bayesian inference:
Forecasting Arctic Sea Ice:
Some of Max’s coauthors:
Abstract
We already covered baseball analytics in the U.S.A. with Jim Albert in episode 85 and looked back at the decades-long history of sports analytics there. What does it look like in Europe?
To talk about this we got Max Göbel on the show. Max is a post-doctoral researcher in Economics and Finance at Bocconi University in Milan and holds a PhD in Economics from the Lisbon School of Economics and Management.
What qualifies him to talk about the sports side of sports analytics is his passion for football and decades of playing experience.
So, can sports analytics in Europe compete with analytics in the U.S.A.? Unfortunately, not yet. Many sports clubs do not use models in their hiring decisions, leading to suboptimal choices based on players’ reputation alone, as Max explains.
He designed a factor model for the performance of individual players, borrowing from his econometrics expertise (check it out on his webpage, link in the show notes).
We talk about how to grow this model from a simple, straightforward Bernoulli model for the rate of scored goals into a multilevel model incorporating other players. And of course, we discuss the benefits of using Bayesian statistics for this modelling problem.
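As a rough illustration of that progression, here is a toy sketch in PyMC (my own example, not Max’s actual model): each shot is a Bernoulli draw with a player-specific scoring probability, and those probabilities are partially pooled across players with a hierarchical prior.

import numpy as np
import pymc as pm

# Toy data: for every shot, record which player took it and whether it was scored
rng = np.random.default_rng(7)
n_players, n_shots = 30, 1500
player_idx = rng.integers(0, n_players, size=n_shots)
true_skill = rng.normal(-1.5, 0.5, size=n_players)          # per-player log-odds of scoring
scored = rng.binomial(1, 1.0 / (1.0 + np.exp(-true_skill[player_idx])))

with pm.Model():
    # Multilevel structure: players share a common population of skills
    mu = pm.Normal("mu", -1.5, 1.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    skill = pm.Normal("skill", mu, sigma, shape=n_players)
    p = pm.Deterministic("p", pm.math.sigmoid(skill))         # scoring probability per player
    pm.Bernoulli("scored", p=p[player_idx], observed=scored)
    idata = pm.sample(random_seed=7)

# Players with few shots get shrunk toward the population mean instead of
# receiving wildly over- or under-estimated scoring rates.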
We also cover sport analytics more generally and why it may not be so widely used in European football clubs yet.
Besides his interest in football analytics, Max has worked, and still works, on econometrics topics such as regression-based forecasting in the U.S.A., asset pricing, and applying econometric methods to climate change issues like forecasting the disappearance of Arctic sea ice.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
What’s the difference between MCMC and Variational Inference (VI)? Why is MCMC called an approximate method? When should we use VI instead of MCMC?
These are some of the captivating (and practical) questions we’ll tackle in this episode. I had the chance to interview Charles Margossian, a research fellow in computational mathematics at the Flatiron Institute, and a core developer of the Stan software.
Charles was born and raised in Paris, and then moved to the US to pursue a bachelor’s degree in physics at Yale University. After graduating, he worked for two years in biotech, and went on to do a PhD in statistics at Columbia University with someone named… Andrew Gelman — you may have heard of him.
Charles also specializes in pharmacometrics and epidemiology, so we also talked about some practical applications of Bayesian methods and algorithms in these fascinating fields.
Oh, and Charles’ life doesn’t only revolve around computers: he practices ballroom dancing and pickup soccer, and used to do improvised musical comedy!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar and Matt Rosinski.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 90, we cover methodological advances, namely variational inference and MCMC sampling, as well as their application in pharmacometrics.
And we have just the right guest for this topic - Charles Margossian! You might know Charles from his work on Stan, his workshop teaching, or his current position at the Flatiron Institute.
His main focus now is on two topics: variational inference and MCMC sampling. When is variational inference (or approximate Bayesian methods) appropriate? And when does it fail? Charles answers these questions convincingly, clearing up some discussion around this topic.
In his work on MCMC, he tries to answer some fundamental questions: How much computational power should we invest? When is MCMC sampling more appropriate than approximate Bayesian methods? The short answer: when you care about quantifying uncertainty. We even talk about what the R-hat measure means and how to improve on it with nested R-hats.
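For readers who have not met R-hat before, here is a minimal sketch of how you would check it in practice with ArviZ (a generic toy model, nothing specific to Charles’ nested R-hat work):

import numpy as np
import pymc as pm
import arviz as az

y = np.random.default_rng(1).normal(0.5, 1.0, size=50)

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    pm.Normal("y", mu, sigma, observed=y)
    idata = pm.sample(1000, tune=1000, chains=4, random_seed=1)

# Split R-hat compares within-chain and between-chain variance; values close to
# 1.0 are consistent with the chains having mixed, larger values flag trouble.
print(az.rhat(idata))
print(az.ess(idata))   # effective sample size, another diagnostic of MCMC quality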
After covering these two topics, we move on to his practical work: pharmacometrics. For example, he has worked on modelling how fast drugs dissolve in the body and on the role of genetics in how drugs work.
Charles also contributes to making Bayesian methods more accessible to pharmacologists: he co-developed the Torsten library for Stan, which facilitates Bayesian analysis of pharmacometric data.
We discuss the nature of pharmacometric data and how it is usually modelled with Ordinary Differential Equations.
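As a flavour of what such ODE models look like, a standard one-compartment model with first-order absorption (a textbook example, not necessarily the models discussed in the episode) reads

\frac{dA_g}{dt} = -k_a A_g, \qquad \frac{dC}{dt} = \frac{k_a}{V} A_g - k_e C,

where A_g is the amount of drug in the gut, C the plasma concentration, k_a and k_e the absorption and elimination rates, and V the volume of distribution. In a Bayesian analysis, the parameters (k_a, k_e, V) typically get hierarchical priors so that information is shared across patients.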
In the end we briefly cover one practical example of pharmacometric modelling: the Covid-19 pandemic.
All in all, episode 90 is another detailed one, covering many state-of-the-art techniques and their application.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
If you’ve ever tried to lose fat or gain muscle, you may have noticed… it’s not easy. But it’s precisely its complexity that makes the science of exercise and nutrition fascinating.
This is the longest LBS episode so far, and you’ll understand why pretty quickly: we covered a very wide range of topics, starting with the concept of metabolic adaptation and how our physiology and brain react to caloric deficits or caloric surpluses.
We also talked about the connection between metabolic adaptation and exercise energy compensation, shedding light on the interactions between the two, and how they make weight management more complex.
Statistics are of utmost importance in these endeavors, so of course we touched on how Bayesian stats can help mitigate the challenges of low sample sizes and over-focus on average treatment effect.
My guest for this marathon episode is none other than Eric Trexler. Currently at the Department of Evolutionary Anthropology of Duke University, Eric conducts research on metabolism and cardiometabolic health. He has a PhD in Human Movement Science from UNC Chapel Hill, and has published dozens of peer-reviewed research papers related to exercise, nutrition, and metabolism.
In addition, Eric is a former professional bodybuilder and has been coaching clients with goals related to health, fitness, and athletics since 2009.
In other words, get comfy for a broad and nerdy conversation about the mysteries related to energy expenditure regulation, weight management, and evolutionary mechanisms underpinning current health challenges.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar and Matt Rosinski.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 89, we cover a so-far underrepresented topic on this podcast: nutrition science, sports science, how the two relate, and of course the role of Bayesian statistics in this field.
Eric Trexler is the one introducing us to this topic. With his PhD in Human Movement Science from UNC Chapel Hill, a previous career as a professional bodybuilder, and extensive experience as a health and fitness coach, he is perfectly suited for the job.
We cover a lot of ground in this episode, focusing on the science of weight loss and on why losing weight becomes harder after a certain point due to adapted energy expenditure.
We look at energy expenditure and changes in metabolism from several angles, including the evolutionary background for these adaptations and how they affect us in modern times.
We also discuss how differently individuals react to a calorie deficit or surplus, different approaches to motivating oneself to lose weight, and the overall complexity of this topic.
In the latter half of the episode, we focus more on scientific practices in sports science and how they can be improved.
One way forward is, of course, to use more Bayesian statistics, especially because of the oftentimes small sample sizes in Eric’s field.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Today, we’re gonna learn about probabilistic numerics — what they are, what they are good for, and how they relate computation and inference in artificially intelligent systems.
To do this, I have the honor of hosting Philipp Hennig, a distinguished expert in this field, and the Chair for the Methods of Machine Learning at the University of Tübingen, Germany. Philipp studied in Heidelberg, also in Germany, and at Imperial College, London. Philipp received his PhD from the University of Cambridge, UK, under the supervision of David MacKay, before moving to Tübingen in 2011.
Since his PhD, he has been interested in the connection between computation and inference. With international colleagues, he helped establish the idea of probabilistic numerics, which describes computation as Bayesian inference. His book, Probabilistic Numerics — Computation as Machine Learning, co-authored with Mike Osborne and Hans Kersting, was published by Cambridge University Press in 2022 and is also openly available online.
So get comfy to explore the principles that underpin these algorithms, how they differ from traditional numerical methods, and how to incorporate uncertainty into the decision-making process of these algorithms.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar and Matt Rosinski.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 88 with Philipp Hennig, Chair for the Methods of Machine Learning at the Eberhard Karls University of Tübingen, we learn about a new technical area for the Bayesian way of thinking: probabilistic numerics.
Philipp gives us a conceptual introduction to machine learning as “refining a model through data” and explains the challenges machine learning faces due to the intractable nature of the data and of the computations involved.
The Bayesian approach, emphasising uncertainty over estimates and parameters, naturally lends itself to handling these issues.
In his research group, Philipp tries to find more general implementations of classically used algorithms while maintaining computational efficiency. They successfully achieve this goal by bringing a Bayesian approach to inference.
Philipp explains probabilistic numerics as “redescribing everything a computer does as Bayesian inference” and how this approach is suitable for advancing machine learning.
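A canonical example of this reframing is Bayesian quadrature (a standard result from the probabilistic numerics literature, not a quote from the episode): put a zero-mean Gaussian process prior with kernel k on the integrand f, evaluate it at points x_1, ..., x_n to get values y, and the integral Z of f has the closed-form posterior

\hat{Z} = z^{\top} K^{-1} y, \qquad \mathbb{V}[Z] = \iint k(x, x')\,dx\,dx' - z^{\top} K^{-1} z, \qquad \text{with } K_{ij} = k(x_i, x_j), \; z_i = \int k(x, x_i)\,dx.

The numerical routine does not just return a number, it returns a distribution over the quantity it is computing.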
We expand on how to handle uncertainty in machine learning, and Philipp details his team’s approach to handling this issue.
We also collect many resources for those interested in probabilistic numerics and finally talk about the future of this field.
Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I’ll be honest — this episode is long overdue. Not only because Ben Vincent is a friend, fellow PyMC Labs developer, and outstanding Bayesian modeler. But because he works on so many fascinating topics — so I’m all the happier to finally have him on the show!
In this episode, we’re gonna focus on causal inference, how it naturally extends Bayesian modeling, and how you can use the CausalPy open-source package to supercharge your Bayesian causal inference. We’ll also touch on marketing models and the pymc-marketing package, because, well, Ben does a lot of stuff ;)
Ben got his PhD in neuroscience at Sussex University, in the UK. After a postdoc at the University of Bristol, working on robots and active vision, as well as 15 years as a lecturer at the University of Dundee, in Scotland, he switched to the private sector, working with us full time at PyMC Labs — and that is a treat!
When he’s not working, Ben loves running 5k’s, cycling in the forest, lifting weights, and… learning about modern monetary theory.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
written by Christoph Bamberg
In this podcast episode, our guest Ben Vincent, a fellow member of PyMC Labs with a PhD in neuroscience and extensive experience in teaching and data analysis, introduces us to CausalPy and PyMC Marketing.
During his academic career, Ben got introduced to Bayesian statistics but, like most academics, did not come across causal inference.
We discuss the value of a systematic causal approach for important questions like health care interventions or marketing investments.
Although causality is somewhat orthogonal to the choice of statistical approach, Bayesian statistics is a good basis for causal analyses, for example in the form of Directed Acyclic Graphs (DAGs).
To make causal inference more accessible, Ben developed a Python package called CausalPy, which allows you to perform common causal inference analyses, e.g. with natural experiments.
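To see why the DAG matters, here is a small self-contained simulation in plain NumPy (deliberately not using CausalPy’s API, which I won’t reproduce from memory): a confounder opens a backdoor path, the naive regression is biased, and adjusting for the confounder recovers the causal effect.

import numpy as np

rng = np.random.default_rng(0)
n = 5_000
z = rng.normal(size=n)                         # confounder, e.g. customer affluence
x = 0.8 * z + rng.normal(size=n)               # "treatment", e.g. marketing exposure
y = 1.5 * x + 2.0 * z + rng.normal(size=n)     # outcome; true causal effect of x is 1.5

# Naive slope of y on x is biased because of the open backdoor path x <- z -> y
naive_slope = np.polyfit(x, y, 1)[0]

# Regressing y on both x and z (conditioning on the confounder) closes the backdoor
design = np.column_stack([np.ones(n), x, z])
adjusted_slope = np.linalg.lstsq(design, y, rcond=None)[0][1]

print(f"naive: {naive_slope:.2f}, adjusted: {adjusted_slope:.2f}  (truth: 1.50)")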
Ben was also involved in the development of PyMC Marketing, a package that conveniently bundles important analysis capabilities for marketing. The package focuses on Media Mix Modelling and customer lifetime value analysis.
We also talked about his extensive experience teaching statistics at university and his current teaching of Bayesian methods in industry. His advice to students is to really engage with the learning material and code through the examples, which makes learning more pleasurable and practical.
Transcript
Please note that this is an automated transcript that may contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
This episode is unlike anything I’ve covered so far on the show. Let me ask you: Do you know what a research synchronous language is? What about hybrid systems? Last try: have you heard of Zelus, or ProbZelus?
If you answered “no” to one of the above, then you’re just like me! And that’s why I invited Guillaume Baudart for this episode — to teach us about all these fascinating topics!
A researcher in the PARKAS team at Inria, Guillaume focuses on probabilistic and reactive programming languages. In particular, he works on ProbZelus, a probabilistic extension of Zelus, itself a research synchronous language to implement hybrid systems.
To simplify, Zelus is a modeling framework to simulate systems that mix continuous (smooth) and discrete dynamics — if you’ve ever worked with ODEs, you may be familiar with these terms.
If you’re not — great, Guillaume will explain everything in the episode! And I know it might sound niche, but this kind of approach actually has very important applications — such as proving that there are no bugs in a program.
Guillaume did his PhD at École Normale Supérieure, in Paris, working on reactive programming languages and quasi-periodic systems. He then worked in the AI programming team of IBM Research, before coming back to the École Normale Supérieure, working mostly on reactive and probabilistic programming.
In his free time, Guillaume loves spending time with his family, playing the violin with friends, and… cooking!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Guillaume Baudart is a researcher at Inria in the PARKAS team at the Département d'Informatique (DI) of the École normale supérieure. He joins us for episode 86 to tell us about ProbZelus, a synchronous probabilistic programming language that he develops.
We have not covered synchronous languages yet, so Guillaume gives us some context on this kind of programming approach and on how ProbZelus adds probabilistic notions to it.
He explains the advantages of the probabilistic aspects of ProbZelus and how practitioners may profit from them.
For example, synchronous languages are used to program and test the autopilots of planes and to ensure that they do not have any bugs. ProbZelus may be useful here, as Guillaume argues.
Finally, we also touch upon his teaching work and what difficulties he encounters in teaching probabilistic programming.
Transcript
Please note that this is an automated transcript that may contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
In this episode, I am honored to talk with a legend of sports analytics in general, and baseball analytics in particular. I am of course talking about Jim Albert.
Jim grew up in the Philadelphia area and studied statistics at Purdue University. He then spent his entire 41-year academic career at Bowling Green State University, which gave him a wide diversity of classes to teach – from intro statistics through doctoral level.
As you’ll hear, he’s always had a passion for Bayesian education, Bayesian modeling and learning about statistics through sports. I find that passion fascinating about Jim, and I suspect that’s one of the main reasons for his prolific career — really, the list of his writings and teachings is impressive; just go take a look at the show notes.
Now an Emeritus Professor at Bowling Green, Jim is retired, but still an active tennis player and writer on sports analytics — his blog, “Exploring Baseball with R”, is nearing 400 posts!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
written by Christoph Bamberg
In this episode, Jim Albert, a legend of sports analytics and Emeritus Professor at Bowling Green State University, is our guest.
We talk about a range of topics, including his early interest in math and sports, challenges in analysing sports data and his experience teaching statistics.
We trace the history of baseball analytics back to the 1960s and discuss how new, advanced ways of collecting data change what can be modelled.
There are also statistical approaches to American football, soccer and basketball games. Jim explains why these team sports are more difficult to model than baseball.
The conversation then turns to Jim’s substantial experience teaching statistics and the challenges he sees in that. Jim has worked on several books on sports analytics and has written many blog posts on the topic.
We also touch upon the challenges of prior elicitation, a topic that has come up frequently in recent episodes: how different stakeholders such as coaches and managers think about the sport, and how to extract priors from their knowledge.
For more, tune in to episode 85 with Jim Albert.
Chapters
[00:00:00] Episode Begins
[00:04:04] How did you get into the world of statistics?
[00:11:17] Baseball is more advanced on the analytics path compared to other sports
[00:17:02] How is the data collected?
[00:24:43] Why is sports analytics important and is it turning humans into robots?
[00:32:51] Lost-in-translation problems between modellers and domain experts...?
[00:41:43] Active learning and learning through workshops
[00:51:08] Principles before methods
[00:52:30] Your favorite sports analytics model
[01:02:07] If you had unlimited time and resources which problem would you try to solve?
Transcript
Please note that this transcript is generated automatically and may contain errors. Feel free to reach out if you are willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
This is another installment in our neuroscience modeling series! This time, I talked with Konrad Kording, about the role of Bayesian stats in neuroscience and psychology, electrophysiological data to study what neurons do, and how this helps explain human behavior.
Konrad studied at ETH Zurich, then went to University College London and MIT for his postdocs. After a decade at Northwestern University, he is now a Penn Integrated Knowledge Professor at the University of Pennsylvania.
As you’ll hear, Konrad is particularly interested in the question of how the brain solves the credit assignment problem and similarly how we should assign credit in the real world (through causality). Building on this, he is also interested in applications of causality in biomedical research.
And… he’s also a big hiker, skier and salsa dancer!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony and Joshua Meehl.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Transcript
Please note that this is an automatic transcript and may contain errors. Feel free to reach out if you would like to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
One of the greatest features of this podcast, and my work in general, is that I keep getting surprised. Along the way, I keep learning, and I meet fascinating people, like Tarmo Jüristo.
Tarmo is hard to describe. These days, he’s heading an NGO called Salk, in the Baltic state called Estonia. Among other things, they are studying and forecasting elections, which is how we met and ended up collaborating with PyMC Labs, our Bayesian consultancy.
But Tarmo is much more than that. Born in 1971 in what was still the Soviet Union, he graduated in finance from Tartu University. He worked in finance and investment banking until the 2009 crisis, when he quit and started a doctorate in… cultural studies. He then went on to write for theater and TV, and to teach literature, anthropology and philosophy. An avid world traveler, he also teaches kendo and Brazilian jiu-jitsu.
As you’ll hear in the episode, after lots of adventures, he established Salk, and they just used a Bayesian hierarchical model with post-stratification to forecast the results of the 2023 Estonian parliamentary elections and target the campaign efforts to specific demographics.
Oh, and one last thing: Tarmo is a fan of the show — I told you he was a great guy ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh and Grant Pezzolesi.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 83 of the podcast, Tarmo Jüristo is our guest. He recently received media attention for his electoral forecasting in the Estonian election and its potential positive role in helping liberal parties gain more votes than expected.
Tarmo explains to us how he and his NGO SALK used Bayesian models to forecast the election and how he leveraged these models to unify the different liberal parties that participated in it. So, we get a firsthand view of how to use Bayesian modelling smartly.
Furthermore, we talk about when to use Bayesian models, difficulties in modelling survey data and how post-stratification can help.
He also explains how he, with the help of PyMC Labs, added Gaussian Processes to his models to better model the time-series structure of their survey data.
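Here is a deliberately stripped-down sketch of what “hierarchical model plus post-stratification” means in PyMC (toy data and made-up census shares, not SALK’s actual model): partially pooled regional estimates from the survey are re-weighted by how large each region is in the population.

import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_regions = 5
region = rng.integers(0, n_regions, size=1000)              # region of each respondent
support = rng.binomial(1, 0.35 + 0.05 * region)             # simulated survey answers
census_share = np.array([0.30, 0.25, 0.20, 0.15, 0.10])     # population share per region

with pm.Model():
    mu = pm.Normal("mu", 0.0, 1.5)
    sigma = pm.HalfNormal("sigma", 1.0)
    a = pm.Normal("a", mu, sigma, shape=n_regions)           # partially pooled region effects
    p = pm.Deterministic("p", pm.math.sigmoid(a))            # support per region
    pm.Bernoulli("obs", p=p[region], observed=support)
    idata = pm.sample(random_seed=1)

# Post-stratification: weight each region's posterior by its census share
p_region = idata.posterior["p"].values                       # shape (chains, draws, n_regions)
national = (p_region * census_share).sum(axis=-1)
print(national.mean(), np.quantile(national, [0.05, 0.95]))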
We close this episode by discussing the responsibility that comes with modelling data in politics.
Transcript
Please note that the transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
------------------------------------------------------------------------------
Max Kochurov’s State of Bayes Lecture Series: https://www.youtube.com/playlist?list=PL1iMFW7frOOsh5KOcfvKWM12bjh8zs9BQ
Sign up here for upcoming lessons: https://www.meetup.com/pymc-labs-online-meetup/events/293101751/
------------------------------------------------------------------------------
We talk a lot about different MCMC methods on this podcast, because they are the workhorses of Bayesian models. But other methods exist to infer the posterior distributions of your models — like Sequential Monte Carlo (SMC), for instance. You’ve never heard of SMC? Well, perfect, because Nicolas Chopin is gonna tell you all about it in this episode!
A lecturer at the French school ENSAE since 2006, Nicolas is one of the world experts on SMC. Before that, he graduated from Ecole Polytechnique and… ENSAE, where he did his PhD from 1999 to 2003.
Outside of work, Nicolas enjoys spending time with his family, practicing aikido, and reading a lot of books.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor,, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady and Kurt TeKolste.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 82, Nicolas Chopin is our guest. He is a graduate of the Ecole Polytechnique and currently lectures at the French school ENSAE.
He is a specialist in Sequential Monte Carlo (SMC) samplers and explains in detail what they are, clearing up some confusion about what SMC stands for and when to use it.
We discuss the advantages of SMC over other commonly used samplers for Bayesian models, such as MCMC and Gibbs samplers.
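If you want to try SMC without leaving your usual toolbox, here is a minimal sketch assuming PyMC’s built-in tempered SMC sampler (a toy model of my own, not code from the episode or from Nicolas’ own software):

import numpy as np
import pymc as pm

y = np.random.default_rng(0).normal(1.0, 2.0, size=100)

with pm.Model():
    mu = pm.Normal("mu", 0.0, 5.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    pm.Normal("obs", mu, sigma, observed=y)
    # SMC propagates a population of particles through a sequence of tempered
    # posteriors, which also yields an estimate of the marginal likelihood.
    idata = pm.sample_smc(draws=2000, random_seed=0)

print(float(idata.posterior["mu"].mean()), float(idata.posterior["sigma"].mean()))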
Besides a detailed look at SMC, we also cover INLA, which stands for Integrated Nested Laplace Approximation.
INLA can be a fast, approximate inference method for specific kinds of models. It works well for geographic data and spatial relationships, such as those between regions in a country.
We discuss the difficulties with, and the future of, SMC, INLA, and probabilistic sampling in general.
Transcript
Please note that the transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Did you know that the way your brain perceives speed depends on your priors? And it’s not the same at night? And it’s not the same for everybody?
This is another of these episodes I love where we dive into neuroscience, how the brain works, and how it relates to Bayesian stats. It’s actually a follow-up to episode 77, where Pascal Wallisch told us how the famous black and blue dress tells a lot about our priors about how we perceive the world. So I strongly recommend listening to episode 77 first, and then come back here, to have your mind blown away again, this time by Alan Stocker.
Alan was born and raised in Switzerland. After a PhD in physics at ETH Zurich, he somehow found himself doing neuroscience during a postdoc at NYU. And he never stopped — he now leads the Computational Perception and Cognition Laboratory at the University of Pennsylvania.
But Alan is also a man of music (playing the piano when he can), a man of coffee (he’ll never refuse an Olympia Cremina or a Kafatek), and a man of the outdoors (he loves thrashing through deep powder on his snowboard).
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady and Kurt TeKolste.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 81 of the podcast, Alan Stocker helps us update our priors of how the brain works. Alan, born in Switzerland, studied mechanical engineering and earned his PhD in physics before being introduced to the field of neuroscience through an internship. He is now Associate Professor at the University of Pennsylvania.
Our conversation covers various topics related to the human brain and whether what it does can be characterised as Bayesian inference.
Low-level visual processing, such as identifying the orientation of moving gratings, can be explained in terms of Bayesian priors and updating under uncertainty. We go through several examples of this, such as driving a car in foggy conditions.
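The arithmetic behind these examples is the textbook Gaussian prior-times-likelihood update (my own illustration, with notation not taken from the episode): a prior over speeds with mean \mu_p and variance \sigma_p^2, combined with a noisy sensory measurement with mean \mu_\ell and variance \sigma_\ell^2, gives the posterior

\mu_{\mathrm{post}} = \frac{\sigma_\ell^{2}\,\mu_p + \sigma_p^{2}\,\mu_\ell}{\sigma_p^{2} + \sigma_\ell^{2}}, \qquad \sigma_{\mathrm{post}}^{2} = \frac{\sigma_p^{2}\,\sigma_\ell^{2}}{\sigma_p^{2} + \sigma_\ell^{2}}.

In fog or at night the measurement is noisier (\sigma_\ell^2 grows), so the posterior mean is pulled toward the prior mean, which for speed favours "slow", hence a systematic underestimation of speed in poor visibility.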
More abstract cognitive processes, such as reasoning about politics, may be more difficult to explain in Bayesian terms.
We also touch upon the question of to what degree priors may be innate, and how to educate people to change their priors.
In the end, Alan gives two recommendations for improving your Bayesian inferences in a political context: 1) Go out and get your own feedback and 2) try to give and receive true feedback. Listen to the episode for details.
Transcript
Please note that the transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I’m sure you know at least one Bart. Maybe you’ve even used one — but you’re not proud of it, because you didn’t know what you were doing. Thankfully, in this episode, we’ll go to the roots of regression trees — oh yeah, that’s what BART stands for. What were you thinking about?
Our tree expert will be no one else than Sameer Deshpande. Sameer is an assistant professor of Statistics at the University of Wisconsin-Madison. Prior to that, he completed a postdoc at MIT and earned his Ph.D. in Statistics from UPenn.
On the methodological front, he is interested in Bayesian hierarchical modeling, regression trees, model selection, and causal inference. Much of his applied work is motivated by an interest in understanding the long-term health consequences of playing American-style tackle football. He also enjoys modeling sports data and was a finalist in the 2019 NFL Big Data Bowl.
Outside of Statistics, he enjoys cooking, making cocktails, and photography — sometimes doing all of those at the same time…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, and Arkady.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In episode 80, Sameer Deshpande, assistant professor of Statistics at the University of Wisconsin-Madison, is our guest.
He had a passion for math from a young age and got into Bayesian statistics at university; he now teaches statistics himself. We talk about the intricacies of teaching Bayesian statistics, such as helping students accept that there are no objective answers.
Sameer’s current work focuses on Bayesian Additive Regression Trees (BART). He also works on prior specification and on numerous cool applied projects, for example on the effects that playing American football as an adolescent has on later health.
We primarily talk about BART as a way of approximating complex functions with a collection of step functions. It works pretty well off the shelf and can be plugged into various models, such as survival models, linear models, and smooth models. BART is somewhat analogous to splines and can capture trajectories over time well. However, it is also a bit of a black box, which makes it hard to interpret.
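If you want to see what this looks like in code, here is a minimal sketch using the pymc-bart add-on for PyMC, with simulated data; treat the exact function names and arguments as indicative of recent releases rather than guaranteed, and check the package docs for your version.

```python
import numpy as np
import pymc as pm
import pymc_bart as pmb  # separate package: pip install pymc-bart

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 3))             # toy covariates
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)  # nonlinear signal + noise

with pm.Model():
    # A sum of m regression trees (step functions) approximates f(X).
    mu = pmb.BART("mu", X, y, m=50)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    idata = pm.sample()
```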
We further touch upon some of his work on practical problems, such as how cognitive processes change over time or models of baseball umpires’ decision making.
Transcript
Please note that the following transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Decision-making and cost-effectiveness analyses rarely get as important as in health systems — where matters of life and death are not a metaphor. Bayesian statistical modeling is extremely helpful in this field, with its ability to quantify uncertainty, include domain knowledge, and incorporate causal reasoning.
Specialized in all these topics, Gianluca Baio was the person to talk to for this episode. He’ll tell us about these kinds of models, and how to understand them.
Gianluca is currently the head of the department of Statistical Science at University College London. He studied Statistics and Economics at the University of Florence (Italy), and completed a PhD in Applied Statistics, again at the beautiful University of Florence.
He’s also a very skilled pizzaiolo — so now I have two reasons to come back to visit Tuscany…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, and Arkady.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In this week’s episode, I talk to Gianluca Baio. He is the head of the department of Statistical Science at University College London and earned an MA and PhD in Statistics and Economics in Florence.
His work primarily focuses on Bayesian modeling for decision making in healthcare, for example in case studies assessing whether a novel treatment is worth its extra cost. Being a relatively young field, health economics seems more open to Bayesian statistics than more established fields.
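As a generic illustration of how posterior draws feed such decisions (this is a textbook-style sketch with simulated draws, not one of Gianluca's models), a common summary is the incremental net monetary benefit at a chosen willingness-to-pay threshold:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in posterior draws for a new treatment vs. standard of care.
delta_effect = rng.normal(0.05, 0.02, size=4000)   # extra QALYs per patient
delta_cost = rng.normal(1500.0, 400.0, size=4000)  # extra cost per patient

wtp = 30_000  # willingness to pay per QALY: a policy choice, not data
inb = wtp * delta_effect - delta_cost               # incremental net benefit

print("expected incremental net benefit:", inb.mean())
print("P(new treatment is cost-effective):", (inb > 0).mean())
```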
While Bayesian statistics is becoming more common in clinical trial research, many regulatory bodies still prefer classical p-values. Nonetheless, a lot of COVID modelling was done using Bayesian statistics.
We also talk about the purpose of statistics, which is not to prove things but to reduce uncertainty.
Gianluca explains that proper communication is important when eliciting priors and involving people in model building.
As for the future of Bayesian statistics, he argues that statistics should have more primacy, and he hopes it will stay central rather than becoming embedded in other approaches like data science; nonetheless, communication with other disciplines is crucial.
Transcript
Please note that the following transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Matt Hoffman has already worked on many topics in his life – music information retrieval, speech enhancement, user behavior modeling, social network analysis, astronomy, you name it.
Obviously, picking questions for him was hard, so we ended up talking more or less freely — which is one of my favorite types of episodes, to be honest.
You’ll hear about the circumstances in which Matt would advise picking up Bayesian stats, generalized HMC, blocked samplers, why the samplers he works on have food-based names, etc.
In case you don’t know him, Matt is a research scientist at Google. Before that, he did a postdoc in the Columbia Stats department, working with Andrew Gelman, and a Ph.D at Princeton, working with David Blei and Perry Cook.
Matt is probably best known for his work in approximate Bayesian inference algorithms, such as stochastic variational inference and the no-U-turn sampler, but he’s also worked on a wide range of applications, and contributed to software such as Stan and TensorFlow Probability.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode and Gabriel Stechschulte.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
written by Christoph Bamberg
In this episode, Matt D. Hoffman, a Google research scientist, discusses his work on probabilistic sampling algorithms with me. Matt has a background in music information retrieval, speech enhancement, user behavior modeling, social network analysis, and astronomy.
He came to machine learning (ML) and computer science through his interest in synthetic music and later took a Bayesian modeling class during his PhD.
He mostly works on algorithms, including Markov Chain Monte Carlo (MCMC) methods that can take advantage of hardware acceleration, believing that running many small chains in parallel is better for handling autocorrelation than running a few longer chains.
Matt is interested in Bayesian neural networks but is also skeptical about their use in practice.
He recently contributed to a generalised Hamiltonian Monte Carlo (HMC) sampler, and previously worked on an alternative to the No-U-Turn Sampler (NUTS) called MEADS. We discuss the applications for these samplers and how they differ from one another.
In addition, Matt introduces an improved R-hat diagnostic tool, nested R-hat, that he and colleagues developed.
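To connect the many-short-chains idea to something runnable, here is a hedged sketch in PyMC and ArviZ. It uses the standard rank-normalized split R-hat that ArviZ ships with, not the nested R-hat Matt describes, which as far as I know you would have to implement from his paper or its companion code.

```python
import pymc as pm
import arviz as az

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y", mu=mu, sigma=sigma, observed=[0.1, -0.3, 0.8, 0.2])

    # Many shorter chains instead of a few long ones; with GPU-friendly
    # samplers the chain count can be pushed much higher than this.
    idata = pm.sample(draws=300, tune=300, chains=8)

print(az.rhat(idata))  # convergence check across the 8 chains
```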
Automated Transcript
Please note that the following transcript was generated automatically and may therefore contain errors. Feel free to reach out if you’re willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I love dresses. Not on me, of course — I’m not nearly elegant enough to pull it off. Nevertheless, to me, dresses are one of the most elegant pieces of clothing ever invented.
And I like them even more when they change colors. Well, they don’t really change colors — it’s the way we perceive the colors that can change. You remember that dress that looked black and blue to some people, and white and gold to others? Well that’s exactly what we’ll dive into and explain in this episode.
Why do we literally see the world differently? Why does that even happen beyond our consciousness, most of the time? And cherry on the cake: how on Earth could this be related to… priors?? Yes, as in Bayesian priors!
Pascal Wallisch will shed light on all these topics in this episode. Pascal is a professor of Psychology and Data Science at New York University, where he studies a diverse range of topics including perception, cognitive diversity, the roots of disagreement and psychopathy.
Originally from Germany, Pascal did his undergraduate studies at the Free University of Berlin. He then received his PhD from the University of Chicago, where he studied visual perception.
In addition to scientific articles on psychology and neuroscience, he wrote multiple books on scientific computing and data science. As you’ll hear, Pascal is a wonderful science communicator, so it's only normal that he also writes for a general audience at Slate or the Creativity Post, and has given public talks at TedX and Think and Drink.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R and Nicolas Rode.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In our conversation, Pascal Wallisch, a professor of Psychology and Data Science at New York University, talked about his research on perception, cognitive diversity, the roots of disagreement, and psychopathy.
Pascal did his undergraduate studies at the Free University of Berlin and then received his PhD from the University of Chicago, where he studied visual perception. Pascal is also a TedX, Think and Drink speaker, and writer for Slate and Creativity Post.
We discussed Pascal's origin story, his current work on cognitive diversity, and the importance of priors in perception.
Pascal used the example of "the Dress" picture that went viral in 2015, where people saw either black and blue or white and gold. He explained how prior experience and knowledge can affect how people perceive colors and motion, and how priors can bias people for action.
We discussed to what extent the brain might be Bayesian and what functions are probably not so well described in Bayesian terms.
Pascal also discussed how priors can be changed through experience and exposure.
Finally, Pascal emphasized that people have different priors and perspectives, and that understanding these differences is crucial for creating a more diverse and inclusive society.
Automated Transcript
Please note that the following transcript was generated automatically and may therefore contain errors. Feel free to reach out if you’re willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
How does it feel to switch careers and start a postdoc at age 47? How was it to be one of the people who created the probabilistic programming language Stan? What should the Bayesian community focus on in the coming years?
These are just a few of the questions I had for my illustrious guest in this episode — Bob Carpenter. Bob is, of course, a Stan developer, and comes from a math background, with an emphasis on logic and computer science theory. He then did his PhD in cognitive and computer sciences, at the University of Edinburgh.
He moved from a professor position at Carnegie Mellon to industry research at Bell Labs, to working with Andrew Gelman and Matt Hoffman at Columbia University. Since 2020, he's been working at Flatiron Institute, a non-profit focused on algorithms and software for science.
In his free time, Bob loves to cook, see live music, and play role playing games — think Monster of the Week, Blades in the Dark, and Fate.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin and Raphaël R.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In this episode, you meet the man behind the code. Namely, Bob Carpenter, one of the core developers of Stan, a popular probabilistic programming language.
After working in computational linguistics for some time, Bob became a postdoc with Andrew Gelman to really learn statistics and modelling.
There, he and a small team developed the first implementation of Stan. We talk about the challenges associated with the team growing and with open-source conventions.
Besides the initial intention behind Stan and its beginnings, we talk about the future of probabilistic programming.
Creating a tool for people with different levels of mathematics and programming knowledge is a big challenge, and working with such a general tool can also be more difficult for the user.
We discuss why Bayesian statistical programming is popular nonetheless and what makes it uniquely adequate for research.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
If you’re a nerd like me, you’re always curious about the physics of any situation. So, obviously, when I watched Top Gun 2, I became fascinated by the aerodynamics of fighter jets. And it so happens that one of my friends used to be a fighter pilot for the Canadian army… Immediately, I thought this would make for a cool episode — and here we are!
Actually, Jason Berndt wanted to be a pilot from the age of 3. When he was 6, he went to an air show, and then specifically wanted to become a fighter pilot. In his teens, he learned how to fly sailplanes, small single-engine aircraft. At age 22, he got a bachelor’s in aero engineering from the Royal Military College, and then — well, he’ll tell you the rest in the episode.
Now in his thirties, he owns real estate and created his own company, My Two Brows, selling temporary eyebrow tattoos — which, weirdly enough, is actually related to his time in the army…
In his free time, Jason plays the guitar, travels around the world (that’s actually how we met), and loves chasing adrenaline however he can (paragliding, scuba diving, you name it!).
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin and Raphaël R.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
In this episode of the Learning Bayesian Statistics podcast, we do not talk about Bayesianism, let alone statistics. Instead, we dive into the world of fighter jets and Top Gun pilots with Jason Berndt.
Jason is a former fighter jet pilot turned entrepreneur. He looks back at his time as a pilot, how he got there, the challenges and thrills of this job and how it influences him now in his new life.
We also touch upon physics- and science-related aspects like G-force, centrifugal force, automation in critical environments such as flying a fighter jet, and human-computer interaction.
Jason discusses the recent movie Top Gun: Maverick, how realistic the flying was, and how faithfully it depicts fighter pilots’ lives.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
We need to talk. I had trouble writing this introduction. Not because I didn’t know what to say (that’s hardly ever an issue for me), but because a conversation with Adrian Seyboldt always takes deliciously unexpected turns.
Adrian is one of the most brilliant, interesting and open-minded people I know. It turns out he’s courageous too: although he’s not a fan of public speaking, he accepted my invitation on this show — and I’m really glad he did!
Adrian studied math and bioinformatics in Germany and now lives in the US, where he enjoys doing maths, baking bread and hiking.
We talked about the why and how of his new project, Nutpie, a more efficient implementation of the NUTS sampler in Rust. We also dived deep into the new ZeroSumNormal distribution he created and that’s available from PyMC 4.2 onwards — what is it? Why would you use it? And when?
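For the curious, here is roughly what calling Nutpie from Python looks like on a toy model; the package is young, so the function names below reflect its README at the time of writing and may change.

```python
import numpy as np
import pymc as pm
import nutpie

with pm.Model() as model:
    mu = pm.Normal("mu", 0.0, 1.0)
    pm.Normal("y", mu=mu, sigma=1.0, observed=np.array([0.2, -0.5, 1.0]))

# Compile the PyMC model once, then sample with the Rust NUTS implementation.
compiled = nutpie.compile_pymc_model(model)
trace = nutpie.sample(compiled)  # see the Nutpie README for tuning options
```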
Adrian will also tell us about his favorite type of models, as well as what he currently sees as the biggest hurdles in the Bayesian workflow.
Each time I talk with Adrian, I learn a lot and am filled with enthusiasm — and now I hope you will too!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey and Andreas Kröpelin.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Adrian Seyboldt, the guest of this week’s episode, is an active developer of the PyMC library in Python and of his new tool Nutpie in Rust. He is also a colleague at PyMC Labs and a friend. So naturally, this episode gets technical and nerdy.
We talk about parametrisation, a topic important for anyone trying to implement a Bayesian model, and what to do or avoid (don't use the mean of the data!).
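A minimal illustration of that parenthetical warning, on my reading of it (namely, that plugging summaries of the observed data into your priors double-counts the information); the models below are toy sketches, not Adrian's examples.

```python
import numpy as np
import pymc as pm

y = np.array([2.1, 1.8, 2.5, 2.2])  # toy data

# Questionable: the prior is built from the very data it will be updated with.
with pm.Model() as double_dipping:
    mu = pm.Normal("mu", mu=y.mean(), sigma=y.std())
    pm.Normal("obs", mu=mu, sigma=1.0, observed=y)

# Preferable: a fixed, weakly informative prior chosen before looking at y
# (rescaling the data to roughly unit scale makes such defaults easier to pick).
with pm.Model() as fixed_prior:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    pm.Normal("obs", mu=mu, sigma=1.0, observed=y)
```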
Adrian explains a new approach to parametrising categorical effects, using the ZeroSumNormal distribution that he developed. The approach is explained in an accessible way with examples, so everyone can understand and implement it themselves.
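Here is a hedged sketch of what that can look like in PyMC for a simple group-effects model; the ZeroSumNormal argument names follow recent PyMC releases, so double-check the docs for the version you run.

```python
import numpy as np
import pymc as pm

group_idx = np.array([0, 1, 2, 3, 0, 1, 2, 3])
y = np.array([1.2, 0.4, -0.3, 0.9, 1.0, 0.6, -0.1, 1.1])
n_groups = 4

with pm.Model():
    intercept = pm.Normal("intercept", 0.0, 1.0)
    # Group effects constrained to sum to zero, so they are identified
    # relative to the intercept instead of trading off with it.
    group_effect = pm.ZeroSumNormal("group_effect", sigma=1.0, shape=n_groups)
    mu = intercept + group_effect[group_idx]
    pm.Normal("y_obs", mu=mu, sigma=0.5, observed=y)
    idata = pm.sample()
```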
We also talk about further technical topics like initialising a sampler, the use of warm-up samples, mass matrix adaptation and much more. The difference between probability theory and statistics, as well as his view on the challenges in Bayesian statistics, completes the episode.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I’m guessing you already tried to communicate the results of a statistical model to non-stats people — it’s hard, right? I’ll be honest: sometimes I’d even rather take notes during meetings than do that… But shhh, that’s our secret.
But all of this was before. Before I talked with Jessica Hullman. Jessica is the Ginny Rometty associate professor of computer science at Northwestern University.
Her work revolves around how to design interfaces to help people draw inductive inferences from data. Her research has explored how to best align data-driven interfaces and representations of uncertainty with human reasoning capabilities, which is what we’ll mainly talk about in this episode.
Jessica also tries to understand the role of interactive analysis across different stages of a statistical workflow, and how to evaluate data visualization interfaces.
Her work has received multiple best paper and honorable mention awards, and she frequently speaks and blogs on topics related to visualization and reasoning about uncertainty — as usual, you’ll find the links in the show notes.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox and Trey Causey.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
General links from the show:
Some of Jessica’s research that she mentioned:
Behavioral economics paper Jessica mentioned:
More on David Blackwell:
Abstract:
Professor Jessica Hullman from Northwestern University is an expert in designing visualisations that help people learn from data and not fall prey to biases.
She focuses on the proper communication of uncertainty, both theoretically and empirically.
She addresses questions like “Can a Bayesian model of reasoning explain apparently biased reasoning?”, “What kind of visualisation guides readers best to a valid inference?”, “How can biased reasoning be so prevalent - are there scenarios where not following the canonical reasoning steps is optimal?”.
In this episode we talk about her experimental studies on communication of uncertainty through visualisation, in what scenarios it may not be optimal to focus too much on uncertainty and how we can design models of reasoning that can explain actual behaviour and not discard it as biased.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
What happens inside a black hole? Can we travel back in time? Why is the Universe even here? This is the type of chill questions that we’re all asking ourselves from time to time — you know, when we’re sitting on the beach.
This is also the kind of questions Daniel Whiteson loves to talk about in his podcast, “Daniel and Jorge Explain the Universe”, co-hosted with Jorge Cham, the author of PhD comics. Honestly, it’s one of my favorite shows ever, so I warmly recommend it. Actually, if you’ve ever hung out with me in person, there is a high chance I started nerding out about it…
Daniel is, of course, a professor of physics, at the University of California, Irvine, and also a researcher at CERN, using the Large Hadron Collider to search for exotic new particles — yes, these are particles that put little umbrellas in their drinks and taste like coconut.
In his free time, Daniel loves reading, sailing and baking — I can confirm that he makes a killer Nutella roll!
Oh, I almost forgot: Daniel and Jorge wrote two books — We Have No Idea and FAQ about the Universe — which, again, I strongly recommend. They are among my all-time favorites.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek and Paul Cox.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Big questions are tackled in episode 72 of the Learning Bayesian Statistics Podcast: “What is the nature of the universe?”, “What is the role of science?”, “How are findings in physics created and communicated?”, “What is randomness actually?”. This episode’s guest, Daniel Whiteson, is just the right person to address these questions.
He is well-known for his own podcast “Daniel and Jorge Explain the Universe”, wrote several popular science books on physics and works as a particle physicist with data from the particle physics laboratory CERN.
He manages to make sense of astronomy, although he is not much of a stargazer himself. Daniel prefers to look for weird stuff in the data of colliding particles and ask unexpected questions.
This comes with great statistical challenges that he tackles with Bayesian statistics and machine learning, while he also subscribes to the frequentist philosophy of statistics.
In the episode, Alex and Daniel touch upon many of the great ideas in quantum physics, the Higgs boson, Schrödinger’s cat, John Bell’s quantum entanglement discoveries, true random processes and much more. Mixed in throughout are pieces of advice for anyone scientifically-minded and curious about the universe.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
This episode will show you different sides of the tech world. The one where you research and apply algorithms, where you get super excited about image recognition and AI-generated art. And the one where you support social change actors — aka the “AI for Good” movement.
My guest for this episode is, quite naturally, Julien Cornebise. Julien is an Honorary Associate Professor at UCL. He was an early researcher at DeepMind where he designed its early algorithms. He then worked as a Director of Research at ElementAI, where he built and led the London office and “AI for Good” unit.
After his theoretical work on Bayesian methods, he had the privilege to work with the NHS to diagnose eye diseases; with Amnesty International to quantify abuse on Twitter and find destroyed villages in Darfur; with Forensic Architecture to identify teargas canisters used against civilians.
Other than that, Julien is an avid reader, and loves dark humor and picking up his son from school at the “hour of the daddies and the mommies”.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken, Or Duek, Pavel Dusek and Paul Cox.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Julien Cornebise goes on a deep dive into deep learning with us in episode 71. He calls himself a “passionate, impact-driven scientist in Machine Learning and Artificial Intelligence”. He holds an Honorary Associate Professor position at UCL, was an early researcher at DeepMind, went on to become Director of Research at ElementAI and worked with institutions ranging from the NHS in Great Britain to Amnesty International.
He is a strong advocate for using Artificial Intelligence and computer engineering tools for good and cautions us to think carefully about whom we develop models and tools for, always asking: what could go wrong? How could this be misused? The list of projects where he used his computing skills for good is long and diverse: with the NHS he developed methods to measure and diagnose eye diseases; for Amnesty International he helped quantify the abuse female journalists receive on Twitter, based on a database of tweets labeled by volunteers.
Beyond these applied projects, Julien and Alex muse about the future of structured models in times of ever more popular deep learning approaches, and about the fascinating potential of these new approaches. He advises anyone interested in these topics to get comfortable experimenting by themselves and potentially breaking things in a non-consequential environment.
And don’t be too intimidated by more seasoned professionals, he adds, because they probably have imposter syndrome themselves, which is a sign of being aware of one's own limitations.
Automated Transcript
Please note that the following transcript was generated automatically and may therefore contain errors. Feel free to reach out if you’re willing to correct them.
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Back in 2016, when I started dedicating my evenings and weekends to learning how to code and do serious stats, I was a bit lost… Where do I start? Which language do I pick? Why are all those languages just named with one single letter??
Then I found some stats classes by Justin Bois — and it was a tremendous help to get out of that wood (and yes, this was a pun). I really loved Justin’s teaching because he was making the assumptions explicit, and also explained them — which was so much more satisfying to my nerdy brain, which always wonders why we’re making this assumption and not that one.
So of course, I’m thrilled to be hosting Justin on the show today! Justin is a Teaching Professor in the Division of Biology and Biological Engineering at Caltech, California, where he also did his PhD. Before that, he was a postdoc in biochemistry at UCLA, as well as the Max Planck Institute in Dresden, Germany.
Most importantly for the football fans, he’s a goalkeeper — actually, the day before recording, he saved two penalty kicks… and even scored a goal! A big fan of Los Angeles Football Club, Justin is also a magic enthusiast — he is indeed a member of the Magic Castle…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas, Robert Yolken and Or Duek.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Abstract
Justin Bois did his bachelor’s and PhD in Chemical Engineering before working as a Postdoctoral Researcher in Biological Physics, Chemistry and Biological Engineering. He now works as a Teaching Professor in the division of Biology and Biological Engineering at Caltech, USA.
He first got into Bayesian statistics like many scientists in fields like biology or psychology: by wanting to understand what the statistics he was using actually mean. His central question was “what is parameter estimation, actually?”. After all, that’s a lot of what doing quantitative science is on a daily basis!
The Bayesian framework allowed him to find an answer and made him feel like a more complete scientist. As a teaching professor, he is now helping students of life sciences such as neuroscience or biological engineering to become true Bayesians.
His teaching covers what you need to become a proficient Bayesian analyst, from opening datasets to Bayesian inference. He emphasizes the importance of models implicit in quantitative research and shows that we do in most cases have a prior idea of an estimand’s magnitude.
Justin believes that we are naturally programmed to think in a Bayesian framework but still should mess up sometimes to learn that statistical techniques are fragile. You can find some of his teaching on his website.
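Since the “what is parameter estimation actually?” question comes back in the transcript below (least squares with scipy.optimize.curve_fit versus a posterior over parameters), here is a small illustrative contrast on made-up straight-line data; the grid-based posterior is a deliberately crude sketch, not a recommended workflow.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 30)
y = 2.0 * x + 1.0 + rng.normal(0.0, 2.0, size=x.size)  # toy data

# Point estimate: minimize the sum of squared residuals ("find the best fit").
popt, _ = curve_fit(lambda x, a, b: a * x + b, x, y)
print("least-squares (slope, intercept):", popt)

# The Bayesian restatement of the same question: a posterior over the slope.
# Crude grid approximation with a flat prior; intercept and noise sd are held
# fixed purely for brevity.
slopes = np.linspace(1.0, 3.0, 401)
log_like = np.array(
    [stats.norm.logpdf(y, loc=a * x + popt[1], scale=2.0).sum() for a in slopes]
)
post = np.exp(log_like - log_like.max())
post /= post.sum()
print("posterior mean slope:", (slopes * post).sum())
```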
Transcript
This transcript was generated automatically. Some transcription errors may have remained. Feel free to reach out if you're willing to correct them.
[00:00:00] In 2016, when I started dedicating my evenings and weekends to learning how to code and do serious stats, I was a bit lost, to be honest. Where do I start? Which language do I pick? Why are all those languages just named with one single letter, like R or C? Then I found some stats classes by Justin Bois.
And it was a tremendous help to get out of that wood. And yes, this was a pun. I really enjoyed Justin's teaching because he was making the assumptions explicit, and he also explained them, which was so much more satisfying to my nerdy brain, which always wonders why we're making this assumption and not that one.
So of course, I'm thrilled to be hosting Justin on the show today. Justin is a teaching professor in the division of biology and biological engineering at Caltech, California, where he also did his PhD. Before that, he was a postdoc in biochemistry at UCLA as well as the Max Planck Institute in Dresden, Germany.
Most importantly, for the football fans, Justin is a goalkeeper. [00:01:00] Actually, the day before recording, he saved two penalty kicks, and even scored a goal. Yes, a big fan of Los Angeles Football Club, Justin is also a magic enthusiast. He is indeed a member of the Magic Castle. This is Learning Bayesian Statistics.
Episode 70, recorded September 2nd, 2022. Welcome to Learning Bayesian Statistics, a fortnightly podcast on Bayesian inference, the methods, the projects, and the people who make it possible. I'm your host, Alex Andorra. You can follow me on Twitter at alex underscore andorra, like the country. For any info about the podcast, learnbayesstats.com is Laplace to be. Show notes, becoming a corporate sponsor, supporting LBS on Patreon,
unlocking Bayesian merch, everything is in there. That's learnbayesstats.com. If, with all that info, a model is still resisting you, or if you find my voice especially smooth and [00:02:00] want me to come and teach Bayesian stats in your company, then reach out at [email protected] or book a call with me at learnbayesstats.com.
Thanks a lot, folks, and best Bayesian wishes to you all. Let me show you how to be a good Bayesian, change your predictions after taking information in, and if you're thinking I'll be less than amazing, let's adjust those expectations. What's a Bayesian? It's someone who cares about evidence and doesn't jump to assumptions based on intuitions and prejudice.
A Bayesian makes predictions on the best available info and adjusts the probability 'cause every belief is provisional. And when I kick a flow, mostly I'm watching eyes widen. Maybe 'cause my likeness lowers expectations of tight rhymin'. How would I know unless I'm rhymin' in front of a bunch of blind men, droppin' placebo-controlled science like I'm Richard Feynman. Justin Bois,
welcome to Learning Bayesian Statistics. Thank you. Happy to be here. Yes. Very [00:03:00] happy to have you here because, well, you know that, but listeners do not. But you are actually one of the first people who introduced me back to, uh, statistics and programming in 2017 when I started my career shift. So it's awesome to have you here today.
I'm glad my stuff helped you get going. That's, that's the point. That's the goal. Yeah. Yeah, that's really cool. And also, I'm happy to have learned how you pronounce your last name because in French, you know, that's a French name. I dunno if you have some French origin, but in French it means, I know, I know it's a French name, but it's actually, as far as I understand, my family's from Northern Germany and there's a, a name there that's spelled b e u s s, like, and it's pronounced like in Germany, you say Boce.
And then it got anglicized, I think, when I moved to the US. But uh, I was actually recently, just this past summer, in Lausanne, Switzerland, and there was a giant wood recycling bin with my name on it: it said Bois. So I got my picture taken next to that. So yeah. Yeah, Lausanne is in the French-speaking part of Switzerland. [00:04:00]
That's right. Cool. So we're starting already with the origin story, so I love that cuz it's actually always my first question. So how did you jump into the stats and biology worlds, and like how sinuous of a path was it? Well, I think the path that I had toward really thinking carefully about statistical inferences is a very common path among scientists, meaning scientists outside of data scientists and, and maybe also outside of really data rich branches of sciences such as astronomy.
So I studied chemical engineering as an undergraduate. It was a standard program. I didn't really do any undergrad research or anything, but I got into a little bit of statistics when I had a job at Kraft Foods after undergrad, where I worked with the statistician on doing some predictive modeling about, uh, some food safety issues.
And I thought it was interesting, but I sort of just, I was an engineer. I was making the product, I was implementing the stuff in the production facility, and the statistician kind of took care of [00:05:00] everything else. I thought, I thought he was one of the coolest people in the company. Um, but I didn't really, you know, it didn't really hook me into really thinking about that.
But I went and did a PhD, and my PhD really didn't involve much experimentation at all. I was actually doing computational modeling of like how nucleic acids get their structure and shape and things. And that was, it just didn't really involve analysis of much data. Then in my post-doctoral studies, in my post-doctoral work, I was working with some experimentalists who had some data sets and they needed to
do estimates of parameters based on some theoretical models that I had derived or worked on. And I had done some stuff in, you know, various lab classes and stuff, but it's your standard thing. It's like, ooh, I know how to do a curve fit. Meaning I can, I guess the Python way I would do it is scipy.optimize.curve_fit.
Or, you know, in MATLAB I could do a least squares or something like that. And, and I knew this idea of minimizing the sum of the squares of the residuals, and that's gonna get you [00:06:00] a line that looks close to what your data points are. But the inference problems, the theoretical curves were actually a little bit, say, for some of 'em,
there was no closed-form solution. They were actually solutions to differential equations. And so the actual theoretical treatment I had was a little bit more complicated. And so I needed to start to think a little bit more carefully about exactly how we're going about estimating the parameters thereof.
Right? And so I kind of just started grabbing, uh, books, and I discovered quickly that I had no idea what I was doing, and actually neither did anybody around me. And I don't mean that pejoratively, it's just, it's a very common thing among the scientists, a lot of people in the sciences that aren't, that don't work as much with data.
And perhaps it's less common now, but it's definitely more common than, you know, 10, 15, uh, years ago. And so I just kind of started looking into how we should actually think about the estimates of [00:07:00] parameters given a data set. And really what happened was the problem became crystallized for me, the problem of parameter estimation.
And I had never actually heard that phrase, parameter estimation. To me, it was: find the best-fit parameters. If your curve goes through your data points, that means that you're, the theory that you derived is probably pretty good. And of course, I didn't think about what the word probably meant there. I, I only knew it colloquially, right?
And so, cuz I was focused on deriving what the theory is. And of course that's a whole, hugely important part of, of the scientific enterprise. But once you get that theory derived, to try to estimate the parameters that are present in that theory from measurement, that problem just became clear to me.
Once I had a clear problem statement, then I was able to start to think about how to solve it. And so the problem statement was, I have a theory that has a set of parameters. I want to try to figure out what the parameters are by taking [00:08:00] some measurements and checking: for one set of parameters, the measurements would be different.
How do I find what parameters there are to, to give me this type, type of data that I observe? I intentionally just stated that awkwardly because that awkwardness there sort of made the, it's funny, it made it clear to me that the problem was unclear. And, and so I, that's what got me into a Bayesian mode of thinking, because it was hard for me to wrap my head around what it meant to do that thing that I've been doing all this time.
This minimizing sums of squares of residuals and trying to find the best-fit parameter. And, you know, in retrospect now I've actually, you know, that I taught myself, cause I didn't really ever have a course in statistical inference or anything like that, say, okay, I was essentially doing a maximum likelihood estimation, which is a frequentist way of doing parameter estimation.
And I, and I hadn't actually thought about what that meant. I mean, I understand that now. We don't really need to talk [00:09:00] about that since we're talking about Bayesian stuff now, but, and it was just harder for me to wrap my head around what that meant. And so I started reading about the Bayesian interpretation of probability, and it was really, it really just crystallized everything and made it clear, and then I could state the problem much more clearly.
The problem was I was trying to find a posterior probability density function for these parameters given the data, and that was just so much more clearly stated in a Bayesian framework, and then that kinda lit me on fire because I was like, holy cow, this thing that we do so often in the scientific enterprise, I can actually state the question, right?
And I just thought that was such a profound moment, and then I was kind of hooked from there on out and I, I was constantly trying to improve how I thought about these things. And yeah, so I did a lot of reading. I realized I just talked a lot. You probably have [00:10:00] some questions about some of the stuff I just said, so please.
Oh yeah, well wait. But, um, I mean, that's good to have a, an overview like that. And so I guess that's also like, it sounds like you were introduced to Bayesian statistics at the same time as you were doing that deep dive into, wait, like, I'm not sure I understand what I'm using then. Oh, actually I don't understand anything, and then I have to learn about that.
But it seems that you, you were also introduced to Bayesian stats at that same time, is that right? Yeah, I think so. And I think this is actually sort of a classic way in which scientists come up with what it is that they want to study. Because instead you start poking around, you kind of don't really know where the holes in your knowledge are.
And so what I saw was like just a giant hole in my knowledge and my toolbox, and I saw the hole and I said, all right, let's fill it. And um, and so then I just started feeling around on how to do that. I see. And I am also curious as [00:11:00] to, what motivated you to dive into the Bayesian way of doing things?
I really do think it was the clarity. I think that, okay, I think that arguing about, like, what interpretation of probability you wanna use is not the most fruitful way to spend one's time. For me, it was really, it was just so much more intuitive. I felt like I could have this interpretation of probability, that it's, it's a quantification of the plausibility of a logical conjecture, of any logical conjecture, which gave me sort of the flexibility where I could think about, like, a...
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
A great franchise comes with a great rivalry: Marvel has Iron Man and Captain America; physics has General Relativity and Quantum Physics; and Bayesian stats has Posterior Estimation and… Bayes Factors!
A few months ago, I had the pleasure of hosting EJ Wagenmakers, to talk about these topics. This time, I’m talking with Jorge Tendeiro, who has a different perspective on Null Hypothesis Testing in the Bayesian framework, and its relationship with generative models and posterior estimation.
But this is not your classic, click-baity podcast, and I’m not interested in pitching people against each other. Instead, you’ll hear Jorge talk about the other perspective fairly, before even giving his take on the topic. Jorge will also tell us about the difficulty of arguing through papers, and all the nuances you lose compared to casual discussions.
But who is Jorge Tendeiro? He is a professor at Hiroshima University in Japan, and he was recommended to me by Pablo Bernabeu, a listener of this very podcast.
Before moving to Japan, Jorge studied math and applied stats at the University of Porto, and did his PhD in the Netherlands. He focuses on item response theory (specifically person fit analysis), and, of course, Bayesian statistics, mostly Bayes factors.
He’s also passionate about privacy issues in the 21st century, an avid Linux user since 2006, and is trying to get the hang of the Japanese language.
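If you have never actually computed a Bayes factor, here is a toy, textbook-style example (not taken from Jorge's work): a point null for a binomial proportion against a uniform prior on the alternative.

```python
from math import comb
from scipy.special import beta

k, n = 61, 100  # say, 61 successes out of 100 trials

# Marginal likelihood under H0: theta is exactly 0.5.
m0 = comb(n, k) * 0.5**n
# Marginal likelihood under H1: theta ~ Beta(1, 1), i.e. uniform on (0, 1).
m1 = comb(n, k) * beta(k + 1, n - k + 1)

print("BF01 (evidence for the point null):", m0 / m1)
print("BF10 (evidence against it):", m1 / m0)
```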
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, David Haas and Robert Yolken.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Hosting someone like Kevin Murphy on your podcast is… complicated. Not because Kevin himself is complicated (he’s delightful, don’t make me say what I didn’t say!), but because all the questions I had for him amounted to a 12-hour show.
Sooooo, brace yourselves folks!
No, I'm kidding. Of course I didn’t do that folks, Kevin has a life! This life started in Ireland, where he was born. He grew up in England and got his BA from the University of Cambridge. After his PhD at UC Berkeley, he did a postdoc at MIT, and was an associate professor of computer science and statistics at the University of British Columbia in Vancouver, Canada, from 2004 to 2012. After getting tenure, he went to Google in California in 2011 on his sabbatical and then ended up staying.
He currently runs a team of about 8 researchers inside of Google Brain working on generative models, optimization, and other, as Kevin puts it, “basic” research topics in AI/ML. He has published over 125 papers in refereed conferences and journals, as well as 3 textbooks on machine learning published in 2012, 2022 and the last one coming in 2023. You may be familiar with his 2012 book, as it was awarded the DeGroot Prize for best book in the field of statistical science.
Outside of work, Kevin enjoys traveling, outdoor sports (especially tennis, snowboarding and scuba diving), as well as reading, cooking, and spending time with his family.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Lin Yu Sha, Scott Anthony Robson, David Haas and Robert Yolken.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Is there life in the Universe? It doesn’t get deeper than this, does it? And yet, why do we care about that? Even if there is a very small chance that other life exists in the Universe, there is an even smaller chance that we’ll discover it, talk to it and meet it. So, why do we care?
Well, it may surprise you but Bayesian statistics helps us think about these astronomical and — dare I say? — philosophical topics, as my guest, David Kipping, will brilliantly explain in this episode.
David is an Associate Professor of Astronomy at Columbia University, where he leads the Cool Worlds Lab — I know, the name is awesome. His team’s research spans exoplanet discovery and characterization, the search for life in the Universe and developing novel approaches to our exploration of the cosmos.
David also teaches astrostatistics, and his contributions to Bayesian statistics range from astrobiology to exoplanet detection. He also hosts the Cool Worlds YouTube channel, with over half a million subscribers, which discusses his team’s work and broader topics within the field.
Cool worlds, cool guest, cool episode.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Lin Yu Sha, Scott Anthony Robson, David Haas and Robert Yolken.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I have to confess something: I love challenges. And when you’re a podcaster, what’s a better challenge than dedicating an episode to… visualization? Impossible, you say? Well, challenge accepted!
Thankfully, I got the help of a visualization Avenger for this episode — namely, Matthew Kay. Matt is an Assistant Professor jointly appointed in Computer Science and Communications Studies at Northwestern University, where he co-directs the Midwest Uncertainty Collective — I know, it’s a pretty cool name for a lab.
He works in human-computer interaction and information visualization, and especially in uncertainty visualization. He also builds tools to support uncertainty visualization in R. In particular, he’s the author of the tidybayes and ggdist R packages, and wrote the random variable interface in the posterior package.
I promise, you won’t be uncertain about the importance of uncertainty visualization after that…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Lin Yu Sha and Scott Anthony Robson.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Folks, there are some new cool kids on the block. They are called PyMC, Aeppl, and Aesara, and it’s high time we give them a proper welcome!
To do that, who better than one of the architects of the new PyMC 4.0 — Ricardo Vieira! In this episode, he’ll walk us through the inner workings of the newly released version of PyMC, telling us why the Aesara backend and the brand new RandomVariable operators constitute such strong foundations for your beloved PyMC models. He will also tell us about a self-contained PPL project called Aeppl, dedicated to converting model graphs to probability functions — pretty cool, right?
Oh, in case you didn’t guess yet, Ricardo is a PyMC developer and data scientist at PyMC Labs. He spent several years teaching himself Statistics and Computer Science at the expense of his official degrees in Psychology and Neuroscience.
So, get ready for efficient random generator functions, better probability evaluation functions, and a fully-fledged modern Bayesian workflow!
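If you’re wondering what all that looks like in practice, here is a minimal sketch I wrote for these show notes (my own illustration, not code from the episode): every distribution below is backed by a RandomVariable operator in the Aesara graph, and the log-probability function the sampler uses is derived from that model graph, which is exactly the kind of job a project like Aeppl is dedicated to.

```python
# A minimal, hypothetical example of a PyMC 4.0 model (not taken from the episode).
import numpy as np
import pymc as pm

rng = np.random.default_rng(2022)
data = rng.normal(loc=1.0, scale=2.0, size=200)  # synthetic observations

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=5.0)              # prior, a RandomVariable node
    sigma = pm.HalfNormal("sigma", sigma=5.0)            # prior on the noise scale
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)  # likelihood
    idata = pm.sample(1000, tune=1000)                   # NUTS on the compiled logp graph
```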
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Lin Yu Sha and Scott Anthony Robson.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
I’m sure you’ve already heard of gravitational waves, because my listeners are the coolest and smartest ever ;) But did you know about gravity waves? That’s right, waves in the sky due to gravity — sounds awesome, right?
Well, I’m pretty sure that Laura Mansfield will confirm your prior. Currently a postdoc at Stanford University, Laura studies — guess what? — gravity waves and how they are represented in climate models. In particular, she uses Bayesian methods to estimate the uncertainty in the gravity wave components of the models.
Holding a PhD from the University of Reading in the UK, her background is in atmospheric physics, but she’s interested in climate change and environmental issues.
So sit back, chill out, and enjoy this physics-packed, aerial episode!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Lin Yu Sha and Scott Anthony Robson.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
Inviting someone like Luciano Paz on a stats podcast is both a pleasure and a challenge — he does so many things brilliantly that you have too many questions to ask him…
In this episode, I’ve chosen — not without difficulty — to focus on the applications of Bayesian stats in the marketing industry, especially Media Mix Models. Ok, I also asked Luciano about other topics — but you know me, I like to talk…
Originally, Luciano studied physics. He then did a PhD and postdoc in neuroscience, before transitioning into industry. During his time in academia, he used stats, machine learning and data science concepts here and there, but not in a very organized way.
But at the end of his postdoc, he got into PyMC — and that’s when everything changed… He loved the community and decided to hop on board, trading academia for a better life. After leaving academia, he worked at a company that wanted to do data science but that, for privacy reasons, didn’t have a lot of data. And now, Luciano is one of the folks working full time at the PyMC Labs consultancy.
But Luciano is not only one of the cool nerds building this crazy Bayesian adventure. He has also done a lot of piano and ninjutsu. Sooooo, don’t provoke him — either in the streets or at a karaoke bar…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh and Lin Yu Sha.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
We talk a lot about generative modeling on this podcast — at least since episode 6, with Michael Betancourt! And an area where this way of modeling is particularly useful is healthcare, as Maria Skoularidou will tell us in this episode.
Maria is a final year PhD student at the University of Cambridge. Her thesis is focused on probabilistic machine learning and, more precisely, towards using generative modeling in… you guessed it: healthcare!
But her fields of interest are diverse: from theory and methodology of machine intelligence to Bayesian inference; from theoretical computer science to information theory — Maria is knowledgeable in a lot of topics! That’s why I also had to ask her about mixture models, a category of models that she uses frequently.
Prior to her PhD, Maria studied Computer Science and Statistical Science at Athens University of Economics and Business. She’s also invested in several efforts to bring more diversity and accessibility to the data science world.
When she’s not working on all this, you’ll find her playing the ney, trekking or rowing.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton and Jeannine Sue.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
The big problems with classic hypothesis testing are well-known. And yet, a huge majority of statistical analyses are still conducted this way. Why is that? Why are things so hard to change? Can you even do (and should you do) hypothesis testing in the Bayesian framework?
I guess if you wanted to name this episode in a very Marvelian way, it would be “Bayes factors against the p-values of madness” — but we won’t do that, it wouldn’t be appropriate, would it?
Anyways, in this episode, I’ll talk about all these very light and uncontroversial topics with Eric-Jan Wagenmakers, a professor at the Psychological Methods Unit of the University of Amsterdam.
For almost two decades, EJ has staunchly advocated the use of Bayesian inference in psychology. In order to lower the bar for the adoption of Bayesian methods, he is coordinating the development of JASP, an open-source software program that allows practitioners to conduct state-of-the-art Bayesian analyses with their mouse — the one from the computer, not the one from Disney.
EJ has also written a children’s book on Bayesian inference with the title “Bayesian thinking for toddlers”. Rumor has it that he is also working on a multi-volume series for adults — but shhh, that’s a secret!
EJ’s lab publishes regularly on a host of Bayesian topics, so check out his website, particularly when you are interested in Bayesian hypothesis testing. The same goes for his blog by the way, “BayesianSpectacles”.
Wait, what’s that? EJ is telling me that he plays chess, squash, and that, most importantly, he enjoys watching arm wrestling videos on YouTube — yet another proof that, yes, you can find everything on YouTube.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland and Aubrey Clayton.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Why do we, humans, communicate? And how? And isn’t it a problem that, to study communication, we have to… communicate?
Did you ever ask yourself that? Because J.P. de Ruiter did — and does every day. But he’s got good reasons: JP is a cognitive scientist whose primary research focus is on the cognitive foundations of human communication. He aims to improve our understanding of how humans and artificial agents use language, gesture and other types of signals to effectively communicate with each other.
Currently he holds one of the two Bridge Professorships at Tufts University, with appointments in both the Computer Science and Psychology departments.
In this episode, we’ll look at why Bayes is helpful in dialogue research, what the future of the field looks like to JP, and how he uses PyMC in his own teaching.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland and Aubrey Clayton.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
In large-scale, one-off civil infrastructure, decision-making under uncertainty is part of the job; that’s just how it is. But civil engineers don’t get the luxury of building 10^6 versions of the bridge, offshore wind turbine or aeronautical structure to consider a relative frequency interpretation!
And as you’ll hear, challenges don’t stop there: you also have to consider natural hazards such as earthquakes, rockfall and typhoons — in case you were wondering, civil engineering is not among the boring jobs!
To talk about these original topics, I had the pleasure of hosting Michael Faber. Michael is a Professor at the Department of Built Environment at Aalborg University, Denmark, the President of the Joint Committee on Structural Safety, and a tremendously deep thinker on the Bayesian interpretation of probability as it pertains to the risk-informed management of big infrastructure.
His research interests center on the governance and management of risks, resilience and sustainability in the built environment — doing all that with Bayesian probabilistic modeling and applied Bayesian decision analysis, as you’ll hear.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland and Aubrey Clayton.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
You know when you have friends who wrote a book and pressure you to have them on your podcast? That’s super annoying, right?
Well that’s not what happened with Ravin Kumar, Osvaldo Martin and Junpeng Lao — I was the one who suggested doing a special episode about their new book, Bayesian Modeling and Computation in Python. And since they cannot say no to my soothing French accent, well, they didn’t say no…
All of them were on the podcast already, so I’ll refer you to their solo episodes for background on their background — aka backgroundception.
Junpeng is a Data Scientist at Google, living in Zurich, Switzerland. Previously, he was a post-doc in Psychology and Cognitive Neuroscience. His current obsessions are time series and state space models.
Osvaldo is a Researcher at CONICET in Argentina and at the Department of Computer Science of Aalto University in Finland. He is especially motivated by the development and implementation of software tools for Bayesian statistics and probabilistic modeling.
Ravin is a data scientist at Google, living in Los Angeles. Previously he worked at Sweetgreen and SpaceX. He became interested in Bayesian statistics when trying to quantify uncertainty in operations. He is especially interested in decision science in business settings.
You’ll form your own opinion, but I like their book because it uses a hands-on approach, focusing on the practice of applied statistics. And you get to see how to use diverse libraries, like PyMC, TensorFlow Probability, ArviZ, Bambi, and so on. You’ll see what I’m talking about in this episode.
To top it off, the book is fully available online at bayesiancomputationbook.com. If you want a physical copy (because you love those guys and wanna support them), go to the CRC website and enter the code ASA18 at checkout for a 30% discount.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland and Aubrey Clayton.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
No, no, don't leave! You did not click on the wrong button. You are indeed on Alex Andorra’s podcast. The podcast that took the Bayesian world by storm: “Learning Bayesian Statistics”, and that Barack Obama deemed “the best podcast in the whole galaxy” – or maybe Alex said that, I don’t remember.
Alex made us discover new methods, new ideas, and mostly new people. But what do we really know about him? Does he even really exist? To find out, I put on my Frenchest beret, tucked a baguette under my arm, and went undercover to track him down.
And I did! So today, for a special episode, I, Rémi Louf, will be the one asking questions and making bad jokes with a French accent.
Before letting him in, here’s what I got on him so far.
By day, Alex is a Bayesian modeler at the PyMC Labs consultancy. By night, he doesn’t (yet) fight crime but he’s an open-source enthusiast and core contributor to PyMC and ArviZ.
An always-learning statistician, Alex loves building models and studying elections and human behavior.
When he’s not working, he loves hiking, exercising, meditating and reading nerdy books and novels. He also loves chocolate a bit too much, but he doesn’t like talking about it – he prefers eating it.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland and Aubrey Clayton.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Did you know there is a relationship between the size of firetrucks and the amount of damage done to a flat during a fire? The bigger the truck sent to put out the fire, the bigger the damage tends to be. The solution is simple: just send smaller firetrucks!
Wait, that doesn’t sound right, does it? Our brain is a huge causal machine, so it can instinctively feel it’s not credible that size of truck and amount of damage done are causally related: there must be another variable explaining the correlation. Here, it’s of course the seriousness of the fire — even better, it’s the common cause of the two correlated variables.
Your brain does that automatically, but what about your computer? How do you make sure it doesn’t just happily (and mistakenly) report the correlation? That’s when causal inference and machine learning enter the stage, as Robert Osazuwa Ness will tell us.
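To make the common-cause story concrete, here is a tiny simulation I sketched for these show notes (it is not from the episode, and the numbers are made up): fire severity drives both truck size and damage, so the two end up strongly correlated even though neither causes the other, and the association vanishes once you adjust for severity.

```python
# Hypothetical simulation of the firetruck example: severity is the common cause.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
severity = rng.gamma(shape=2.0, scale=1.0, size=n)       # how bad the fire is
truck_size = severity + rng.normal(0.0, 0.5, size=n)     # bigger fires get bigger trucks
damage = 2.0 * severity + rng.normal(0.0, 0.5, size=n)   # bigger fires do more damage

# Marginally, truck size and damage are strongly correlated...
print(round(np.corrcoef(truck_size, damage)[0, 1], 2))

# ...but within a narrow band of severity (i.e., adjusting for the common cause),
# the association essentially disappears.
band = (severity > 1.9) & (severity < 2.1)
print(round(np.corrcoef(truck_size[band], damage[band])[0, 1], 2))
```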
Robert has a PhD in statistics from Purdue University. He currently works as a Research Scientist at Microsoft Research and is the founder of altdeep.ai, which teaches live cohort-based courses on advanced topics in applied modeling.
As you’ll hear, his research focuses on the intersection of causal and probabilistic machine learning. Maybe that’s why I invited him on the show… Well, who knows, causal inference is very hard!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland and Aubrey Clayton.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
What’s the common point between fiction, fake news, illusions and meditation? They can all be studied with Bayesian statistics, of course!
In this mind-bending episode, Dominique Makowski will for sure expand your horizons. Trained as a clinical neuropsychologist, he is currently working as a postdoc at the Clinical Brain Lab in Singapore, where he leads the Reality Bending Team. What’s reality bending, you ask? Well, you’ll have to listen to the episode, but I can already tell you we’ll go through a journey in scientific methodology, history of art, religion, and philosophy — what else?
Beyond that, Dominique tries to improve the access to advanced analysis techniques by developing open-source software and tools, like the NeuroKit Python package or the bayestestR package in R.
Even better, he takes after his role models. Like Marcus Aurelius, he plays the piano and guitar. Like Sisyphus, he loves the history of art and comparative mythology. And like Yoda, he is a wakeboard master.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Daniel Lindroth, Yoshiyuki Hamajima, Sven De Maeyer and Michael DeCrescenzo.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Let’s be honest: evolution is awesome! I started reading Improbable Destinies: Fate, Chance, and the Future of Evolution, by Jonathan Losos, and I’m utterly fascinated.
So I’m thrilled to welcome Florian Hartig on the show. Florian is a professor of Theoretical Ecology at the University of Regensburg, Germany. His research concentrates on theory, computer simulations, statistical methods and machine learning in ecology & evolution. He is also interested in open science and open software development, and maintains, among other projects, the R packages DHARMa and BayesianTools.
Among other things, we talked about approximate Bayesian computation, best practices when building models and the big pain points that remain in the Bayesian pipeline.
Most importantly, Florian’s main hobbies are whitewater kayaking, snowboarding, badminton and playing the guitar.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones and Daniel Lindroth.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Get a 30% discount on Todd's book by entering the code BDABNS22 at checkout!
The behavioral and neural sciences are a nerdy interest of mine, but I hadn’t dedicated any episode to that topic yet. But life brings you gifts sometimes (especially around Christmas…), and here that gift is a book, Bayesian Data Analysis for the Behavioral and Neural Sciences, by Todd Hudson.
Todd is a part of the faculty at New York University Grossman School of Medicine and also the New York University Tandon School of Engineering. He is a computational neuroscientist working in several areas including: early detection and grading of neurological disease; computational models of movement planning and learning; development of new computational and experimental techniques.
He also co-founded Tactile Navigation Tools, which develops navigation aids for the visually impaired, and Third Eye Technologies, which develops low cost laboratory- and clinical-grade eyetracking technologies.
As you’ll hear, Todd wanted his book to bypass the need for the advanced mathematics normally considered a prerequisite for this type of material. Basically, he wants students to be able to write code and models and understand equations, even if they are not specialized in writing those equations.
We’ll also touch on some of the neural sciences examples he’s got in the book, as well as the two general algorithms he uses for model measurement and model selection.
Oh, I almost forgot the most important part: Todd loves beekeeping and gardening — he’s got 25 apple trees, 4 cherry trees, nectarines, figs, strawberries, and more!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Alejandro Morales, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones and Daniel Lindroth.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Did I mention I like survey data, especially in the context of electoral forecasting? Probably not, as I’m a pretty shy and reserved man. Why are you laughing?? Yeah, that’s true, I’m not that shy… but I did mention my interest in electoral forecasting already!
And before doing a full episode where I’ll talk about French elections (yes, that’ll come at one point), let’s talk about one of France’s neighbors — Germany. Our German friends had federal elections a few weeks ago — consequential elections, since they had the hard task of replacing Angela Merkel, after 16 years in power.
To talk about this election, I invited Marcus Gross on the show, because he worked on a Bayesian forecasting model to try and predict the results of this election — who would be elected as Chancellor, by how much, and with which coalition?
I was delighted to ask him about how the model works, how it accounts for the different sources of uncertainty — be it polling errors, unexpected turnout or media events — and, of course, how long it takes to sample (I think you’ll be surprised by the answer).
We also talked about the other challenge of this kind of work: communication — how do you communicate uncertainty effectively? How do you differentiate motivated reasoning from useful feedback? What were the most common misconceptions about the model?
Marcus studied statistics in Munich and Berlin, and did a PhD on survey statistics and measurement error models in economics and archeology. He worked as a data scientist at INWT, a consulting firm with projects in different business fields as well as the public sector. Now, he is working at FlixMobility.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Alejandro Morales, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King and Aaron Jones.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
You know I love epistemology — the study of how we know what we know. It was high time I dedicated a whole episode to this topic. And what better guest than Aubrey Clayton, the author of the book Bernoulli's Fallacy: Statistical Illogic and the Crisis of Modern Science. I’m in the middle of reading it, and it’s a really great read!
Aubrey is a mathematician in Boston who teaches the philosophy of probability and statistics at the Harvard Extension School. He holds a PhD in mathematics from the University of California, Berkeley, and his writing has appeared in Pacific Standard, Nautilus, and the Boston Globe.
We talked about what he deems “a catastrophic error in the logic of the standard statistical methods in almost all the sciences” and why this error shows up even outside of science, in medicine, law, public policy, and so on.
But don’t worry, we’re not doomed — we’ll also see where we go from there. As a big fan of E. T. Jaynes, Aubrey will also tell us how this US scientist influenced his own thinking as well as the field of Bayesian inference in general.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Alejandro Morales, Tomáš Frýda, Ryan Wesslen and Andreas Netti.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Folks, this is the 50th episode of LBS — 50th! I never would have thought that there were so many Bayesian nerds in the world when I first interviewed Osvaldo Martin more than 2 years ago.
To celebrate that random, crazy adventure, I wanted to do a special episode at any random point, and so it looks like it’s gonna be #50! This episode is special by its guest, not its number — although my guest knows a thing or two about numbers. Most recently, he wrote the book Covid by Numbers.
A mathematical statistician dedicated to helping the general public understand risk, uncertainty and decision-making, he’s the author of several books on the topic actually, including The Art of Statistics. You may also know him from his podcast, Risky Talk, or his numerous appearances in newspapers, radio and TV shows.
Did you guess who it is?
Maybe you just know him as the reigning World Champion in Loop – a version of pool played on an elliptical table – and are just discovering now that he is a fantastic science communicator – something that turns out to be especially important for stats education in times of, let’s say, a global pandemic.
He holds a PhD in Mathematical Statistics from the University of London and has been the Chair of the Winton Centre for Risk and Evidence Communication at Cambridge University since 2016. He was also the President of the famous Royal Statistical Society in 2017-2018.
Most importantly, he was featured in BBC1’s Winter Wipeout in 2011 – seriously, go check it out on his website; it’s hilarious.
So did you guess it yet? Yep, my guest for this episode is none other than Sir David Spiegelhalter — yes, there are Bayesian knights!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Alejandro Morales and Tomáš Frýda.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
It’s been a while since I did an episode about sports analytics, right? And you know it’s a field I love, so… let’s do that!
For this episode, I was happy to host Ehsan Bokhari, not only because he has been listening to the podcast from the very beginning and spreads the word about it whenever he can, but mainly because he knows baseball analytics very well!
Currently Senior Director of Strategic Decision Making with the Houston Astros, he previously worked there as Senior Director of Player Evaluation and Director of R&D. And before that, he was Senior Director at the Los Angeles Dodgers from the 2015 to the 2018 season.
Among other things, we talked about what his job looks like, how Bayesian the field is, what pushback he gets, and what the future of baseball analytics looks like to him.
Ehsan also has an interesting background, coming from both psychology and mathematics. Indeed, he received a PhD in quantitative psychology and an MS in statistics at the University of Illinois in 2014.
Maybe most importantly, he loves reading non-fiction and spending time with his almost three-year-old son — to whom he reads Bayesian Probability for Babies, of course.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, and Alejandro Morales.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
In episode 40, we already got a glimpse of how useful Bayesian stats are in the speech and communication sciences. To talk about the frontiers of this field (and, as it happens, about best practices to make beautiful plots and pictures), I invited TJ Mahr on the show.
A speech pathologist turned data scientist, TJ earned his PhD in communication sciences and disorders in Madison, Wisconsin. On paper, he was studying speech development, word recognition and word learning in preschoolers, but over the course of his graduate training, he discovered that he really, really likes programming and working with data – we’ll of course talk about that in the show!
In short, TJ wrangles data, crunches numbers, plots pictures, and fits models to study how children learn to speak and communicate. On his website, he often writes about Bayesian models, mixed effects models, functional programming in R, or how to plot certain kinds of data.
He also got very into the deck-building game “Slay the Spire” this year, and his favorite YouTube channel is a guy who restores paintings.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, and Luis Iberico.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
The field of physics has brought tremendous advances to modern Bayesian statistics, especially by inspiring the current algorithms that let all of us enjoy Bayesian power on our own laptops.
I have already had some physicists on the show, like Michael Betancourt in episode 6, but in my legendary ungratefulness I hadn’t dedicated a whole episode to physics yet.
Well that’s now taken care of, thanks to JJ Ruby. Apart from having really good taste (he’s indeed a fan of this very podcast), JJ is currently a postdoctoral fellow for the Center for Matter at Atomic Pressures at the University of Rochester, and will soon be starting as a Postdoctoral Scholar at Lawrence Livermore National Laboratory, a U.S. Department of Energy National Laboratory.
JJ did his undergraduate work in Astrophysics and Planetary Science at Villanova University, outside of Philadelphia, and completed his master’s degree and PhD in Physics at the University of Rochester, in New York.
JJ studies high energy density physics and focuses on using Bayesian techniques to extract information from large scale physics experiments with highly integrated measurements.
In his free time, he enjoys playing sports, including baseball, basketball, and golf.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin and Cameron Smith.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Therefore we think that in the future, workers in all the quantitative sciences will be obliged, as a matter of practical necessity, to use probability theory in the manner expounded here. This trend is already well under way in several fields, ranging from econometrics to astronomy to magnetic resonance spectroscopy; but to make progress in a new area it is necessary to develop a healthy disrespect for tradition and authority, which have retarded progress throughout the 20th century.
You wanna know something funny? A sentence from this episode became a meme. And people even made stickers out of it! Ok, that’s not true. But if someone could pull off something like that, it would surely be Chelsea Parlett-Pelleriti.
Indeed, Chelsea’s research focuses on using statistics and machine learning on behavioral data, but her more general goal is to empower people to be able to do their own statistical analyses, through consulting, education, and, as you may have seen, stats memes on Twitter.
A full-time teacher, researcher and statistical consultant, Chelsea earned an MSc and a PhD in Computational and Data Science in 2021 from Chapman University. Her courses include R, intro to programming (in Python), and data science.
In a nutshell, Chelsea is, by her own admission, an avid lover of anything silly or statistical. Hopefully, this episode turned out to be both at once! I’ll let you be the judge of that…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin and Philippe Labonde.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
As a podcaster, I discovered that there are guests for which the hardest is to know when to stop the conversation. They could talk for hours and that would make for at least 10 fantastic episodes. Frank Harrell is one of those guests. To me, our conversation was both fascinating — thanks to Frank’s expertise and the width and depth of topics we touched on — and frustrating — I still had a gazillion questions for him!
But rest assured, we talked about intent to treat and randomization, proportional odds, clinical trial design, biostats and Covid-19, and even which mistakes you should make to learn Bayes stats — yes, you heard that right, which mistakes. Anyway, I can’t tell you everything here — you’ll just have to listen to the episode!
A long time Bayesian, Frank is a Professor of Biostatistics in the School of Medicine at Vanderbilt University. His numerous research interests include predictive models and model validation, Bayesian clinical trial design and Bayesian models, drug development, and clinical research.
He holds a PhD in biostatistics from the University of North Carolina, and did his Bachelor’s in mathematics at the University of Alabama in Birmingham.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin and Philippe Labonde.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Paperpile: paperpile.com
Get 20% off until December 31st with promo code GOODBAYESIAN21
Bonjour my dear Bayesians! Yes, it was bound to happen one day — and this day has finally come. Here is the first-ever 100% French-speaking ‘Learn Bayes Stats’ episode! Who is to blame, you ask? Well, who better than Rémi Louf?
Rémi currently works as a senior data scientist at Ampersand, a big media marketing company in the US. He is the author and maintainer of several open source libraries, including MCX and BlackJAX. He holds a PhD in statistical physics, a Master’s in physics from the Ecole Normale Supérieure and a Master’s in philosophy from Oxford University.
I think I know what you’re wondering: how the hell do you go from physics to philosophy to Bayesian stats?? Glad you asked, as it was my first question to Rémi! He’ll also tell us why he created MCX and BlackJAX, what his main challenges are when working on open-source projects, and what the future of PPLs looks like to him.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin and Philippe Labonde.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Paperpile: paperpile.com
Get 20% off until December 31st with promo code GOODBAYESIAN21
I don’t know if you’ve heard, but there is a virus that took over most of the world in the past year? I haven’t dedicated any episode to Covid yet. First because research was moving a lot — and fast. And second because modeling Covid is very, very hard.
But we know more about it now, so I thought it was a good time to pause and ponder — how does the virus circulate? How can we model it and, ultimately, defeat it? What are the challenges in doing so?
To talk about that, I had the chance to host Michael Osthege and Thomas Vladeck, who were both part of the team that developed the Rt-live model, a Bayesian model to infer the reproductive rate of Covid-19 in the general population. As you’ll hear, modeling the evolution of this virus is challenging, fascinating, and a perfect fit for Bayesian modeling! It truly is a wonderful example of Bayesian generative modeling.
Tom is the Managing Director of Gradient Metrics, a quantitative market research firm, and a Co-Founder of Recast, a media mix model for modern brands.
Michael is a PhD student in laboratory automation and bioprocess optimization at the Forschungszentrum Jülich in Germany, and a fellow PyMC core developer. As he works a lot on the upcoming version 4, we’ll take this opportunity to talk about the current developments and where the project is headed.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode and Patrick Kelley.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Paperpile: paperpile.com
Get 20% off until December 31st with promo code GOODBAYESIAN21
We often talk about applying Bayesian statistics on this podcast. But how do we teach them? What’s the best way to introduce them from a young age and make sure the skills students learn in the stats class are transferable?
Well, lucky us, Mine Dogucu’s research tackles precisely those topics!
An Assistant Professor of Teaching in the Department of Statistics at the University of California, Irvine, Mine is both an educator with an interest in statistics, and an applied statistician with experience in educational research.
Her work focuses on modern pedagogical approaches in the statistics curriculum, making data science education more accessible. In particular, she teaches an undergraduate Bayesian course, and is the co-author of the upcoming book Bayes Rules! An Introduction to Bayesian Modeling with R.
In other words, Mine is not only interested in teaching, but also in how best to teach statistics – how to engage students in remote classes, how to get to know them, how to best record and edit remote courses, etc. She writes about these topics on her blog, DataPedagogy.com.
She also works on accessibility and inclusion, as well as a study that investigates how popular Bayesian courses are at the undergraduate level in the US — that should be fun to talk about!
Mine did her Master’s at Bogazici University in Istanbul, Turkey, and then her PhD in Quantitative Research, Evaluation, and Measurement at Ohio State University.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, John Johnson, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode and Patrick Kelley.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Let’s think Bayes, shall we? And who better to do that than the author of the well-known book Think Bayes — Allen Downey himself! Since the second edition was just released, the timing couldn’t be better!
Allen is a professor at Olin College and the author of books related to software and data science, including Think Python, Think Bayes, and Think Complexity. His blog, Probably Overthinking It, features articles on Bayesian probability and statistics. He holds a PhD from U.C. Berkeley, and Bachelor’s and Master’s degrees from MIT.
In this special episode, Allen and I talked about his background, how he came to the stats and teaching worlds, and why he wanted to write this book in the first place. He’ll tell us who this book is written for, what’s new in the second edition, and which mistakes his students most commonly make when starting to learn Bayesian stats. We also talked about some types of models, their usefulness and their weaknesses, but I’ll let you discover that.
Now for some more good news: 5 Patrons of the show will get Think Bayes for free! To qualify, you just need to go to the form I linked to in the 'Learn Bayes Stats' Slack channel or on the Patreon page and enter your email address. That’s it. After a week or so, Allen and I will choose 5 winners at random, who will receive the book for free!
If you’re not a Patron yet, make sure to check out patreon.com/learnbayesstats if you don’t want to miss out on these goodies!
And even if you’re not a Patron, I love you, dear listeners, so you all get a discount when you go buy the book at https://www.learnbayesstats.com/buy-think-bayes (unfortunately, this only applies to purchases in the US and Canada).
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt, Andrew Moskowitz, John Johnson and Hector Munoz.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
We all know about these accidental discoveries — penicillin, the heating power of microwaves, or the famous (and delicious) tarte tatin. I don’t know why, but I just love serendipity. And, as you’ll hear, this episode is deliciously full of it…
Thanks to Allison Hilger and Timo Roettger, we’ll discover the world of linguistics, how Bayesian stats are helpful there, and how Paul Bürkner’s BRMS package has been instrumental in this field. To my surprise — and perhaps yours — the speech and language sciences are pretty quantitative and computational!
As she recently discovered Bayesian stats, Allison will also tell us about the challenges she’s faced from advisors and reviewers during her PhD at Northwestern University, and the advice she’d have for people in the same situation.
Allison is now an Assistant Professor at the University of Colorado Boulder. The overall goal of her research is to improve our understanding of motor speech control processes, in order to inform effective speech therapy treatments for improved speech naturalness and intelligibility. Allison also worked clinically as a speech-language pathologist in Chicago for a year. As a new Colorado resident, her new hobbies include hiking, skiing, and biking — and then reading or going to dog parks when she’s too tired.
Holding a PhD in linguistics from the University of Cologne, Germany, Timo is an Associate Professor of linguistics at the University of Oslo, Norway. Timo tries to understand how people communicate their intentions using speech – how are speech signals retrieved; how do people learn and generalize? Timo is also committed to improving methodologies across the language sciences in light of the replication crisis, with a strong emphasis on open science.
Most importantly, Timo loves hiking, watching movies or, even better, watching people play video games!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt and Andrew Moskowitz.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Tidelift: tidelift.com
It’s been a while since we talked about biostatistics and bioinformatics on this podcast, so I thought it could be interesting to talk to Jacki Buros — and that was a very good idea!
She’ll walk us through examples of Bayesian models she uses to, for instance, work on biomarker discovery for cancer immunotherapies. She’ll also introduce you to survival models — their usefulness, their powers and their challenges.
Interestingly, all of this will highlight a handful of skills that Jacki would try to instill in her students if she had to teach Bayesian methods.
The Head of Data and Analytics at Generable, a state-of-the-art Bayesian platform for oncology clinical trials, Jacki has been working in biostatistics and bioinformatics for over 15 years. She started in cardiology research at the TIMI Study Group at Harvard Medical School before working in Alzheimer’s Disease genetics at Boston University and in biomarker discovery for cancer immunotherapies at the Hammer Lab. Most recently she was the Lead Biostatistician at the Institute for Next Generation Health Care at Mount Sinai.
An open-source enthusiast, Jacki is also a contributor to Stan and rstanarm, and the author of the survivalstan package, a library of Stan models for survival analysis.
Last but not least, Jacki is an avid sailor and skier!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Tim Radtke, Adam C. Smith, Will Kurt and Andrew Moskowitz.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Tidelift: tidelift.com
Imagine me rapping: "Let me show you how to be a good Bayesian. Change your predictions after taking information in, and if you’re thinking I’ll be less than amazing, let’s adjust those expectations!"
What?? Nah, you’re right, I’m not as good as Baba Brinkman. Actually, the best way to hear « Good Bayesian » performed live on the podcast would just be to invite him for an episode… Wait, isn’t that what I did???
Well indeed! For this episode, I had the great pleasure of hosting rap artist, science communicator and revered author of « Good Bayesian », Baba Brinkman!
We talked about his passion for oral poetry, his rap career, what being a good rapper means, and the difficulties he encountered in establishing himself as a proper rapper.
Baba began his rap career in 1998, freestyling and writing songs in his hometown of Vancouver, Canada.
In 2000 he started adapting Chaucer’s Canterbury Tales into original rap compositions, and in 2004 he premiered a one-man show based on his Master’s thesis, The Rap Canterbury Tales, exploring parallels between hip-hop music and medieval poetry.
Over the years, Baba went on to create “Rap Guides” dedicated to scientific topics, like evolution, consciousness, medicine, religion, and climate change – and I encourage you to give them all a listen!
By the way, do you know what rap and evolutionary biology have in common? Well, you’ll have to tune in for the answer… And make sure you listen until the end: Baba has a very, very nice surprise for you!
A little tip: if you wanna enjoy it to the fullest, I put the unedited video version of this interview in the show notes ;) By the way, let me know if you like these video live streams — I might just do them again if you do!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski and Tim Radtke.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Tidelift: tidelift.com
I don’t know about you, but the notion of time is really intriguing to me: it’s a purely artificial notion; we humans invented it — as an experiment, I asked my cat what time it was one day; needless to say it wasn’t very conclusive… And yet, the notion of time is so central to our lives — our work, leisure and projects depend on it.
So much so that time series predictions represent a big part of the statistics and machine learning world. And to talk about all that, who better than a time master, namely Sean Taylor?
Sean is a co-creator of the Prophet time series package, available in R and Python. He’s a social scientist and statistician specialized in methods for solving causal inference and business decision problems. Sean is particularly interested in building tools for practitioners working on real-world problems, and likes to hang out with people from many fields — computer scientists, economists, political scientists, statisticians, machine learning researchers, business school scholars — although I guess he does that remotely these days…
Currently head of the Rideshare Labs team at Lyft, Sean was a research scientist and manager on Facebook’s Core Data Science Team and did a PhD in information systems at NYU’s Stern School of Business. He did his undergraduate at the University of Pennsylvania, studying economics, finance, and information systems. Last but not least, he grew up in Philadelphia, so, of course, he’s a huge Eagles fan! For my non US listeners, we’re talking about the football team here, not the bird!
We also talked about two of my favorite topics — science communication and epistemology — so I had a lot of fun talking with Sean, and I hope you’ll deem this episode a good investment of your time!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen and Raul Maldonado.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Tidelift: tidelift.com
I bet you already heard of Bayesian nonparametric models, at least on this very podcast. We already talked about Dirichlet Processes with Karin Knudson on episode 4, and then about Gaussian Processes with Elizaveta Semenova on episode 21. Now we’re gonna dive into the mathematical properties of these objects, to understand them better — because, as you may know, Bayesian nonparametrics are quite powerful but also very hard to fit!
Along the way, you’ll learn about probabilistic circuits, sum-product networks and — what a delight — you’ll hear from the Julia community! Indeed, my guest for this episode is none other than… Martin Trapp!
Martin is a core developer of Turing.jl, an open-source framework for probabilistic programming in Julia, and a post-doc in probabilistic machine learning at Aalto University, Finland.
Martin loves working on sum-product networks and Bayesian non-parametrics. And indeed, his research interests focus on probabilistic models that exploit structural properties to allow efficient and exact computations while maintaining the capability to model complex relationships in data. In other words, Martin’s research is focused on tractable probabilistic models.
Martin did his MSc in computational intelligence at the Vienna University of Technology and just finished his PhD in machine learning at the Graz University of Technology. He doesn’t only like to study the tractability of probabilistic models — he is also very fond of climbing!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Jonathan Sedar, Hugo Botha, Vinh Nguyen and Raul Maldonado.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Tidelift: tidelift.com
One of the most common guest suggestions that you dear listeners make is… inviting Paul Bürkner on the show! Why? Because he’s a member of the Stan development team and he created BRMS, a popular R package to make and sample from Bayesian regression models using Stan. And, as I like you, I did invite Paul on the show and, well, that was a good call: we had an amazing conversation, spanning so many topics that I can’t list them all here!
I asked him why he created BRMS, in which fields it’s mostly used, what its weaknesses are, and which improvements to the package he’s currently working on. But that’s not all! Paul also gave his advice to people realizing that Bayesian methods would be useful to their research, but who fear facing challenges from advisors or reviewers.
Besides being a Bayesian rockstar, Paul is a statistician working as an Independent Junior Research Group Leader at the Cluster of Excellence SimTech at the University of Stuttgart, Germany. Previously, he studied Psychology and Mathematics at the Universities of Münster and Hagen, did his PhD in Münster on optimal design and Bayesian data analysis, and worked as a postdoctoral researcher at the Department of Computer Science at Aalto University, Finland.
So, of course, I asked him about the software-assisted Bayesian workflow that he’s currently working on with Aki Vehtari, which led us to no less than the future of probabilistic programming languages…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen and Jonathan Sedar.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
Episode sponsored by Tidelift: tidelift.com
We already mentioned multilevel regression and post-stratification (MRP, or Mister P) on this podcast, but we didn’t dedicate a full episode to explaining how it works, why it’s useful to deal with non-representative data, and what its limits are. Well, let’s do that now, shall we?
To that end, I had the delight to talk with Lauren Kennedy! Lauren is a lecturer in Business Analytics at Monash University in Melbourne, Australia, where she develops new statistical methods to analyze social science data. Working mainly with R and Stan, Lauren studies non-representative data, multilevel modeling, post-stratification, causal inference, and, more generally, how to make inferences from the social sciences.
Needless to say, I asked her everything I could about MRP, including how to choose priors, why her recent paper about structured priors can improve MRP, and when MRP is not useful. We also talked about missing data imputation, and how all these methods relate to causal inference in the social sciences.
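If you’d like a rough picture of the post-stratification step before listening, here is a deliberately minimal Python sketch — the cell-level predictions and census counts below are invented for illustration, and the multilevel regression itself (which Lauren would fit in R and Stan) is assumed to have been done already:

import numpy as np

# Hypothetical numbers, purely for illustration.
# Step 1 (not shown): fit a multilevel regression on the non-representative
# sample, yielding a predicted outcome for every demographic cell.
cell_predictions = np.array([0.62, 0.48, 0.55, 0.41])

# Step 2: census counts telling us how big each cell is in the target population.
cell_population = np.array([1200, 3400, 2100, 900])

# Step 3: post-stratify, i.e. reweight the cell predictions by population shares.
mrp_estimate = np.sum(cell_predictions * cell_population) / cell_population.sum()
print(f"Post-stratified estimate: {mrp_estimate:.3f}")

The whole trick is in that last weighted average: the multilevel model borrows strength across cells, and the census weights correct for the fact that the sample doesn’t look like the population.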
If you want a bit of background, Lauren did her undergraduate degrees in Psychological Sciences and in Maths and Computer Sciences at Adelaide University, with Danielle Navarro and Andrew Perfors, and then did her PhD with the same advisors. She spent 3 years in NYC with Andrew Gelman’s lab at Columbia University, and then moved back to Melbourne in 2020. Most importantly, Lauren is an avid crocheter — she’s already on her third blanket!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege and Rémi Louf.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
How do people choose their career? How do they change jobs? How do they even change careers? These are important questions that we don’t have great answers to. But structured data about the dynamics of labor markets are starting to emerge, and that’s what Ben Zweig is modeling at Revelio Labs.
An economist and data scientist, Ben is indeed the CEO of Revelio Labs, a data science company analyzing raw labor data contained in resumes, online profiles and job postings. In this episode, he’ll tell us about the Bayesian structural time series model they built to estimate inflows and outflows from companies, using LinkedIn data — a very challenging but fascinating endeavor, as you’ll hear!
Like a lot of people, Ben had always used more traditional statistical models but had been intrigued by Bayesian methods for a long time. When they started working on this Bayesian time series model, though, he had to learn a bunch of new methods really quickly. I think you’ll find it interesting to hear how it went…
Ben also teaches data science and econometrics at the NYU Stern School of Business, so he’ll reflect on his experience teaching Bayesian methods to economics students. Prior to that, Ben did a PhD in economics at the City University of New York, and has done research in occupational transformation and social mobility.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege and Rémi Louf.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
When explaining Bayesian statistics to people who don’t know anything about stats, I often say that MCMC is about walking many different paths in lots of parallel universes, and then counting what happened in all these universes.
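If you want to see that “walk and count” idea in its most bare-bones form, here is a tiny random-walk Metropolis sampler for a standard normal target — a purely illustrative sketch, nothing like the models the guests actually run:

import numpy as np

rng = np.random.default_rng(42)

def log_target(x):
    # Log-density of a standard normal, up to an additive constant.
    return -0.5 * x ** 2

samples = []
x = 0.0
for _ in range(10_000):
    proposal = x + rng.normal(scale=1.0)     # wander to a nearby point
    log_ratio = log_target(proposal) - log_target(x)
    if np.log(rng.uniform()) < log_ratio:    # accept the move...
        x = proposal                         # ...otherwise stay put
    samples.append(x)                        # record what happened

# The visited points approximate the target distribution: mean ~ 0, sd ~ 1.
print(np.mean(samples), np.std(samples))

The long list of visited points plays the role of the “universes” you count at the end — their histogram is the distribution you were after.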
And in a sense, this whole podcast is dedicated to sampling the whole distribution of Bayesian practitioners. So, for this episode, I thought we’d take a break from pure, hard modeling and talk about how to get involved in Bayesian statistics and open-source development, how companies use Bayesian tools, and what common struggles and misconceptions those companies face.
Quite the program, right? The good news is that Peadar Coyle, my guest for this episode, has done all of that! Coming to us from Armagh, Ireland, Peadar is a fellow PyMC core developer and was a data science and data engineering consultant until recently – a period during which he covered all of modern startup data science, from AB testing to dashboards to data engineering to putting models into production.
From these experiences, Peadar has written a book consisting of numerous interviews with data scientists throughout the world – and do consider buying it, as money goes to the NumFOCUS organization, under which many Bayesian stats packages live, like Stan, ArviZ, PyMC, etc.
Now living in London, Peadar recently founded the start-up Aflorithmic, an AI company that aims to develop personalized voice-first solutions for brands and enterprises. Their technology is also used to support children, families and the elderly coping with the mental health challenges of COVID-19 confinements.
Before all that, Peadar studied physics, philosophy and mathematics at the universities of Bristol and Luxembourg. When he’s away from keyboard, he enjoys the outdoors, cooking and, of course, watching rugby!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll and Nathaniel Burbank.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
I don’t know if you noticed, but I have a fondness for any topic related to decision-making under uncertainty — when it’s studied scientifically of course. Understanding how and why people make decisions when they don’t have all the facts is fascinating to me. That’s why I like electoral forecasting and I love cognitive sciences.
So, for the first episode of 2021, I have a special treat: I had the great pleasure of hosting Michael Lee on the podcast! Yes, the Michael Lee who co-authored the book Bayesian Cognitive Modeling with Eric-Jan Wagenmakers in 2013 — by the way, the book was ported to PyMC3, I put the link in the show notes ;)
This book was inspired by Michael’s work as a professor of cognitive sciences at the University of California, Irvine. He works a lot on representation, memory, learning, and decision making, with a special focus on individual differences and collective cognition.
Using naturally occurring behavioral data, he builds probabilistic generative models to try and answer hard real-world questions: how does memory impairment work (that’s modeled with multinomial processing trees)? How complex are simple decisions, and how do people change strategies?
Echoing episode 18 with Daniel Lakens, Michael and I also talked about the reproducibility crisis: how are the cognitive sciences doing, what progress has been made, and what is still left to do?
Living now in California, Michael is originally from Australia, where he did his Bachelor’s in Psychology and Mathematics, and his PhD in psychology. But Michael is also fond of the city of Amsterdam, which he sees as “the perfect antidote to southern California with old buildings, public transport, great bread and beer, and crappy weather”.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll and Nathaniel Burbank.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
It’s funny how powerful symbols are, right? The Eiffel Tower makes you think of Paris, the Statue of Liberty is New York, and the Trevi Fountain… is Rome of course! Just with one symbol, you can invoke multiple concepts and ideas.
You probably know that symbols are omnipresent in mathematics — but did you know that they are also very important in statistics, especially probabilistic programming?
Rest assured, I didn’t really know either… until I talked with Brandon Willard! Brandon is indeed a big proponent of relational programming and symbolic computation, and he often promotes their use in research and industry. Actually, a few weeks after our recording, Brandon started spearheading the revival of Theano through the JAX backend that we’re currently working on for the future version of PyMC3!
You guessed it: Brandon is a core developer of PyMC, and also a contributor to Airflow and IPython, just to name a few. His interests revolve around the means and methods of mathematical modeling and its automation. In a nutshell, he’s a Bayesian statistician: he likes to use the language and logic of probability to quantify uncertainty and frame problems.
After a Bachelor’s in physics and mathematics, Brandon got a Master’s degree in statistics from the University of Chicago. He’s worked in different areas in his career – from finance, transportation and energy to start-ups, gov-tech and academia. Brandon particularly loves projects where popular statistical libraries are inadequate, where sophisticated models must be combined in non-trivial ways, or when you have to deal with high-dimensional and discrete processes.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho and Colin Carroll.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
I’ll be honest here: I had a hard time summarizing this episode for you, and, let’s face it, it’s all my guest’s fault! Why? Because Aki Vehtari works on so many interesting projects that it’s hard to sum them all up, even more so because he was very generous with his time for this episode! But let’s try anyway, shall we?
So, Aki is an Associate Professor in computational probabilistic modeling at Aalto University, Finland. You already heard his delightful Finnish accent on episode 20, with Andrew Gelman and Jennifer Hill, talking about their latest book, « Regression and Other Stories ». He is also a co-author of the popular, award-winning book « Bayesian Data Analysis », Third Edition, and a core developer of the seminal probabilistic programming framework Stan.
An enthusiast of open-source software, Aki is a core contributor to the ArviZ package and has been involved in many free software projects such as GPstuff for Gaussian processes and ELFI for likelihood-free inference.
His numerous research interests include Bayesian probability theory and methodology, especially model assessment and selection, non-parametric models (such as Gaussian processes), feature selection, dynamic models, and hierarchical models.
We talked about all that — and more — on this episode, in the context of his teaching at Aalto and the software-assisted Bayesian workflow he’s currently working on with a group of researchers.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho and Colin Carroll.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
In times of crisis, designing an efficient policy response is paramount. In the case of natural disasters or pandemics, it can even make the difference between life and death for a substantial number of people. But precisely, how do you design such policy responses, making sure that risks are optimally shared, people feel safe enough to reveal necessary information, and stakeholders commit to the policies?
That’s where a field of economics, industrial organization (IO), can help, as Shosh Vasserman will tell us in this episode. Shosh is an assistant professor of economics at the Stanford Graduate School of Business. Specialized in industrial organization, her interests span a number of policy settings, such as public procurement, pharmaceutical pricing and auto-insurance.
Her work leverages theory, empirics and modern computation (including the Stan software!) to better understand the equilibrium implications of policies and proposals involving information revelation, risk sharing and commitment.
In short, Shoshana uses theory and data to study how risk, commitment and information flows interplay with policy design. And she does a lot of this with… Bayesian models! Who said Bayes had no place in economics?
Prior to Stanford, Shoshana did her Bachelor’s in mathematics and economics at MIT, and then her PhD in economics at Harvard University.
This was a fascinating conversation where I learned a lot about Bayesian inference on large scale random utility logit models, socioeconomic network heterogeneity and pandemic policy response — and I’m sure you will too!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
In a few days, a consequential election will take place, as citizens of the United States go to the polls and elect their president — in fact, they have already started voting. You probably know a few forecasting models that try to predict what will happen on Election Day — who will get elected, by how much and with which coalition of States?
But how do these statistical models work? How do you account for the different sources of uncertainty, be it polling errors, unexpected turnout or media events? How do you model covariation between States? How do you even communicate the model’s results and afterwards assess its performance? To talk about all this, I had the pleasure to talk to Andrew Gelman and Merlin Heidemanns.
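To give just one flavor of the problem — how correlated polling errors across states can be represented — here is a toy Python sketch; the correlation values are invented and are not taken from The Economist model:

import numpy as np

rng = np.random.default_rng(0)

# Invented correlation structure between three states: the first two are
# demographically similar, the third less so.
corr = np.array([
    [1.0, 0.8, 0.3],
    [0.8, 1.0, 0.3],
    [0.3, 0.3, 1.0],
])
scale = 0.03                    # roughly three points of polling error per state
cov = scale ** 2 * corr

# Draw many joint error scenarios: a miss in state 1 tends to come with a
# miss in state 2, which is what makes state outcomes move together.
errors = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=10_000)
print(np.corrcoef(errors, rowvar=False).round(2))

A real forecasting model embeds this kind of covariance inside a full generative model of polls and fundamentals, but the basic idea — shared, correlated uncertainty across states — is the same.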
Andrew was already on episode 20, to talk about his recent book with Jennifer Hill and Aki Vehtari, “Regression and Other Stories”. He’s a professor of statistics and political science at Columbia University and works on a lot of topics, including: why campaign polls are so variable while elections are so predictable, the statistical challenges of estimating small effects, and methods for surveys and experimental design.
Merlin is a PhD student in Political Science at Columbia University, and he specializes in political methodology. Prior to his PhD, he did a Bachelor's in Political Science at the Freie Universität Berlin.
I hope you’ll enjoy this episode where we dove into the Bayesian model they helped develop for The Economist, and talked more generally about how to forecast elections with statistical methods, and even about the incentives the forecasting industry has as a whole.
Thank you to my Patrons for making this episode possible! Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
I don’t know about you, but I’m starting to really miss traveling and just talking to people without having to think about masks, social distance and activating the covid tracking app on my phone. In the coming days, there is one event that, granted, won’t make all of that disappear, but will remind me how enriching it is to meet new people — this event is PyMCon, the first-ever conference about the PyMC ecosystem! To talk about the conference format, goals and program, I had the pleasure to host Ravin Kumar and Quan Nguyen on the show.
Quan is a PhD student in computer science at Washington University in St Louis, USA, researching Bayesian machine learning and one of the PyMCon program committee chairs. He is also the author of several programming books on Python and scientific computing.
Ravin is a core contributor to ArviZ and PyMC, and is leading the PyMCon conference. He holds a Bachelor’s in Mechanical Engineering and a Master’s in Manufacturing Engineering. As a Principal Data Scientist, he has used Bayesian statistics to characterize and aid decision making at organizations like SpaceX and Sweetgreen. Ravin is also currently co-authoring a book on Bayesian statistics with Ari Hartikainen, Osvaldo Martin, and Junpeng Lao, due for release in February.
We talked about why they became involved in the conference, parsed through the numerous, amazing talks that are planned, and detailed who the keynote speakers will be… So, if you’re interested, the link to register is in the show notes, and there are even two ways to get a free ticket: either by applying to a diversity scholarship, or by being a community partner, which is anyone or any organization working towards diversity and inclusion in tech — all links are in the show notes.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Have you watched the series « The English Game », on Netflix? Well, I think you should — it’s a fascinating dive into how football went from an aristocratic to a popular sport in late 19th-century England. Today it is so popular that doing statistics on the game and its players has become a valuable business!
To talk about that, I invited Kevin Minkus on the show — he’s a data scientist and soccer fan living in Philadelphia. Kevin’s currently working at Monetate on ecommerce problems, and prior to Monetate he worked on property and casualty insurance pricing.
He spends a lot of his spare time working on problems in football analytics and is a contributor at American Soccer Analysis, a website and podcast dedicated to… football made or played in the US (or “soccer”, as they say over there). Kevin is responsible for some of their data management and devops, and he recently wrote a guide to football analytics for the Major League Soccer’s website, entitled « Soccer Analytics 101 ».
To be honest, I had a great time talking for one hour about two of my passions — football and stats! Soooo, maybe 2020 isn’t that bad after all… Ow, and beyond football, Kevin is also into the digital humanities, web development, 3D animation, machine learning, and… the bassoon!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Do you know what proteins are, what they do and why they are useful? Well, be prepared to be amazed! In this episode, Seth Axen will tell us about the fascinating world of protein structures and computational biology, and how his work as a Bayesian modeler fits into all that!
Passionate about mathematics and statistics, Seth is finishing a PhD in bioinformatics at the Sali Lab of the University of California, San Francisco (UCSF). His research interests span the broad field of computational biology: using computer science, mathematics, and statistics to understand biological systems. His current research focuses on inferring protein structural ensembles.
Open source development is also very dear to his heart, and indeed he contributes to many open source packages, especially in the Julia ecosystem. In particular, he develops and maintains ArviZ.jl, the Julia port of ArviZ, a platform-agnostic Python package to visualize and diagnose your Bayesian models. Seth will tell us how he became involved in ArviZ.jl, what its strengths and weaknesses are, and how it fits into the Julia probabilistic programming landscape.
Ow, and as a bonus, you’ll discover why Seth is such a fan of automatic differentiation, aka « autodiff » — I actually wanted to edit this part out, but Seth strongly insisted I keep it. Just kidding of course — or, am I… ?
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
If you’ve studied at a business school, you probably didn’t attend any Bayesian stats course there. Well, it isn’t like that in every business school! Elea McDonnell Feit does integrate Bayesian methods into her teaching at the business school of Drexel University, in Philadelphia, US.
Elea is an Assistant Professor of Marketing at Drexel, and in this episode she’ll tell us which methods are the most useful in marketing analytics, and why.
Indeed, Elea develops data analysis methods to inform marketing decisions, such as designing new products and planning advertising campaigns. Often faced with missing, unmatched or aggregated data, she uses MCMC sampling, hierarchical models and decision theory to decipher all this.
After an MS in Industrial Engineering at Lehigh University and a PhD in Marketing at the University of Michigan, Elea worked on product design at General Motors and was most recently the Executive Director of the Wharton Customer Analytics Initiative.
Thanks to all these experiences, Elea loves teaching marketing analytics and Bayesian and causal inference at all levels. She even wrote the book R for Marketing Research and Analytics with Chris Chapman, at Springer Press.
In summary, I think you’ll be pretty surprised by how Bayesian the world of marketing is…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
If, like me, you’ve been stuck in a 40 square-meter apartment for two months, you’re going to be pretty jealous of Avi Bryant. Indeed, Avi lives on Galiano Island, Canada, not very far from Vancouver, surrounded by forest, overlooking the Salish Sea.
In this natural and beautiful — although slightly deer-infested — spot, Avi runs The Gradient Retreat Center, a place where writers, makers, and code writers can take a week away from their regular lives and focus on creative work. But I didn’t invite Avi on the show just to envy him — I invited him to talk about Bayesian inference in Scala, prior elicitation, how to deploy Bayesian methods at scale, and how to enable Bayesian inference for engineers.
While working at Stripe, Avi wrote Rainier, a Bayesian inference framework for Scala. Inference is based on variants of the Hamiltonian Monte Carlo sampler, and the implementation is similar to, and targets the same types of models as both Stan and PyMC3. As Avi says, depending on your background, you might think of Rainier as aspiring to be either "Stan, but on the JVM", or "TensorFlow, but for small data".
In this episode, Avi will tell us how Rainier came to life, how it fits into the probabilistic programming landscape, and what its main strengths and weaknesses are.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
I bet you’ve heard a lot about epidemiological compartmental models such as SIR in the last few months. But what are they exactly? And why are they so useful for epidemiological modeling?
Elizaveta Semenova will tell you why in this episode, by walking us through the case study she recently wrote with the Stan team. She’ll also tell us how she used Gaussian processes on spatio-temporal data to study the spread of malaria, and to fit dose-response curves in pharmaceutical tests.
And finally, she’ll tell us how she used Bayesian neural networks for drug toxicity prediction in her latest paper, and how Bayesian neural nets behave compared to classical neural nets. Oh, and you’ll also learn about an interesting link between BNNs and Gaussian processes…
I know: Liza works on _a lot_ of projects! But who is she? Well, she’s a postdoctoral researcher in Bayesian machine learning at the pharmaceutical company AstraZeneca, in Cambridge, UK.
Elizaveta did her master’s in theoretical mathematics in Moscow, Russia, and then worked in financial services as an actuary in various European countries. She then did a PhD in epidemiology at the University of Basel, Switzerland. This is where she got interested in health applications – be it epidemiology, global health or more small-scale biological questions. But she’ll tell you all that in the episode ;)
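And if you’d like a quick refresher on what an SIR model actually is before listening, here is a minimal sketch of one, simulated with SciPy. All the numbers (population size, transmission rate beta, recovery rate gamma) are invented for illustration:

```python
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    """Classic SIR dynamics: Susceptible -> Infected -> Recovered."""
    S, I, R = y
    N = S + I + R
    new_infections = beta * S * I / N
    recoveries = gamma * I
    return [-new_infections, new_infections - recoveries, recoveries]

# Invented numbers: 1,000 people, 1 initial case, 160 days
y0 = [999.0, 1.0, 0.0]
t = np.linspace(0, 160, 161)
beta, gamma = 0.3, 0.1   # transmission and recovery rates (made up)

S, I, R = odeint(sir, y0, t, args=(beta, gamma)).T
print(f"Peak infections: {I.max():.0f} around day {t[I.argmax()]:.0f}")
```

In a Bayesian treatment, beta and gamma get priors and are inferred from observed case counts instead of being fixed by hand.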
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Once upon a time, there was an enchanted book filled with hundreds of little plots, applied examples and linear regressions — the prettiest creature that was ever seen. Its authors were excessively fond of it, and its readers loved it even more. This magical book had a nice blue cover made for it, and everybody aptly called it « Regression and other Stories »!
Like every good fairy tale, this one had its share of villains — the traps where statistical methods fall and fail you; the terrible confounders, lurking in the dark; the ill-measured data that haunt your inferences! But once you defeat these monsters, you’ll be able to think about, build and interpret regression models.
This episode will be filled with stories — stories about linear regressions! Here to narrate these marvelous statistical adventures are Andrew Gelman, Jennifer Hill and Aki Vehtari — the authors of the brand new Regression and other Stories.
Andrew is a professor of statistics and political science at Columbia University. Jennifer is a professor of applied statistics at NYU. She develops methods to answer causal questions related to policy research and scientific development. Aki is an associate professor in computational probabilistic modeling at Aalto University, Finland.
In this episode, they tell us why they wrote this book and who it is for, and they also give us their 10 tips to improve your regression modeling! We also talk about the limits of regression and about going to Mars…
Other good news: until October 31st 2020, you can go to http://www.cambridge.org/wm-ecommerce-web/academic/landingPage/GoodBayesian2020 and buy the book with a 20% discount by entering the promo code “GoodBayesian2020” upon checkout!
That way, you’ll make up your own stories before going to sleep and dream of a world where we can easily generalize from sample to population, and where multilevel regression with poststratification is bliss…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Do you know Turing? Of course you do! With Soss and Gen, it’s one of the blockbusters of probabilistic programming in Julia. And in this episode Cameron Pfiffer will tell us all about it — how it came to life, how it fits into the probabilistic programming landscape, and what its main strengths and weaknesses are.
Cameron did some Rust, some Python, but he especially loves coding in Julia. That’s also why he’s one of the core-developers of Turing.jl. He’s also a PhD student in finance at the University of Oregon and did his master’s in finance at the University of Reading. His interests are pretty broad, from cryptocurrencies, algorithmic and high-frequency trading, to AI in financial markets and anomaly detection – in a nutshell he’s a fan of topics where technology is involved.
As he’s the first economist to come to the show, I also asked him how Bayesian the field of economics is, why he thinks economics is quite unique among the social sciences, and how economists think about causality — I later learned that this topic is pretty controversial!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
I hope you’re all safe! Some of you also asked me if I had set up a Patreon so that you could help support the show, and that’s why I’m sending this short special episode your way today. I had thought about that, but I wasn’t sure there was a demand for it. Apparently, there is one — at least a small one — so, first, I wanna thank you and say how grateful I am to be in a community that values this kind of work!
The Patreon page is now live at patreon.com/learnbayesstats. It starts as low as 3€ and you can pick from 4 different tiers:
Before telling you the best part: I already have a lot of ideas for exclusive content and options. I first need to see whether you’re as excited as I am about them. If I see you are, I’ll be able to add new perks to the tiers! So give me your feedback about the current tiers, or about any benefits you’d like to see there but don’t see yet! BTW, you now have a new way to do that: sending me voice messages at anchor.fm/learn-bayes-stats/message!
Now, the icing on the cake: until July 31st, if you choose the "Full Posterior" tier ($5) or higher, you get early access to the very special episode I’m planning with Andrew Gelman, Jennifer Hill and Aki Vehtari about their upcoming book, "Regression and Other Stories". To top it off, there will be a promo code in the episode to buy the book at a discounted price — now, that is an offer you can’t turn down!
Alright, that is it for today — I hope you’re as excited as I am for this new stage in the podcast’s life! Please keep the emails, the tweets, the voice messages, the carrier pigeons coming with your feedback, questions and suggestions.
In the meantime, take care and I’ll see you in the next episode — episode 19, with Cameron Pfiffer, who’s the first economist to come on the show and who’s a core-developer of Turing.jl. We’re gonna talk about the Julia probabilistic programming landscape, Bayes in economics and causality — it’s gonna be fun ;)
Again, patreon.com/learnbayesstats if you want to support the show and unlock some nice perks. Thanks again, I am very grateful for any support you can bring me!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
---
Send in a voice message: https://anchor.fm/learn-bayes-stats/message
How do you design a good experimental study? How do you even know that you’re asking a good research question? Moreover, how can you align funding and publishing incentives with the principles of open science?
Let’s do another “big picture” episode to try and answer these questions! You know, these episodes that I want to do from time to time, with people who are not from the Bayesian world, to see what good practices there are out there. The first one, episode 15, was focused on programming and python, thanks to Michael Kennedy.
In this one, you’ll meet Daniel Lakens. Daniel is an experimental psychologist at the Human-Technology Interaction group at Eindhoven University of Technology, in the Netherlands. He’s worked there since 2010, when he received his PhD in social psychology.
His research focuses on how to design and interpret studies, applied meta-statistics, and reward structures in science. Daniel loves teaching about research methods and about how to ask good research questions. He even crafted free Coursera courses about these topics.
A fervent advocate of open science, he prioritizes review requests for scholarly articles based on how much they adhere to Open Science principles. On his blog, he describes himself as ‘the 20% Statistician’. Why? Well, he’ll tell you in the episode…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Have you ever encountered a model that you know is scientifically sound, but that just wouldn’t run with MCMC? The model would take forever to run — if it ever ran — and you would be greeted with a lot of divergences in the end. Yeah, I know, my stress levels start rising too whenever I hear the word « divergences »…
Well, you’ll be glad to hear there are tricks to make these models run, and one of these tricks is called re-parametrization — I bet you’ve already heard about the poorly named non-centered parametrization?
Well fear no more! In this episode, Maria Gorinova will tell you all about these model re-parametrizations! Maria is a PhD student in Data Science & AI at the University of Edinburgh. Her broad interests range from programming languages and verification, to machine learning and human-computer interaction.
More specifically, Maria is interested in probabilistic programming languages, and in exploring ways of applying program-analysis techniques to existing PPLs in order to improve usability of the language or efficiency of inference.
As you’ll hear in the episode, she thinks a lot about the language aspect of probabilistic programming, and works on the automation of various “tricks” in probabilistic programming: automatic re-parametrization, automatic marginalization, automatic and efficient model-specific inference.
As Maria also has experience with several PPLs like Stan, Edward2 and TensorFlow Probability, she’ll tell us what she thinks a good PPL design requires, and what the future of PPLs looks like to her.
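For the curious, here is roughly what the non-centered trick looks like in practice, written as a minimal PyMC3-style sketch (the variable names and the eight groups are made up; the idea is simply to sample a standardized offset and scale it, instead of sampling the group effects directly):

```python
import pymc3 as pm

with pm.Model() as centered:
    mu = pm.Normal("mu", 0.0, 5.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    # Centered: group effects sampled directly -- prone to divergences
    # when sigma gets small (the infamous "funnel" geometry).
    theta = pm.Normal("theta", mu=mu, sigma=sigma, shape=8)

with pm.Model() as non_centered:
    mu = pm.Normal("mu", 0.0, 5.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    # Non-centered: sample a standardized offset, then shift and scale it.
    # Mathematically the same model, but much friendlier for HMC/NUTS.
    theta_offset = pm.Normal("theta_offset", 0.0, 1.0, shape=8)
    theta = pm.Deterministic("theta", mu + sigma * theta_offset)
```

Same posterior, different parametrization: that is exactly the kind of rewriting Maria wants compilers to do for you automatically.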
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
A librarian, a philosopher and a statistician walk into a bar — and they can’t find anybody to talk to; nobody seems to understand what they are talking about. Nobody? No! There is someone, and this someone is Will Kurt!
Will Kurt is the author of ‘Bayesian Statistics the Fun Way’ and ‘Get Programming With Haskell’. Currently the lead Data Scientist for the pricing and recommendations team at Hopper, he also blogs about stats and probability at countbayesie.com.
In this episode, he’ll tell us how a Boston librarian can become a data scientist and work with Bayesian models every day. He’ll also explain the value of Bayesian inference from a philosophical standpoint, why it’s useful in the travel industry, and how his latest book came to life.
Finally, Will is also a big fan of the “mind projection fallacy”, an informal fallacy first described by physicist and Bayesian philosopher Edwin Thompson Jaynes. Does that intrigue you? Well, stay tuned, he’ll tell us more in the episode…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
This is it folks! This is the first of the special episodes I want to do from time to time, to expand our perspective and get inspired by what’s going on elsewhere. The guests will not come directly from the Bayesian world, but will still be related to science or programming.
For the first episode of the kind, I had the chance to chat with Michael Kennedy! Michael is not only a very knowledgeable and respected member of the Python community, he’s also the founder and host of Talk Python To Me, the most popular Python podcast. He’s the founder and chief author at Talk Python Training, where he develops many Python developer online courses.
And before that, Michael was a professional software trainer for over 10 years – he has taught numerous developers throughout the world! But Michael is not only an entrepreneur and teacher – he’s also a father, a husband, and a proud inhabitant of Portland, OR!
As you’ll hear, our conversation spanned a large array of topics — the role of Python in science and research; how it came to be so important in data science, and why; what Python’s threats and weaknesses are, and how it should evolve so it doesn’t become obsolete. Michael also has interesting thoughts on the role of programming in education and how it relates to geometry — but I’ll let you discover that one by yourself…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
I bet you love penguins, right? The same goes for koalas, or puppies! But what about sharks? Well, my next guest loves sharks — she loves them so much that she works a lot with marine biologists, even though she’s a statistician!
Vianey Leos Barajas is indeed a statistician primarily working in the areas of statistical ecology, time series modeling, Bayesian inference and spatial modeling of environmental data. Vianey did her PhD in statistics at Iowa State University and is now a postdoctoral researcher at North Carolina State University.
In this episode, she’ll tell us what she’s working on that involves sharks, sheep and other animals! Trying to model animal movements, Vianey often encounters the dreaded multimodal posteriors. She’ll explain why these can be very tricky to estimate, and why ecological data are particularly suited for hidden Markov models and spatio-temporal models — don’t worry, Vianey will explain what these models are in the episode!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Book recommendations:
How is Julia doing? I’m talking about the programming language, of course! What does the probabilistic programming landscape in Julia look like? What are Julia’s distinctive features, and when would it be interesting to use it?
To talk about that, I invited Chad Scherrer. Chad is a Senior Research Scientist at RelationalAI, a company that uses Artificial Intelligence technologies to solve business problems.
Coming from a mathematics background, Chad did his PhD at Indiana University Bloomington and has been working in statistics and data science for a decade now. Through this experience, he’s been using and developing probabilistic programming languages – so he’s familiar with Python, R, PyMC, Stan and all the blockbusters of the field.
But since 2018, he’s particularly interested in Julia and developed Soss, an open-source lightweight probabilistic programming package for Julia. In this episode, he’ll tell us why he decided to create this package, and which choices he made that made Soss what it is today. But we’ll also talk about other projects in Julia, like Turing or Gen for instance.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Do you know Google Summer of Code? It’s a time of year when students can contribute to open-source software by developing and adding much-needed functionality to the open-source package of their choice. And Demetri Pananos did just that.
He did it in 2019 with PyMC3, for which he developed the API for ordinary differential equations. In this episode, he’ll tell us why and how he did that, what he learned from the experience, and what the strengths and weaknesses of the API are in his opinion.
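To give you an idea of what that API looks like, here is a minimal sketch using pymc3.ode.DifferentialEquation on a toy exponential-decay model. The decay model, the data and every number below are invented for illustration, and the sketch follows my reading of the PyMC3 docs rather than Demetri’s own examples:

```python
import numpy as np
import pymc3 as pm
from pymc3.ode import DifferentialEquation

# Toy data: noisy exponential decay (completely made up)
times = np.arange(0.5, 8, 0.5)
y_obs = 10 * np.exp(-0.4 * times) + np.random.normal(0, 0.3, times.size)

def decay(y, t, p):
    # dy/dt = -lambda * y, where p[0] plays the role of lambda
    return [-p[0] * y[0]]

ode_model = DifferentialEquation(
    func=decay, times=times, n_states=1, n_theta=1, t0=0
)

with pm.Model():
    lam = pm.HalfNormal("lam", 1.0)
    y0 = pm.HalfNormal("y0", 10.0)
    solution = ode_model(y0=[y0], theta=[lam])   # ODE solution as a symbolic tensor
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", mu=solution[:, 0], sigma=sigma, observed=y_obs)
    # trace = pm.sample()  # gradients flow through the ODE solver, so NUTS just works
```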
Demetri is a PhD candidate in Biostatistics at Western University, in Ontario, Canada. His research interests center on machine learning and Bayesian statistics for personalized medicine. He earned his Master’s in Applied Mathematics from the University of Waterloo and is a firm believer in open science, interdisciplinary collaboration, and reproducible research.
Other than that, he loves plotting data and drinking IPA beer – well, who doesn’t?
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
I bet you’ve already heard about hierarchical models, or multilevel models, or varying-effects models — yeah, this type of model has a lot of names! Many people even turn to Bayesian tools to build _exactly_ these models. But what are they? How do you build and use a hierarchical model? What are the tricks and classical traps? And even more important: how do you _interpret_ a hierarchical model?
In this episode, Thomas Wiecki will come to the rescue and explain what multilevel models are, how to build them, what their powers are… but also why you should be very careful when building them…
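If you’d like a concrete picture before pressing play, here is a minimal sketch of a varying-intercepts (hierarchical) model in PyMC3. The data, group structure and variable names are all made up: each group gets its own intercept, but all intercepts are shrunk towards a common mean.

```python
import numpy as np
import pymc3 as pm

# Made-up example: 100 measurements belonging to 5 groups
group_idx = np.repeat(np.arange(5), 20)        # which group each point belongs to
y = np.random.normal(group_idx * 0.5, 1.0)     # fake observations

with pm.Model() as hierarchical:
    # Population-level ("hyper") parameters, shared across groups
    mu_a = pm.Normal("mu_a", 0.0, 5.0)
    sigma_a = pm.HalfNormal("sigma_a", 2.0)

    # Group-level intercepts: partially pooled towards mu_a
    a = pm.Normal("a", mu=mu_a, sigma=sigma_a, shape=5)

    sigma = pm.HalfNormal("sigma", 2.0)
    pm.Normal("obs", mu=a[group_idx], sigma=sigma, observed=y)

    trace = pm.sample()
```

The amount of pooling is governed by sigma_a, which is itself estimated from the data.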
Does the name Thomas Wiecki ring a bell? Probably because he’s the host and creator of the PyData Deep Dive Podcast, where he interviews open-source contributors from the Python and Data Science worlds! Thomas is also the VP of Data Science at Quantopian, a crowd-sourced quantitative investment firm that encourages people everywhere to write investment algorithms.
Finally, Thomas is a longtime Bayesian and core-developer of PyMC3, a fantastic package for probabilistic programming in Python. On his blog, he publishes tutorial articles and explores new ideas such as Bayesian Deep Learning. Caring a lot about open-source software sustainability, he puts everything he’s up to on his Patreon page, which you’ll find in the show notes.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
How do you handle your MCMC samples once your Bayesian model has fit properly? Which diagnostics do you check to see if there was a computational problem? And isn’t it nice when you have beautiful and reliable plots to complement your analysis and better understand your model?
I know what you’re thinking: plotting can be long and complicated in these cases. Well, not with ArviZ, a platform-agnostic package to do exploratory analysis of your Bayesian models. And in this episode, Ari Hartikainen will tell you why.
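As a small taste of what that looks like, here is a minimal sketch of a typical post-sampling check with ArviZ, assuming `idata` is the InferenceData object returned by your sampler (PyMC, PyStan, or anything else ArviZ can convert):

```python
import arviz as az

# idata is assumed to be an az.InferenceData object produced by your sampler
print(az.summary(idata))     # means, credible intervals, effective sample size, r_hat
az.plot_trace(idata)         # quick visual check of mixing, chain by chain
az.plot_energy(idata)        # HMC-specific diagnostic of the sampler's behavior
az.plot_posterior(idata)     # marginal posteriors with highest-density intervals
```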
Ari is a data scientist in geophysics and a researcher at the Department of Civil Engineering of Aalto University in Finland. He mainly works on geophysics, Bayesian statistics and visualization.
Ari’s also a prolific open-source contributor, as he’s a core-developer of the popular Stan and ArviZ libraries. He’ll tell us how PyStan interacts with ArviZ, what he thinks ArviZ’s most useful features are, and which common difficulties he encounters with his models and data.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Have you always wondered what dark matter is? Can we even see it — let alone measure it? And what would discovering it imply for our understanding of the Universe?
In this episode, we’ll take a look at the cosmos with Maggie Lieu. She’ll tell us what research in astrophysics is made of, what model she worked on at the European Space Agency, and how Bayesian the world of space science is.
Maggie Lieu did her PhD in the Astronomy & Space Department of the University of Birmingham. She’s now a Research Fellow of Machine Learning & Cosmology at the University of Nottingham and is working on projects in preparation for Euclid, a space-based telescope whose goal is to map the dark Universe and help us learn about the nature of dark matter and dark energy.
In a nutshell, she tries to help us better understand the entire cosmos. Even more amazing, she uses the Stan library and applies Bayesian statistical methods to decipher her astronomical data! But Maggie is not just a Bayesian astrophysicist: she also loves photography and rock-climbing!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
What is it like using Bayesian tools when you’re a software engineer or computer scientist? How do you apply these tools in the online ad industry?
More generally, what is Bayesian thinking, philosophically? And is it really useful in everyday life? Because, well, you can’t fire up MCMC each time you need to make a quick decision under uncertainty… So how do you do that in practice, when you have at most a pen and paper?
In this episode, you’ll hear Max Sklar’s take on these questions. Max is a software engineer with a focus on machine learning and Bayesian inference. Now working at Foursquare’s innovation lab, he recently led the development of a causality model for Foursquare’s Ad Attribution product and taught a course on Bayesian Thinking at the Lviv Data Science Summer School.
Max is also an open-source enthusiast and a fellow podcaster – he’s the host of the Local Maximum podcast, where you can hear every week about the latest trends in AI, machine learning and technology from an engineering perspective.
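To make the “pen and paper” point concrete, here is the kind of back-of-the-envelope Bayes-rule update you can do without firing up any sampler. The scenario and every number are completely made up:

```python
# A quick Bayes-rule update, no MCMC required.
# Made-up scenario: 10% of days are "bad traffic" days (the prior),
# and a weather alert shows up on 70% of bad days but only 20% of normal days.
prior_bad = 0.10
p_alert_given_bad = 0.70
p_alert_given_normal = 0.20

# Posterior probability of a bad traffic day, given that we saw the alert
evidence = p_alert_given_bad * prior_bad + p_alert_given_normal * (1 - prior_bad)
posterior_bad = p_alert_given_bad * prior_bad / evidence
print(f"P(bad day | alert) = {posterior_bad:.2f}")   # = 0.28
```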
Oh, and if you liked the movie « Her », with Joaquin Phoenix, well, you’re in for a treat at the end of this episode…
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
You can’t study psychology up until your PhD and end up doing very mathematical and computational data science at Google, right? It’s too hard of a U-turn — some would even say it’s NUTS, just because they like bad puns… Well, think again, because Junpeng Lao did just that!
Before doing data science at Google, Junpeng was a cognitive psychology researcher at the University of Fribourg, Switzerland. Working in Python, Matlab and occasionally in R, Junpeng is a prolific open-source contributor, particularly to the popular TensorFlow and PyMC3 libraries. He also maintains the PyMC Discourse in his free time, where he amazingly answers all kinds of very specific questions!
In this episode, he’ll tell you what the core characteristics of TensorFlow Probability are, and when you would use TFP instead of another probabilistic programming framework, like Stan or PyMC3. He’ll also explain why PyMC4 will be based on TensorFlow Probability itself, and what future contributions he has in mind for these two amazing libraries. Finally, Junpeng will share with you his workflow for debugging a model, or just for better understanding your models.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
If you’re here, it’s probably because you’re interested in Bayesian inference, right? But don’t you feel lost sometimes when building a model? Or do you ask yourself why what you’re trying to do is so damn hard… and conclude that YOU are the problem, that YOU must be doing something wrong?
Well, rest assured, as you’ll hear from Michael Betancourt himself: it’s hard for everybody! That’s why over the years he developed and tries to popularize what he calls a « principled Bayesian workflow » — in a nutshell, think about what could have generated your data; and always question default settings!
With that workflow, you’ll probably feel less alone when modeling, but expect to fail often. That’s ok — as Michael says: if you don’t fail, you don’t learn!
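One concrete habit from that way of working is the prior predictive check: simulate data from your model before touching the real data, and see whether the simulations look remotely plausible. Here is a minimal PyMC3-style sketch; the model, priors and numbers are placeholders, not Michael’s own workflow code:

```python
import numpy as np
import pymc3 as pm

x = np.array([0.0, 1.0, 2.0, 3.0])   # placeholder predictor values

with pm.Model() as model:
    # Placeholder priors: the whole point is to question defaults like these
    intercept = pm.Normal("intercept", 0.0, 10.0)
    slope = pm.Normal("slope", 0.0, 10.0)
    sigma = pm.HalfNormal("sigma", 5.0)
    pm.Normal("y", mu=intercept + slope * x, sigma=sigma, shape=len(x))

    # Draw fake datasets implied by the priors alone (no observed data involved)
    prior_pred = pm.sample_prior_predictive(samples=500)

# If prior_pred["y"] regularly produces absurd values for your domain,
# rethink the priors (or the model itself) before fitting anything.
```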
Who is Michael Betancourt, you ask? He is a physicist and statistician, whose research focuses on the development of robust statistical workflows, computational tools, and pedagogical resources that help bridge the gap between statistical theory and scientific practice.
Michael works a lot on differential geometry and probability theory, and he often lives in high-dimensional spaces, where he meets a good friend of his: Hamiltonian Monte Carlo. So you won’t be surprised to learn that Michael is one of the core developers of the seminal probabilistic programming language Stan.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
I have two questions for you: Are you a self-learner? Then how do you stay up to date? What should you focus on if you’re a beginner, or if you’re more advanced?
And here is my second question: Are you working in biomedicine? And if you do, are you using Bayesian tools? Then how do you get your co-workers more used to posterior distributions than p-values? In other words, how do you change behaviors in a large organization?
In this episode, Eric Ma will answer all these questions and even tell us his favorite modeling techniques, which problems he encountered with these models, and how he solved them. He’ll also share with us the software-engineering workflow he uses at Novartis to share his work with colleagues.
Eric is a data scientist at the Novartis Institutes for Biomedical Research, where he focuses on Bayesian statistical methods to make medicines for patients. Eric is also a prolific open source developer: he led the development of pyjanitor, an API for cleaning data in Python, and nxviz, a visualization package for NetworkX. He also contributes to PyMC3, matplotlib and bokeh.
This is « Learning Bayesian Statistics », episode 5, recorded October 21, 2019.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
What do neurodegenerative diseases, gerrymandering and ecological inference all have in common? Well, they can all be studied with Bayesian methods — and that’s exactly what Karin Knudson is doing.
In this episode, Karin will share with us the vital work she does to understand aspects of neurodegenerative diseases. She’ll also tell us more about computational neuroscience and Dirichlet processes — what they are, what they do, and when you should use them.
Karin did her doctorate in mathematics, with a focus on compressive sensing and computational neuroscience at the University of Texas at Austin. Her doctoral work included applying hierarchical Dirichlet processes in the setting of neural data and focused on one-bit compressive sensing and spike-sorting.
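In case “Dirichlet process” sounds intimidating: one common way to picture it is the stick-breaking construction, where you repeatedly break off random fractions of a unit-length stick to get mixture weights. A tiny NumPy sketch, with an arbitrary concentration parameter and truncation level (both chosen just for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
alpha = 2.0   # concentration: larger alpha spreads weight over more components
K = 20        # truncation level for this finite approximation

# Stick-breaking: beta-distributed fractions of whatever is left of the stick
betas = rng.beta(1.0, alpha, size=K)
remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
weights = betas * remaining

print(weights.round(3), weights.sum())   # weights decay and sum to (almost) 1
```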
Formerly the chair of the math and computer science department of Phillips Academy Andover, she started a postdoc at Mass General Hospital and Harvard Medical in Fall 2019. Most importantly, rock climbing and hiking have no secrets for her!
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show, personally curated by Karin Knudson:
How can you use Bayesian tools and optimize your models in industry? What are the best ways to communicate and visualize your models with non-technical and executive people? And what are the most common pitfalls?
In this episode, Colin Carroll will tell us how he did all that in finance and the airline industry. He’ll also share with us what the future of probabilistic programming looks like to him.
You already heard from Colin two weeks ago — so, if you didn’t catch this episode, go back in your feed’s history and enjoy the first part!
As a reminder, Colin is a machine learning researcher and software engineer who’s notably worked on modeling risk in the airline industry and building NLP-powered search infrastructure for finance. He’s also an active contributor to open source, particularly to the popular PyMC3 and ArviZ libraries.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Links from the show:
---
Send in a voice message: https://anchor.fm/learn-bayes-stats/message
When speaking about Bayesian statistics, we often hear about « probabilistic programming » — but what is it? Which languages and libraries allow you to program probabilistically? When is Stan, PyMC, Pyro or any other probabilistic programming language most appropriate for your project? And when should you even use Bayesian libraries instead of non-Bayesian tools, like statsmodels or scikit-learn?
Colin Carroll will answer all these questions for you. Colin is a machine learning researcher and software engineer who’s notably worked on modeling risk in the airline industry and building NLP-powered search infrastructure for finance. He’s also an active contributor to open source, particularly to the popular PyMC3 and ArviZ libraries.
Having studied geometric measure theory at Rice University, Colin was bound to walk in the woods with Pete the pup – who was there when we recorded by the way – and to launch balloons into near-space in his spare time.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/!
Links from the show:
When are Bayesian methods most useful? Conversely, when should you NOT use them? How do you teach them? What are the most important skills to pick up when learning Bayes? And what are the most difficult topics, the ones you should maybe save for later?
In this episode, you’ll hear Chris Fonnesbeck answer these questions from the perspective of marine biology and sports analytics. Chris is indeed the New York Yankees’ senior quantitative analyst and an associate professor at Vanderbilt University School of Medicine.
He specializes in computational statistics, Bayesian methods, meta-analysis, and applied decision analysis. He also created PyMC, a library for probabilistic programming in Python, and is the author of several tutorials at PyCon and PyData conferences.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com!
Links from the show:
What do you get when you put a physicist, a biologist and a data scientist in the same body? Well, you’re about to find out…
In this episode you’ll meet Osvaldo Martin. Osvaldo is a researcher at the National Scientific and Technical Research Council in Argentina and is notably the author of the book Bayesian Analysis with Python, whose second edition was published in December 2018.
He also teaches bioinformatics, data science and Bayesian data analysis, is a core developer of PyMC3 and ArviZ, and recently started contributing to Bambi. Originally a biologist and physicist, Osvaldo taught himself Python and Bayesian methods – and what he’s doing with them is pretty amazing!
We also touch on how accepted Bayesian methods are in his field, which models he’s currently working on, and what it’s like to be an open-source developer.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com!
Links from the show:
Are you a researcher or data scientist / analyst / ninja? Do you want to learn Bayesian inference, stay up to date or simply want to understand what Bayesian inference is?
Well I'm just like you! When I started learning Bayesian methods, I really wished there were a podcast out there that could introduce me to the methods, the projects and the people who make all that possible.
So I created "Learning Bayesian Statistics", a fortnightly podcast where I interview researchers and practitioners of all fields about why and how they use Bayesian statistics, and how in turn YOU, as a learner, can apply these methods in YOUR modeling workflow. Now the thing is, I’m not a beginner, but I’m not an expert either. The people I’ll interview will definitely be. So I’ll be learning alongside you. I won’t pretend to know everything in this podcast, and I WILL make mistakes. But thanks to the guests’ feedback, we’ll be able to learn from those mistakes, and I think this will help you (and me!) become better, faster, stronger Bayesians.
So, whether you want to learn Bayesian statistics or hear about the latest libraries, books and applications, this podcast is for you. In this very first episode - well actually it’s episode 0, because 0-indexing rules! - I will introduce you to the genesis of this podcast, tell you why you should listen and reveal some of the guests for the coming episodes.
Come join us!
Links from the show: