What do you do when your data science project doesn’t fit within your computer’s memory? One solution is to distribute it across multiple worker machines. This week on the show, Guido Imperiale from Coiled talks about Dask and managing large data science projects through distributed computing.
We talk about the kinds of projects where an orchestration system like Dask can help. Dask is built for parallel computing, spreading work and data across multiple machines, and it provides equivalents for many familiar pandas and NumPy techniques.
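For a rough sense of what those equivalents look like, here's a minimal sketch using Dask's DataFrame and array collections. The file pattern, column names, and array sizes are placeholders, not anything from the episode:

```python
import dask.array as da
import dask.dataframe as dd

# A Dask DataFrame mirrors much of the pandas API but splits the data
# into partitions and builds a lazy task graph instead of computing eagerly.
df = dd.read_csv("measurements-*.csv")  # placeholder file pattern
mean_per_sensor = df.groupby("sensor_id")["value"].mean()

# Nothing has run yet; .compute() hands the graph to the scheduler,
# which can spread the work across threads, processes, or machines.
print(mean_per_sensor.compute())

# Dask arrays do the same for NumPy-style work on chunked arrays.
x = da.random.random((100_000, 100_000), chunks=(10_000, 10_000))
print(x.mean().compute())
```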
We also discuss the differences between managed and unmanaged memory. Guido shares advice on how to tackle memory issues while working with Dask.
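If memory does become the bottleneck, one knob Dask exposes is the set of worker memory thresholds that control when data spills to disk or a worker pauses. The thresholds and cluster sizing below are illustrative, not recommendations from the episode:

```python
import dask
from dask.distributed import Client, LocalCluster

# Tune when workers spill data to disk, pause, or restart.
# These fractions of the per-worker memory limit are placeholders.
dask.config.set({
    "distributed.worker.memory.target": 0.60,    # start spilling managed data to disk
    "distributed.worker.memory.spill": 0.70,     # spill based on total process memory
    "distributed.worker.memory.pause": 0.80,     # stop accepting new tasks
    "distributed.worker.memory.terminate": 0.95, # restart a runaway worker
})

# Hypothetical local cluster sizing; memory_limit applies per worker.
cluster = LocalCluster(n_workers=4, memory_limit="4GB")
client = Client(cluster)

# The dashboard breaks each worker's memory into managed and unmanaged
# portions, which is where this episode's discussion starts.
print(client.dashboard_link)
```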
This week we also talk briefly with Jodie Burchell, who will be a guest host on upcoming episodes. As a data scientist, Jodie will be bringing new topics, projects, and discussions to the show.
Course Spotlight: Exploring Scopes and Closures in Python
In this Code Conversation video course, you’ll take a deep dive into how scopes and closures work in Python. To do this, you’ll use a debugger to walk through some sample code, and then you’ll take a peek under the hood to see how Python holds variables internally.
Topics:
Show Links: