
Interconnects

Interviewing Tim Dettmers on open-source AI: Agents, scaling, quantization and what's next

76 min • 7 November 2024

Tim Dettmers does not need an introduction for most people building open-source AI. If you are part of that minority, you’re in for a treat. Tim is the lead developer behind most of the open-source tools for quantization: QLoRA, bitsandbytes, 4-bit and 8-bit inference, and plenty more. He recently finished his Ph.D. at the University of Washington, is now a researcher at the Allen Institute for AI, and is starting as a professor at Carnegie Mellon University in the fall of 2025.
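
For readers less familiar with these tools, here is a minimal sketch of what 4-bit model loading with bitsandbytes looks like through the Hugging Face Transformers integration; the model name is only a placeholder, and this assumes recent versions of transformers, bitsandbytes, accelerate, and torch are installed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# QLoRA-style 4-bit (NF4) quantization settings; these values are illustrative defaults.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model_name = "meta-llama/Llama-3.1-8B"  # placeholder model id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",  # dispatch weights across available devices
)

inputs = tokenizer("Open-source AI is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```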

Tim is a joy to talk to. He thinks independently about all the AI issues of today, bringing new perspectives that challenge the status quo. At the same time, he’s sincere and very helpful to work with, putting real effort into uplifting those around him and the academic community. There’s a reason he’s so loved in the open-source AI community.

Find more about Tim on his Twitter or Google Scholar. He also has a great blog where he talks about things like which GPUs to buy and which grad school to choose.

Listen on Apple Podcasts, Spotify, YouTube, and wherever you get your podcasts. For other Interconnects interviews, go here.

Show Notes

Companies, people, projects, research papers, and other key named entities mentioned in the episode:

* QLoRA

* bitsandbytes

* Llama 3

* Apple Intelligence

* SWE Bench

* RewardBench

* Claude (AI assistant by Anthropic)

* Transformers (Hugging Face library)

* Gemma (Google's open weight language model)

* Notebook LM

* LangChain

* LangGraph

* Weights & Biases

* Blackwell (NVIDIA GPU architecture)

* Perplexity

* Branch-Train-Merge (research paper)

* "ResNets do iterative refinement on features" (research paper)

* CIFAR-10 and CIFAR-100 (computer vision datasets)

* Lottery Ticket Hypothesis (research paper)

* OpenAI o1

* TRL (Transformer Reinforcement Learning) by Hugging Face

* Tim's work on quantization

Timestamps

* [00:00:00] Introduction and background on Tim Dettmers

* [00:01:53] Future of open source AI models

* [00:09:44] SWE Bench and evaluating AI systems

* [00:13:33] Using AI for coding, writing, and thinking

* [00:16:09] Academic research with limited compute

* [00:32:13] Economic impact of AI

* [00:36:49] User experience with different AI models

* [00:39:42] o1 models and reasoning in AI

* [00:46:27] Instruction tuning vs. RLHF and synthetic data

* [00:51:16] Model merging and optimization landscapes

* [00:55:08] Knowledge distillation and optimization dynamics

* [01:01:55] State-space models and transformer dominance

* [01:06:00] Definition and future of AI agents

* [01:09:20] The limit of quantization

Transcript and full details: https://www.interconnects.ai/p/tim-dettmers

Get Interconnects (https://www.interconnects.ai/)...

... on YouTube: https://www.youtube.com/@interconnects

... on Twitter: https://x.com/interconnectsai

... on LinkedIn: https://www.linkedin.com/company/interconnects-ai

... on Spotify: https://open.spotify.com/show/2UE6s7wZC4kiXYOnWRuxGv

... on Apple Podcasts: https://podcasts.apple.com/us/podcast/interconnects/id1719552353



Get full access to Interconnects at www.interconnects.ai/subscribe