In this episode, we discuss the challenges of scaling large language models, focusing on data movement bottlenecks and the so-called 'latency wall' that may constrain future AI training runs.
Reference: https://epochai.org/blog/data-movement-bottlenecks-scaling-past-1e28-flop