LessWrong (30+ Karma)

“Thermodynamic entropy = Kolmogorov complexity” by EbTech

2 min • 17 February 2025
This is a link post.

Direct PDF link for non-subscribers

Information theory must precede probability theory, and not be based on it. By the
very essence of this discipline, the foundations of information theory have a finite combinatorial character.

- Andrey Kolmogorov

Many alignment researchers borrow intuitions from thermodynamics: entropy relates to information, which relates to learning and epistemology. These connections were first revealed by Szilárd's resolution of Maxwell's famous demon thought experiment. However, the classical tools of equilibrium thermodynamics are not ideally suited to studying information processing far from equilibrium.

This new work reframes thermodynamics in terms of the algorithmic entropy. It takes an information-first approach, delaying the introduction of physical concepts such as heat and energy until after the foundations are set. I find this approach more conceptually principled and elegant than the traditional alternatives.
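The core intuition can be made concrete with a toy sketch (not from the post itself): the length of a compressed description is a crude, computable upper bound on Kolmogorov complexity, so an ordered string should compress far better than a random one of the same length. The use of `zlib` here is purely illustrative, assuming a compressor as a stand-in for the (uncomputable) shortest description.

```python
import zlib
import random

random.seed(0)

# A highly ordered string (low algorithmic entropy) versus a random one
# of the same length.
ordered = b"01" * 500  # 1000 bytes, describable by a tiny program
random_bits = bytes(random.choice(b"01") for _ in range(1000))

# Compressed length upper-bounds the shortest description we know of:
# a proxy for algorithmic entropy, up to compressor overhead.
len_ordered = len(zlib.compress(ordered, 9))
len_random = len(zlib.compress(random_bits, 9))

print(len_ordered, len_random)
assert len_ordered < len_random
```

The ordered string compresses to a few dozen bytes while the random one stays close to its Shannon-entropy floor, mirroring the identification of thermodynamic entropy with description length.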

It's based on a 30-year-old workshop paper by Péter Gács[1], which until now was [...]

The original text contained 1 footnote which was omitted from this narration.

---

First published:
February 17th, 2025

Source:
https://www.lesswrong.com/posts/d6D2LcQBgJbXf25tT/thermodynamic-entropy-kolmogorov-complexity

---

Narrated by TYPE III AUDIO.
