Riskgaming

Alternate Histories and GPT-3

16 min • 17 June 2022

"GPT-3 was trained on is so large that the model contains a certain  fraction of the actual complexity of the world. But how much is actually  inside these models, implicitly embedded within these neural networks?

I decided to test this and see if I could examine the GPT-3 model of the world through the use of counterfactuals. Specifically, I wanted to see if GPT-3 could productively unspool histories of the world if things were slightly different, such as if the outcome of a war were different or a historical figure hadn’t been born. I wanted to see how well it could write alternate histories." - Samuel Arbesman
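For a sense of what such an experiment might look like in practice, here is a minimal sketch of posing a counterfactual prompt to GPT-3 through OpenAI's 2022-era Completion API. The model name, prompt, and sampling parameters are illustrative assumptions, not Arbesman's actual setup.

```python
# Minimal sketch (not Arbesman's actual code): prompting GPT-3 with a
# counterfactual premise via OpenAI's legacy Completion API (2022 era).
# The model name, prompt, and sampling parameters are assumptions.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

counterfactual_prompt = (
    "Write a brief alternate history of the 20th century, assuming the "
    "transistor was never invented. Describe how computing, "
    "communication, and the Cold War unfold differently."
)

response = openai.Completion.create(
    model="text-davinci-002",   # a GPT-3 model available in mid-2022
    prompt=counterfactual_prompt,
    max_tokens=400,             # length of the generated continuation
    temperature=0.7,            # some randomness for creative unspooling
)

print(response["choices"][0]["text"].strip())
```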

From the Cabinet of Wonders newsletter by Samuel Arbesman

Great tweet thread summarizing his post

"Securities" podcast is produced and edited by Chris Gates
