Large Language Model (LLM) Talk

GPT-3

12 min • 14 January 2025

GPT-3 is a large language model trained on a massive dataset of roughly 300 billion tokens. It generates text one token at a time, conditioned on the input text. What the model learns is encoded in its 175 billion parameters, and it operates over a context window of 2,048 tokens. The core computation happens in a stack of 96 transformer decoder layers, each holding roughly 1.8 billion of those parameters. Input words are converted to vectors, a prediction is made, and the resulting vector is converted back into a word. The input flows through the layer stack, and each generated word is fed back into the model as input. Priming (few-shot) examples are simply included as part of the input. Fine-tuning can update the model's weights to improve performance on specific tasks.
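The token-by-token feedback loop described above can be sketched in a few lines of Python. This is a minimal illustration, not GPT-3 itself: `toy_next_token` is a hypothetical stand-in for the 96-layer transformer stack, and only the context-window truncation and the feed-back of each new token mirror the real process.

```python
# Minimal sketch of GPT-3-style autoregressive decoding.
# `toy_next_token` is a hypothetical stand-in for the real model:
# GPT-3 would embed the tokens, run them through 96 decoder layers,
# and project the result to a distribution over its vocabulary.

CONTEXT_WINDOW = 2048  # GPT-3's maximum context length, in tokens

def toy_next_token(tokens):
    # Dummy prediction rule, purely for illustration.
    return (sum(tokens) + len(tokens)) % 50257  # 50,257 = GPT-3's vocab size

def generate(prompt_tokens, n_new):
    tokens = list(prompt_tokens)
    for _ in range(n_new):
        # Only the most recent CONTEXT_WINDOW tokens influence the prediction.
        window = tokens[-CONTEXT_WINDOW:]
        next_tok = toy_next_token(window)
        tokens.append(next_tok)  # each output token is fed back as input
    return tokens

out = generate([11, 22, 33], 4)
print(len(out))  # prompt (3 tokens) + generated (4 tokens)
```

Priming examples fit naturally into this picture: they are just extra tokens prepended to `prompt_tokens`, consuming part of the 2,048-token window rather than changing any weights.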
