Core Concepts
Weight
Definition
A numerical parameter in a neural network that determines the strength of the connection between neurons. During training, weights are adjusted to minimize prediction errors. A model's weights collectively encode its learned knowledge. GPT-4 is estimated to have over a trillion weights, while smaller models may have billions.
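The idea can be sketched in a few lines: a neuron's output is a weighted sum of its inputs, and training nudges each weight to shrink the prediction error. The specific numbers below are hypothetical, chosen only for illustration.

```python
# A single neuron: each input's influence is scaled by its weight.
# All values here are illustrative, not from any real model.
inputs = [0.5, -1.2, 3.0]    # activations from upstream neurons
weights = [0.8, 0.1, -0.4]   # connection strengths, learned during training
bias = 0.2

# Forward pass: weighted sum of inputs plus bias.
output = sum(w * x for w, x in zip(weights, inputs)) + bias  # -0.72

# Training adjusts weights to reduce error; one gradient-descent
# step on a squared-error loss against a target of 0.0:
target = 0.0
error = output - target
learning_rate = 0.01
weights = [w - learning_rate * error * x for w, x in zip(weights, inputs)]
```

A full network repeats this across millions or billions of such weights, which is why the weight count is a common shorthand for model size.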