Techniques

Perplexity

Definition

A metric that measures how well a language model predicts text, with lower values indicating better performance. Perplexity quantifies the model's uncertainty: formally, it is the exponential of the average negative log-likelihood per token, so a perplexity of 10 means the model is as uncertain as if it were choosing uniformly among 10 options for each token. It is widely used for evaluating language models.
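The definition above can be sketched in a few lines of Python. This is a minimal illustration, not tied to any particular model library: it assumes you already have the probability the model assigned to each token in a sequence, and computes perplexity as the exponential of the average negative log-probability.

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability per token.

    token_probs: probabilities the model assigned to the actual tokens.
    """
    n = len(token_probs)
    avg_neg_log_likelihood = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_neg_log_likelihood)

# If the model assigns probability 0.1 to every token, it is as uncertain
# as a uniform choice among 10 options, so perplexity is 10.
print(perplexity([0.1, 0.1, 0.1]))  # ≈ 10.0
```

Note the correspondence with the worked example in the definition: assigning uniform probability 1/10 to each token yields a perplexity of exactly 10.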

