Safety
Data Poisoning
Definition
An attack in which malicious data is injected into a training dataset to compromise an AI model's behavior. A poisoned model can produce incorrect outputs on specific inputs while behaving normally on all others, which makes the attack hard to detect. Defending against data poisoning requires careful data curation and validation pipelines before training.
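As a rough illustration, the sketch below (Python with NumPy and scikit-learn; the toy dataset, trigger feature, poisoning rate, and target label are all invented for the example, not taken from any real attack) poisons a small fraction of a training set so that inputs carrying a specific trigger value are pushed toward an attacker-chosen class, while accuracy on clean inputs stays largely intact.

```python
# Minimal sketch of a backdoor-style data poisoning attack on a toy dataset.
# All parameters (trigger value, poison rate, target class) are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy binary classification data: 20 features; the last feature is unused by
# the "true" labeling rule and will serve as the attacker's trigger.
X = rng.normal(size=(2000, 20))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Attacker poisons ~10% of the training points: implant a large trigger value
# in the last feature and force the label to the attacker-chosen class 1.
n_poison = int(0.10 * len(X_train))
poison_idx = rng.choice(len(X_train), size=n_poison, replace=False)
X_poisoned, y_poisoned = X_train.copy(), y_train.copy()
X_poisoned[poison_idx, -1] = 8.0
y_poisoned[poison_idx] = 1

model = LogisticRegression(max_iter=1000).fit(X_poisoned, y_poisoned)

# The model looks roughly normal on clean inputs...
print("clean test accuracy:", model.score(X_test, y_test))

# ...but inputs carrying the trigger are pushed toward the attacker's class.
X_triggered = X_test.copy()
X_triggered[:, -1] = 8.0
print("fraction predicted as class 1 with trigger:",
      (model.predict(X_triggered) == 1).mean())
```

Running a sketch like this typically shows the two symptoms the definition describes: clean-input accuracy stays high, while triggered inputs are disproportionately assigned the attacker's target class. A data-curation defense would aim to flag the poisoned points before training, for example by screening for out-of-distribution feature values or label/feature inconsistencies.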
Related Terms
No related terms linked yet.