AI Trust Glossary  ·  Canonical Definition

Drift (Model Drift)

Gradual degradation of AI model performance over time as real-world data distributions shift away from those seen during training.
Borealis Research Team  ·  Updated March 2026
Drift happens silently. No error is thrown; the model appears functional while its outputs become increasingly wrong. Types include data drift (the input distribution changes), concept drift (the relationship between inputs and correct outputs changes), and model drift (the resulting performance degradation from either or both).
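Data drift, the first of these, can be measured directly by comparing a live input sample against the distribution seen at training time. A minimal sketch using the Population Stability Index (a standard drift statistic; the function name, bin count, and thresholds here are illustrative choices, not part of any BorealisMark API):

```python
import math

def psi(reference, current, bins=10):
    """Population Stability Index between a training-time (reference)
    sample and a live (current) sample of one numeric feature.
    Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate drift,
    > 0.25 major drift."""
    lo, hi = min(reference), max(reference)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def fractions(sample):
        counts = [0] * bins
        for x in sample:
            counts[sum(1 for e in edges if x > e)] += 1
        # floor each fraction at a tiny value to avoid log(0)
        return [max(c / len(sample), 1e-6) for c in counts]

    ref_f, cur_f = fractions(reference), fractions(current)
    return sum((c - r) * math.log(c / r) for r, c in zip(ref_f, cur_f))

# an upward shift in the input distribution produces a large PSI
train = [0.1 * i for i in range(100)]        # roughly uniform on [0, 10)
live  = [5.0 + 0.1 * i for i in range(100)]  # same shape, shifted upward
assert psi(train, train) < 0.01
assert psi(train, live) > 0.25
```

Concept drift is harder: the inputs can look identical while the correct answers change, so it only shows up in outcome-based metrics, not input statistics.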
Drift is how trusted agents become untrustworthy without anyone noticing. Detecting it requires continuous measurement, not periodic review: a steadily declining behavioral-consistency score or a rising anomaly rate across telemetry batches is the earliest detectable symptom, and a worsening BM Score trend captures both. Because scores are recorded per batch, the license_score_history table enables trend analysis that is invisible in point-in-time audits.
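The trend analysis described above can be sketched as a least-squares slope over a recent window of per-batch scores. This is a hypothetical illustration: the `drifting` helper, window size, and threshold are assumptions for the sketch, not a documented BorealisMark interface.

```python
def score_trend(scores):
    """Least-squares slope of a score series, one value per telemetry
    batch, in score points per batch."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def drifting(score_history, window=12, threshold=-0.5):
    """Flag drift when the most recent `window` batches decline faster
    than `threshold` points per batch (both values are illustrative)."""
    recent = score_history[-window:]
    return len(recent) >= window and score_trend(recent) < threshold

# a healthy agent fluctuates around a stable score; a drifting one decays
healthy  = [92, 93, 91, 92, 93, 92, 91, 93, 92, 92, 91, 93]
decaying = [93, 92, 90, 89, 87, 85, 84, 82, 80, 78, 77, 75]
assert not drifting(healthy)
assert drifting(decaying)
```

The point of the slope test is exactly the contrast drawn above: any single point in either series looks acceptable in isolation, while the trend separates them immediately.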
Ready to put this into practice?
Certify your AI agent on BorealisMark and get a verifiable BM Score anchored to Hedera Hashgraph. Or run the BM Score Simulator to estimate your agent's score right now.