AI Trust Glossary · Canonical Definition
Drift (Model Drift)
Gradual degradation of AI model performance over time as real-world data distributions shift away from those seen during training.
Explanation
Drift happens silently. No error is thrown; the model appears functional while its outputs grow increasingly wrong. Types include data drift (the input distribution changes), concept drift (the relationship between inputs and correct outputs changes), and model drift (overall degradation arising from either or both).
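Data drift, the first type above, can be measured by comparing the production input distribution against the training-time baseline. A minimal sketch using the Population Stability Index (PSI), a common drift metric; the feature values and the 0.25 alert threshold here are illustrative assumptions, not part of any Borealis API:

```python
import math
from collections import Counter

def psi(baseline, current, bins=10, eps=1e-6):
    """Population Stability Index: bin the baseline range, then measure
    how far the current sample's bin fractions diverge from baseline's."""
    lo, hi = min(baseline), max(baseline)
    width = (hi - lo) / bins or 1.0

    def fractions(sample):
        # Clamp out-of-range values into the last bin; eps avoids log(0).
        counts = Counter(min(int((x - lo) / width), bins - 1) for x in sample)
        n = len(sample)
        return [counts.get(b, 0) / n + eps for b in range(bins)]

    p, q = fractions(baseline), fractions(current)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

# Common rule of thumb: PSI < 0.1 stable, 0.1-0.25 moderate, > 0.25 major shift.
baseline = [0.1 * i for i in range(100)]        # inputs seen during training
shifted  = [5.0 + 0.1 * i for i in range(100)]  # production inputs, shifted up
print(round(psi(baseline, baseline), 3))  # near zero: no drift
print(psi(baseline, shifted) > 0.25)      # True: major distribution shift
```

Note the check fires with no model in the loop at all: data drift is detectable from inputs alone, before outputs degrade.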
Why it matters
Drift is how trusted agents become untrustworthy without anyone noticing. Detecting drift requires continuous measurement, not periodic review. A steadily worsening BM Score trend is the earliest detectable signal.
How Borealis uses it
BM Score trends across telemetry batches serve as the drift signal: a steadily declining behavioral-consistency score or a rising anomaly rate is the symptom to watch for. The license_score_history table enables the trend analysis that point-in-time audits cannot see.
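The trend check described above can be sketched as an ordinary least-squares slope over a recent window of batch scores. The window size and the per-batch decline threshold below are assumed tunables for illustration; the actual license_score_history schema and BM Score scale are not specified here:

```python
def score_slope(scores):
    """Least-squares slope of score versus batch index (change per batch)."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def drift_alert(scores, threshold=-0.5, window=10):
    """Flag drift when the recent trend falls faster than `threshold`
    score points per batch. Threshold and window are illustrative."""
    recent = scores[-window:]
    return len(recent) >= 3 and score_slope(recent) < threshold

healthy  = [92, 91, 93, 92, 94, 93, 92, 93, 92, 93]  # stable batch scores
drifting = [92, 91, 90, 88, 87, 85, 83, 81, 79, 76]  # steady decline
print(drift_alert(healthy))   # False: flat trend
print(drift_alert(drifting))  # True: worsening trend
```

A slope over a window, rather than a comparison of two point-in-time values, is what distinguishes continuous measurement from periodic review: a single audit sees one score, while the trend exposes the direction of travel.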