- Tags:: #📝CuratedNotes , [[Data science|Data Science]]
There are several phenomena we should pay attention to:
- Real [[model-performance|Model Performance]], for cases where we get labels for the live data.
- Early indicators of possible bad performance. We can distinguish between (see the sketch after this list):
    - Single prediction indicators:
        - Broken data integrity of features (see [[model-sanity-checks|Model Sanity Checks]]).
        - Outliers ([[model-outlier-detection|Model Outlier Detection]]) or violated hard limits in features.
        - Outliers or violated hard limits in predictions.
    - Distribution indicators (see [[Model drifting|Model Drifting]]):
        - **Distribution shift of predictions.**
        - Distribution shift of features.
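A minimal sketch (not from the note) of how these two kinds of early indicators could be computed: per-prediction integrity/hard-limit checks and a distribution-shift test. The feature names, limits, and the choice of a two-sample Kolmogorov-Smirnov test are illustrative assumptions.

```python
# Sketch of early indicators, assuming illustrative feature names and limits.
import numpy as np
from scipy.stats import ks_2samp

# Assumed hard limits per feature (not from the note).
HARD_LIMITS = {"age": (0, 120), "price": (0.0, 1e6)}


def single_prediction_alerts(features: dict, prediction: float,
                             prediction_limits=(0.0, 1.0)) -> list:
    """Return alerts for one live request: missing/out-of-range features or
    a prediction outside its hard limits."""
    alerts = []
    for name, (lo, hi) in HARD_LIMITS.items():
        value = features.get(name)
        if value is None:
            alerts.append(f"broken integrity: missing feature '{name}'")
        elif not lo <= value <= hi:
            alerts.append(f"feature '{name}'={value} outside [{lo}, {hi}]")
    lo, hi = prediction_limits
    if not lo <= prediction <= hi:
        alerts.append(f"prediction {prediction} outside [{lo}, {hi}]")
    return alerts


def distribution_shift(reference: np.ndarray, live: np.ndarray,
                       alpha: float = 0.01) -> bool:
    """Two-sample Kolmogorov-Smirnov test between the training (reference)
    and live distributions of one feature, or of the predictions."""
    _, p_value = ks_2samp(reference, live)
    return p_value < alpha  # True -> shift flagged
```

The same `distribution_shift` check applies to the prediction distribution, which often reacts to problems before any labels arrive.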
Tools:
- GCP has [[ai-platform|AI Platform]]. From https://cloud.google.com/ai-platform/docs/ml-solutions-overview:
- You deploy by uploading your model to GCS (a sample online prediction call is sketched after this list).
- [[2021-01-08]] Monitoring is in beta and is called "Continuous Evaluation": https://cloud.google.com/ai-platform/prediction/docs/continuous-evaluation/view-metrics, but it only supports image, text, or general classification models.
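As a usage reference, a minimal sketch of calling a model served on AI Platform Prediction once it has been deployed from GCS. `PROJECT_ID`, `MODEL_NAME`, and the instance payload are placeholders, and authentication via `GOOGLE_APPLICATION_CREDENTIALS` is assumed.

```python
# Minimal online-prediction call against AI Platform Prediction.
# PROJECT_ID / MODEL_NAME are placeholders; credentials come from
# GOOGLE_APPLICATION_CREDENTIALS in the environment.
import googleapiclient.discovery

PROJECT_ID = "my-project"   # placeholder
MODEL_NAME = "my-model"     # placeholder


def online_predict(instances):
    service = googleapiclient.discovery.build("ml", "v1")
    name = f"projects/{PROJECT_ID}/models/{MODEL_NAME}"
    response = service.projects().predict(
        name=name,
        body={"instances": instances},
    ).execute()
    if "error" in response:
        raise RuntimeError(response["error"])
    return response["predictions"]


# Example: one instance shaped like the model's expected input.
# predictions = online_predict([{"age": 42, "price": 19.9}])
```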