Concept | Model maintenance

The last step in our governance framework is to monitor our governed items. Let’s outline how we can use model metrics to track and improve the performance of model versions.

Model metrics

Take a look at the Model registry page. It lists all of the models and model versions from your connected Dataiku nodes, alongside valuable model metrics.

Let’s see an example. Say we want to monitor the project Coupon Redemption. We can click the dropdown to find the active model version that we want to analyze: in this case, a random forest model.

Dataiku Govern screenshot highlighting the active version of a model.

By default, the Metric to Focus dropdown is set to ROC AUC, so each model version displays its ROC AUC and ROC AUC Drift metrics. You can change the Metric to Focus at any time.
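To get a concrete sense of what this focus metric measures, the sketch below computes ROC AUC for a model and expresses drift as the change between the initial evaluation and a later one. This is only an illustration using scikit-learn: the dataset, model, and simple difference-based drift are stand-ins, and Dataiku’s own drift computation may differ.

```python
# A minimal sketch of a focus metric (ROC AUC) and its drift.
# NOTE: illustrative only; Dataiku derives ROC AUC Drift from successive
# model evaluations, and its exact formula may differ from this difference.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_eval, y_train, y_eval = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Initial metric, as captured when the model version was built.
initial_auc = roc_auc_score(y_eval, model.predict_proba(X_eval)[:, 1])

# A later evaluation on fresh data (here: a hypothetical new sample whose
# distribution differs from the training data, so the score degrades).
X_new, y_new = make_classification(n_samples=500, random_state=1)
current_auc = roc_auc_score(y_new, model.predict_proba(X_new)[:, 1])

# One simple way to express drift: the change relative to the initial metric.
roc_auc_drift = current_auc - initial_auc
print(f"initial={initial_auc:.3f} current={current_auc:.3f} drift={roc_auc_drift:+.3f}")
```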

To discover more model metrics, you can look at the Model metrics tab in the Details panel.

Dataiku Govern screenshot highlighting the Model metrics tab in the Details panel.

Note

Most of these metrics are the initial metrics drawn from the Design or Automation node when the model version was built. Drift metrics, however, come from the model evaluations stored in a Model Evaluation Store (MES).

The MES must live in the same project as the saved model of the model version being evaluated. You can configure the MES to opt out of the Govern sync if needed; otherwise, its metrics are updated each time an evaluation runs.
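If you want to check these synced metrics programmatically rather than in the Govern UI, here is a hedged sketch using the Dataiku public Python API. The host URL, API key, and project key are placeholders, and the MES-related calls (list_model_evaluation_stores, list_model_evaluations, get_full_info, and the metrics attribute) are assumptions based on recent releases of the dataikuapi package; consult the API reference for your version.

```python
# A hedged sketch of inspecting MES evaluations through the Dataiku Python API.
# NOTE: the MES-related method names below are assumptions based on recent
# dataikuapi releases; verify them against the API reference for your version.
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("COUPON_REDEMPTION")  # hypothetical project key

for store in project.list_model_evaluation_stores():
    print("Evaluation store:", store.mes_id)
    for evaluation in store.list_model_evaluations():
        info = evaluation.get_full_info()
        # Each evaluation carries the metrics that Govern syncs, e.g. ROC AUC.
        print(info.metrics)
```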

After monitoring metrics in the Govern node, you may decide to update your models in the Design or Automation node. To learn more about managing the model lifecycle, you can start the MLOps Practitioner learning path or see the MLOps section of the product documentation.