Concept | The Result tab within the visual ML tool


The Models page of a visual analysis includes a Result tab that is useful for comparing model performance across different algorithms and training sessions. By default, models are grouped by session. You can also switch to the Models view to assess all models in one window, or to the Table view to see all models along with more detailed metrics.

../../_images/evaluate-the-model1.png
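If you prefer to work programmatically, the same session and model information surfaced in the Result tab can also be retrieved through Dataiku's public Python API. The snippet below is a minimal sketch, not an official recipe: the project key, analysis ID, and ML task ID are placeholders, and method names such as get_trained_models_ids() and get_performance_metrics() should be verified against the API reference for your DSS version.

```python
import dataiku

# Connect from inside DSS; from outside, use dataikuapi.DSSClient(host, api_key) instead.
client = dataiku.api_client()
project = client.get_project("MY_PROJECT")  # placeholder project key

# Placeholder analysis and ML task identifiers (visible in the analysis URL).
ml_task = project.get_ml_task("ANALYSIS_ID", "ML_TASK_ID")

# List every model trained so far, across all sessions, with its main metrics.
for model_id in ml_task.get_trained_models_ids():
    details = ml_task.get_trained_model_details(model_id)
    print(model_id, details.get_performance_metrics())
```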

The Result tab also lets you evaluate different modeling sessions and revert to previous model designs.

A session provides the following information: a chart of the models that were trained, the scores and details of each iteration of the grid search, and the amount of time the full model training process took. Dataiku automatically selects the best-performing model from the grid search.

../../_images/evaluate-the-model2.png
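Conceptually, the grid search reported in a session behaves like the sketch below, which uses scikit-learn (not Dataiku's internal code) to train one candidate per hyperparameter combination, record each candidate's score, and keep the best-performing one. The dataset, algorithm, and parameter grid are arbitrary choices for illustration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One grid search "iteration" per hyperparameter combination.
param_grid = {"n_estimators": [50, 100], "max_depth": [3, 5, None]}
search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3, scoring="roc_auc")
search.fit(X_train, y_train)

# Score and parameters of each iteration, analogous to a session's detail view.
for params, score in zip(search.cv_results_["params"],
                         search.cv_results_["mean_test_score"]):
    print(params, round(score, 4))

# The best-performing candidate is selected automatically.
print("Best:", search.best_params_, round(search.best_score_, 4))
```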

When training a model, you can also view model diagnostics, which help you detect common pitfalls such as overfitting and data leakage. Dataiku displays Diagnostics when a model fails any of the diagnostic checks.

../../_images/model-diagnostics.png
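The overfitting check, for example, boils down to comparing performance on the training data against performance on held-out data. The sketch below is a simplified, hypothetical version of such a check written with scikit-learn; it is not Dataiku's actual diagnostic, and the 0.1 gap threshold is an arbitrary value chosen for illustration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

train_auc = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])
test_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# A large gap between training and test scores is a classic overfitting signal.
if train_auc - test_auc > 0.1:  # arbitrary threshold for illustration
    print(f"Possible overfitting: train AUC {train_auc:.3f} vs test AUC {test_auc:.3f}")
else:
    print(f"No large gap detected: train AUC {train_auc:.3f} vs test AUC {test_auc:.3f}")
```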

Clicking Diagnostics displays the Model Summary, including the Training Information section.

../../_images/model-diagnostics-training-info.png

Diagnostics are configured in the Design tab under Debugging.

../../_images/model-diagnostics-design-config.png

See also

To find out more about the model summary, visit Concept | Model summaries within the visual ML tool.