Tutorial | Monitoring models: An API endpoint on a Dataiku API node

Many data science workloads call for a real-time API framework, where queries sent to an API endpoint receive an immediate response.

As a point of comparison with other deployment contexts, this article presents how to monitor a model under a real-time API framework while staying entirely within Dataiku.

Objectives

In this tutorial, you will:

  • Create a model monitoring feedback loop on an API endpoint deployed on a Dataiku API node.

Prerequisites

In addition to the prerequisites laid out in the introduction, you’ll also need a Dataiku instance with a configured API infrastructure to which you can deploy.

Note

Dataiku Cloud users should consult documentation for setting up an API node from the Launchpad and accessing API query logs.

Deploy the model as an API endpoint

The starter project already contains the API endpoint that we want to monitor, so the next step is pushing a version of an API service including that endpoint to the API Deployer.

  1. From the top navigation bar, navigate to the API Designer from within the More options menu.

  2. Open the pokemon API service.

  3. Note how it includes one prediction endpoint called guess, which uses the model found in the Flow.

  4. Click Publish on Deployer, and OK to confirm publishing v1 of the service to the API Deployer.

Dataiku screenshot of an API service with a prediction endpoint.
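If you prefer to script this step, the publishing can also be done through Dataiku’s Python API. The sketch below is illustrative only: the host URL, API key, and project key are placeholders, and the create_package/publish_package methods reflect recent dataikuapi releases, so check the API reference for your version.

```python
import dataikuapi

# Connect to the Design node; host URL, API key, and project key are placeholders.
client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("YOUR_PROJECT_KEY")

# Snapshot the current design of the pokemon service as a package named v1,
# then push that package to the API Deployer.
service = project.get_api_service("pokemon")
service.create_package("v1")
service.publish_package("v1")
```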

Once on the API Deployer, we can deploy the service to an infrastructure.

  1. On the API Deployer, find the version of the pokemon service that you just pushed, and click Deploy.

  2. Select the configured API infrastructure, click Deploy, and then Deploy again to confirm.

Dataiku screenshot of an API deployment.
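The deployment itself can likewise be scripted from the API Deployer’s Python client. Again a hedged sketch: the deployment and infrastructure ids are hypothetical, and the create_deployment signature may vary slightly between dataikuapi versions.

```python
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
deployer = client.get_apideployer()

# Create a deployment of version v1 of the published pokemon service on a
# configured infrastructure, then run the deployment update.
deployment = deployer.create_deployment(
    "pokemon-on-prod",   # hypothetical deployment id
    "pokemon",           # published service id
    "api-infra-prod",    # your configured infrastructure id
    "v1",                # version to deploy
)
update = deployment.start_update()
update.wait_for_result()
```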

Note

To review the mechanics of real-time API deployment in greater detail, please see Tutorial | Real-time API basics.

Generate activity on the API endpoint

Before we set up the monitoring portion of this project, we need to generate some activity on the API endpoint so that we have actual data on the API node to retrieve in the feedback loop.

  1. When viewing the deployment on the API Deployer, navigate to the Run and test tab for the guess endpoint.

  2. Click Run All to send several test queries to the API node.

Dataiku screenshot of test queries of an API deployment.
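Test queries can also be sent programmatically, which is handy for generating more realistic traffic than the handful of bundled test queries. Below is a minimal sketch using dataikuapi’s APINodeClient; the API node URL and the feature names are placeholders, so substitute your endpoint’s actual schema.

```python
import dataikuapi

# Connect directly to the API node serving the deployment (URL is a placeholder).
api_client = dataikuapi.APINodeClient("https://apinode.example.com:12000", "pokemon")

# Hypothetical feature values; use the schema your model was trained on.
record = {"attack": 49, "defense": 49, "speed": 45}

# Query the guess endpoint; each call is logged and later retrieved by the feedback loop.
result = api_client.predict_record("guess", record)
print(result["result"]["prediction"])
```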

Create a feedback loop on the API endpoint

Now direct your attention to the Dataiku Monitoring (API) Flow zone. Just like the batch Flow zone, it has an Evaluate recipe that takes two inputs (a dataset of predictions and a saved model) and outputs a model evaluation store. However, there are two subtle differences.

Dataiku screenshot of a Flow zone for monitoring API node log data.

API node log data

The input data in this context comes directly from the API node. We need to point the pokemon_on_static_api_logs dataset to the storage location of the API endpoint’s prediction logs, as defined by the Event server’s configuration. (An admin can find this information under Administration > Settings > Event Server on the Design node.)

  1. Open the pokemon_on_static_api_logs dataset. There will be a warning that it is empty.

  2. Navigate to the Settings tab.

  3. In the Files subtab, select the Read from connection specific to the configuration of your Event server.

  4. Click Browse to navigate the file directory, and find the Path specific to the configuration of your Event server.

  5. Click api-node-query, and then select the name of the API deployment for this project.

  6. Click OK, and see a path ending with your API deployment.

  7. Click List Files to observe which logs are available, and Save when ready.

Dataiku screenshot of the settings tab of API node log data.
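These settings can also be applied programmatically, which helps when repeating the setup across environments. A sketch, assuming a files-based dataset and a recent dataikuapi version; the connection name and path are placeholders for your own Event server configuration.

```python
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("YOUR_PROJECT_KEY")

# Point the logs dataset at the Event server's storage; both values are placeholders.
dataset = project.get_dataset("pokemon_on_static_api_logs")
settings = dataset.get_settings()
params = settings.get_raw()["params"]
params["connection"] = "event-server-storage"
params["path"] = "/api-node-query/your-api-deployment-id"
settings.save()
```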

Note

If using Dataiku Cloud, you can access API query logs from the S3 connection customer-audit-logs within the path apinode-audit-logs.

After pointing this dataset to the correct prediction logs, we can now explore it. Each row is an actual prediction request answered by our model. You can find all the features that were sent with the request, the resulting prediction, prediction details, and other technical data.

Dataiku screenshot of the Explore tab of API node log data fetched from the Event server.
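To sanity-check the fetched logs from a notebook or recipe inside the project, you can load the dataset as a DataFrame. A small sketch using the internal dataiku package:

```python
import dataiku

# Each row of the logs dataset is one answered prediction request.
logs = dataiku.Dataset("pokemon_on_static_api_logs").get_dataframe()

print(logs.shape)
print(logs.columns.tolist())  # request features, the prediction, and technical columns
```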

Warning

Although we are showing local filesystem storage for the API node logs to make the project import easier, in a real situation, file-based cloud storage is highly recommended. This data can grow quickly, and it will not shrink unless explicitly truncated.

It would also be common to activate partitioning for this dataset.

The Evaluate recipe with API node logs as input

Another subtle difference between the Evaluate recipe in the API Flow zone and its counterpart in the batch Flow zone is the option to automatically handle the input data as API node logs.

Dataiku screenshot of an Evaluate recipe with API node log input data.

With this option activated (it is detected by default), you do not need to worry about the additional columns or their naming.

  1. Observe the Settings tab of the Evaluate recipe in the Dataiku Monitoring (API) Flow zone.

  2. Click Run to produce a model evaluation of the API node logs.
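This run can also be scripted, for example from a scenario or an external orchestrator. A hedged sketch using dataikuapi: the recipe name and model evaluation store id below are placeholders to copy from your own Flow, and method availability depends on your Dataiku version.

```python
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("YOUR_PROJECT_KEY")

# Run the Evaluate recipe on the latest API node logs (recipe name is hypothetical).
recipe = project.get_recipe("evaluate_api_node_logs")
recipe.run()  # builds the recipe outputs and waits for the job to finish

# Inspect the model evaluation store (copy its id from the store's URL).
mes = project.get_model_evaluation_store("MES_ID")
print(f"{len(mes.list_model_evaluations())} evaluation(s) in the store")
```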

Note

If using a version of Dataiku prior to 11.2, you will need to add a Prepare recipe to keep only the features and prediction columns, and rename them to match the initial training dataset convention.

Create a one-click monitoring loop

Note

This feature is not available for Dataiku Cloud users. Refer to the instructions above for setting up this feedback loop.

Now that you understand these details, be aware that, since version 12, you can simplify this process by building the entire feedback loop directly from the API endpoint in the API Designer.

  1. On the Design node, navigate to the API Designer from the More options menu of the top navigation bar.

  2. Open the pokemon API service, and click on the Monitoring panel for the guess endpoint.

  3. Click Configure to create a monitoring loop for this endpoint.

  4. Click OK, and then return to the Flow to see the new zone, which, in this case, duplicates the work of the existing Dataiku Monitoring (API) Flow zone.

Dataiku screenshot of the Monitoring panel within the API endpoint of the API Designer.

What’s next?

Having seen the monitoring setup for an API endpoint on a Dataiku API node, you might want to move on to one of the following monitoring cases, which reuse the same project. They can be completed independently of each other, in any order.