Tutorial | Deep learning for time series#

Get started#

As an alternative to traditional time series models like ARIMA, you can use deep learning for forecasting.

Objectives#

In this tutorial, you will:

  • Build a long short-term memory (LSTM) network using Keras code within Dataiku’s visual machine learning.

  • Deploy it to the Flow as a saved model and apply it to new data.

Prerequisites#

  • Dataiku 12.0 or later.

  • A Full Designer user profile.

  • A code environment with the necessary libraries. When creating a code environment, add the Visual Deep Learning package set corresponding to your hardware.

  • Some familiarity with deep learning, and in particular, Keras.

Create the project#

  1. From the Dataiku Design homepage, click + New Project > DSS tutorials > ML Practitioner > Deep Learning for Time Series.

  2. From the project homepage, click Go to Flow (or g + f).

Note

You can also download the starter project from this website and import it as a zip file.

You’ll next want to build the Flow.

  1. Click Flow Actions at the bottom right of the Flow.

  2. Click Build all.

  3. Keep the default settings and click Build.

Use case summary#

The project contains a dataset of the daily minimum temperatures recorded in Australia over the course of a decade (1981-1990). The only data preparation that has been done is to parse the dates from a string into a date format.
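For reference, that preparation step corresponds roughly to the following pandas sketch. The Date column name matches the recipe code used later in this tutorial, but the file name and the exact date format are assumptions:

    import pandas as pd

    # Hypothetical equivalent of the parsing step already done in the project:
    # turn date strings like "1981-01-01" into proper datetime values
    df = pd.read_csv("daily_minimum_temperatures.csv")  # hypothetical file name
    df["Date"] = pd.to_datetime(df["Date"], format="%Y-%m-%d")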

Tip

In the Charts tab of the temperatures_prepared dataset, you’ll find a line chart of the temperature by date. It reveals that the data is quite noisy, so our model will probably learn only the general trends.

Prepare the data#

The next step is to create windows of input values. We are going to feed the LSTM with windows of 30 temperature values, and expect it to predict the 31st.
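To make the windowing concrete, here is a toy illustration with a window size of three instead of 30 (the temperature values are only illustrative):

    # Toy example: each window of consecutive temperatures becomes one input,
    # and the value immediately after the window becomes the target
    series = [20.7, 17.9, 18.8, 14.6, 15.8]
    window_size = 3
    for i in range(len(series) - window_size):
        print(series[i:i + window_size], "->", series[i + window_size])
    # [20.7, 17.9, 18.8] -> 14.6
    # [17.9, 18.8, 14.6] -> 15.8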

Create windows with a Python recipe#

We can do this with a Python code recipe that serializes the window values in string format. The resulting dataset should have three columns:

  • The date of the target measurement.

  • A vector of 30 values of input measured temperatures.

  • The target temperature.

  1. From the Flow, select the temperatures_prepared dataset.

  2. In the Actions sidebar, select the Python recipe.

  3. Under Outputs, click + Add.

  4. Name the output dataset temperatures_window.

  5. Click Create Dataset.

  6. Click Create Recipe.

    Dataiku screenshot of the dialog for a Python recipe.
  7. Replace the starter code with the code below.

    import dataiku
    import pandas as pd, numpy as np
    from dataiku import pandasutils as pdu
    
    # Read recipe inputs
    generated_series = dataiku.Dataset("temperatures_prepared")
    df_data = generated_series.get_dataframe()
    
    steps = []
    x = []
    y = []
    
    ## Set the number of historical data points to use to predict future records
    window_size = 30
    
    ## Create windows of input values
    for i in range(len(df_data) - window_size):
        subdf = df_data.iloc[i:i + window_size + 1]
        values = subdf['Temperature'].values.tolist()
        step = subdf['Date'].values.tolist()[-1]
    
        x.append(str(values[:-1]))
        steps.append(step)
        y.append(values[-1])
    
    df_win = pd.DataFrame.from_dict({'date': steps, 'inputs': x, 'target': y})
    
    # Write recipe outputs
    series_window = dataiku.Dataset("temperatures_window")
    series_window.write_with_schema(df_win)
    
  8. Click Run (or type @ + r + u + n) to execute the recipe, and then explore the output dataset.
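If you’d like to sanity-check the output beyond eyeballing it, a short snippet like the one below (run from a notebook, for example) can confirm that every serialized window holds exactly 30 values:

    import ast
    import dataiku

    # Read the window dataset back and verify the length of each window
    df = dataiku.Dataset("temperatures_window").get_dataframe()
    lengths = df["inputs"].apply(lambda s: len(ast.literal_eval(s)))
    assert (lengths == 30).all(), "Unexpected window length"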

Create a Split recipe#

Next, we can divide the dataset into training and testing sets. For convenience, we’ll use a visual recipe.

  1. If not already open, select the temperatures_window dataset.

  2. In the Actions sidebar, select Split from the menu of visual recipes.

  3. Click + Add, name the output temperatures_train, and click Create Dataset.

  4. Click + Add again, name the second output temperatures_test, and click Create Dataset.

  5. Once you have defined both output datasets, click Create Recipe.

Dataiku screenshot of the dialog to create a Split recipe.

Configure the Split recipe#

The model will be trained on the first eight years of data and then tested on the final two years of data.

  1. On the recipe’s Settings tab, select Dispatch percentiles of sorted data on output datasets.

  2. Select date as the sort order column.

  3. Assign 80% to the temperatures_train dataset, leaving 20% for the temperatures_test dataset.

  4. Click Run to execute the recipe.

Dataiku screenshot of a Split recipe settings tab.
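For intuition, the recipe’s behavior is roughly equivalent to the following pandas sketch. This only mirrors the logic; it is not what Dataiku actually runs:

    import dataiku

    # Sort chronologically, then cut at the 80th percentile of rows
    df = dataiku.Dataset("temperatures_window").get_dataframe()
    df = df.sort_values("date")
    cutoff = int(len(df) * 0.8)
    train_df, test_df = df.iloc[:cutoff], df.iloc[cutoff:]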

Train a deep learning model#

Once we have training and testing datasets, we can proceed to create the model.

Create a deep learning task#

We start by creating a prediction task in the Lab — just like any other AutoML model.

  1. From the Flow, select the temperatures_train dataset.

  2. From the Actions tab of the right side panel, click Lab.

  3. Select AutoML Prediction from the menu of Visual ML tasks.

  4. Select target as the feature on which to create the prediction model.

  5. In the Expert section, select Deep Learning.

  6. Click Create.

Dataiku screenshot of the dialog to create a deep learning task.

Handle features with a custom processor#

After creating the task, you’ll encounter an error that no input features have been selected.

  1. Navigate to the Features handling panel.

  2. Confirm that both the date and inputs columns have been rejected as unique IDs.

We need a custom processor that deserializes the input string into a vector, and then normalizes the temperature values to be between 0 and 1.

  1. From the top navigation bar, go to the menu Code > Libraries (g + l). (If prompted, it’s safe to leave the page without saving).

  2. Click Add > Create file.

  3. Name it python/windowprocessor.py, and click Create.

  4. Copy-paste the code below.

    import numpy as np
    
    class windowProcessor:
        def __init__(self, window_size):
            self.window_size = window_size
    
        def _convert(self, x):
            # Parse each serialized window string into a row of floats
            m = np.empty((len(x), self.window_size))
            for i in range(len(x)):
                m[i, :] = np.array(eval(x[i]))
            return m
    
        def fit(self, x):
            m = self._convert(x)
            self.min_value, self.max_value = m.min(), m.max()
    
        def transform(self, x):
            m = self._convert(x)
            return (m - self.min_value) / (self.max_value - self.min_value)
    
  5. Click Save All.

Important

The windowprocessor.py file implements the following methods:

  • fit(): Computes the maximum and minimum values of the dataset.

  • transform(): Normalizes the values to be between 0 and 1.

  • _convert(): Transforms the data from an array of strings to a 2-dimensional array of floats.
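To see these methods in action, here is a small standalone check you could run in a notebook. The window size of 3 and the sample values are purely illustrative:

    from windowprocessor import windowProcessor

    # Illustrative windows serialized as strings, as in the recipe output
    proc = windowProcessor(3)
    samples = ["[10.0, 12.0, 14.0]", "[12.0, 14.0, 16.0]"]
    proc.fit(samples)               # learns min 10.0 and max 16.0
    print(proc.transform(samples))  # values rescaled into [0, 1]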

Use the custom preprocessing file#

In the model’s design, we must tell Dataiku to use the custom processor and specify that our window has 30 values.

  1. Return to the visual analysis Deep learning for target on temperatures_train.

  2. Go to the Design tab.

  3. Navigate to the Features handling panel.

  4. Turn ON the inputs feature.

  5. Select Text as the variable type.

  6. For text handling, select Custom preprocessing.

  7. Replace the starter code with the snippet below.

    from windowprocessor import windowProcessor
    
    processor = windowProcessor(30)
    
  8. Click Save.

Dataiku screenshot of the features handling panel of a deep learning model.

Important

This custom features handling creates a new input to the deep learning model called inputs_preprocessed. We’ll use that in the specification of the deep learning architecture.

Create the deep learning architecture#

Finally, we need to import the LSTM and Reshape layers and define our network architecture in the build_model() function. We’ll make no changes to the compile_model() function or the Training panel.

  1. Still in the Design tab, navigate to the Architecture panel.

  2. Replace the first line of code with the following:

    from keras.layers import Input, Dense, LSTM, Reshape
    
  3. Replace the build_model() function with the following:

    def build_model(input_shapes, n_classes=None):
        # This input will receive all the preprocessed features
        window_size = 30
    
        input_main = Input(shape=(window_size,), name="inputs_preprocessed")
    
        x = Reshape((window_size, 1))(input_main)
        x = LSTM(100, return_sequences=True)(x)
        x = LSTM(100, return_sequences=False)(x)
    
        predictions = Dense(1)(x)
    
        # The 'inputs' parameter of your model must contain the
        # full list of inputs used in the architecture
        model = Model(inputs=[input_main], outputs=predictions)
    
        return model
    
  4. Check the Runtime environment panel to ensure you have a code environment that supports visual deep learning.

  5. Click Train.

  6. Click Train again to confirm.

Important

The network has three hidden layers. The first is a Reshape layer, which converts the input from a shape of (batch_size, window_size) to (batch_size, window_size, dimension).

Since we only have one input variable at each time step, the dimension is 1. After the reshaping, we can stack two layers of LSTM. The output layer is a fully connected layer, Dense, with one output neuron. By default, its activation function is linear, which is appropriate for a regression problem.
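If you want to verify these tensor shapes for yourself, a standalone Keras sketch like this one prints the layer-by-layer summary (outside Dataiku, you also import Model yourself):

    from keras.layers import Input, Dense, LSTM, Reshape
    from keras.models import Model

    inp = Input(shape=(30,), name="inputs_preprocessed")  # (batch, 30)
    x = Reshape((30, 1))(inp)                  # (batch, 30, 1): one variable per time step
    x = LSTM(100, return_sequences=True)(x)    # (batch, 30, 100): full sequence of hidden states
    x = LSTM(100, return_sequences=False)(x)   # (batch, 100): only the final hidden state
    out = Dense(1)(x)                          # (batch, 1): linear activation for regression
    Model(inputs=[inp], outputs=out).summary()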

Deploy and apply a deep learning model#

Although this is a deep learning model, the process for using it is the same as for any other visual model.

Deploy the model#

We first deploy it from the Lab to the Flow as a saved model.

  1. From the Result tab, open the model report.

  2. Click Deploy.

  3. Click Create.

Dataiku screenshot of the dialog for deploying a model.

Apply the model to data#

Let’s now use a Score recipe to apply the model to new data.

  1. From the Flow, select the temperatures_test dataset and the saved model.

  2. From the Actions panel, select the Score recipe.

  3. Click Create Recipe.

    Dataiku screenshot of the dialog to create a Score recipe.
  4. Click Run, and then open the output dataset.
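Before charting, you can also quantify the fit with a quick notebook snippet. The output dataset name below assumes the Score recipe’s default naming:

    import dataiku

    # Hypothetical check: mean absolute error of the scored predictions
    df = dataiku.Dataset("temperatures_test_scored").get_dataframe()
    mae = (df["target"] - df["prediction"]).abs().mean()
    print(f"Mean absolute error: {mae:.2f} degrees")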

Check the results with a chart#

Lastly, we can check the results with a chart.

  1. From the temperatures_test dataset, navigate to the Charts tab.

  2. From the chart picker, select Lines.

  3. Drag target and prediction to the Y axis.

  4. Drag date to the X axis.

Dataiku screenshot of a line chart with predicted values.

Tip

The line chart shows that the model managed to pick up the general trend. It does not perfectly fit the curve because it is generalizing. The minimum temperature in a country as vast as Australia can fluctuate a lot in a pseudo-random fashion!

What’s next?#

Congratulations! You built a deep learning model on time series data with Dataiku’s visual ML tool. You then deployed it to the Flow and used it for scoring like any other visual model.

Next, you might want to build a deep learning model on structured data by following Tutorial | Deep learning within visual ML.

See also

See the reference documentation on Deep Learning to learn more.