1. Amazon Forecast and Lambda

Navigate to the Lambda service page in the AWS management console.

  1. Log into your AWS management console.
  2. In the Find Services field, type Lambda and go to its service page.

Lambdas used for Forecast automation

You may notice a number of Lambda functions listed in the AWS console. We will examine the four functions related to this lab. Work through each one below to see how it operates in the Forecast workflow.

TrainPredictorLambda

The TrainPredictorLambda is associated with a Custom Resource in the EBS Limits Forecast CloudFormation template (this template was already launched for you prior to this workshop). More specifically, in a production environment, the Lambda function automates the creation of a new Forecast Dataset group, imports the first Dataset, and then trains a new Predictor. This initial setup creates the basic framework needed to forecast your EBS usage on a weekly basis.

Due to the time constraints of this workshop, the CloudFormation template was pre-deployed in dry-run mode, so TrainPredictorLambda goes beyond provisioning the three Forecast components mentioned above and also creates a Forecast and a Forecast export job. This gives us a completed EBS limits Forecast workflow to examine without having to wait for the full run to finish (approximately 45 minutes).
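Because it backs a Custom Resource, TrainPredictorLambda also has to follow the CloudFormation custom resource contract: handle the stack's Create/Update/Delete requests and always signal a result back to CloudFormation. Below is a minimal, illustrative skeleton of that pattern, not the lab's actual code; note that the cfnresponse helper is only bundled automatically for inline-defined functions and must otherwise be shipped in the deployment package.

    # Minimal Custom Resource handler skeleton (illustrative only).
    import boto3
    import cfnresponse  # helper module for answering CloudFormation

    forecast = boto3.client("forecast")

    def lambda_handler(event, context):
        try:
            if event["RequestType"] == "Create":
                # On stack creation: create the Dataset group, import the
                # first Dataset, and train the Predictor (boto3 calls
                # sketched later in this section).
                pass
            # Update and Delete requests would be handled here as well.
            cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
        except Exception:
            # Always answer CloudFormation, or the stack hangs until timeout.
            cfnresponse.send(event, context, cfnresponse.FAILED, {})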

Let’s take a look at the manual steps that TrainPredictorLambda automates.

  1. Click on the Services dropdown at the very top of the AWS Management console

  2. Find the service named Amazon Forecast and go to its service page.
    (NOTE: If you do not see a Dataset group immediately listed, click on View dataset groups)

  3. The first Forecast component that TrainPredictorLambda automates is the creation of a new Dataset group. To do this step manually:

    • Click Create dataset group
    • Provide a unique name in the Dataset group name field
    • For Forecast domain choose Custom
    • Click Next
  4. Creating the Dataset group takes you to the next page, the Dataset. In this step, a historical data file (your past daily EBS counts) is imported into Forecast:

    • Provide a unique name in the Dataset name field
    • In the Dataset schema field, add an additional string attribute named “region” to the schema so it looks like the following:

      {
          "Attributes": [
            {
              "AttributeName": "item_id",
              "AttributeType": "string"
            },
            {
              "AttributeName": "timestamp",
              "AttributeType": "timestamp"
            },
            {
              "AttributeName": "target_value",
              "AttributeType": "float"
            },
            {
              "AttributeName": "region",
              "AttributeType": "string"
            }
          ]
      }
      
    • Click Next

    • In the Dataset import name field provide a unique name.

    • Remove the HH:mm:ss from the timestamp format, so that it only reads “yyyy-MM-dd”

    • Next, you would normally provide an S3 path to your historical, time-series .csv file, which includes your daily EBS count. However, because this import is being automated by TrainPredictorLambda, you can just click Cancel. (A boto3 sketch of these dataset steps follows this list.)
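For reference, the console steps above map onto a handful of Amazon Forecast API calls. The following is a minimal boto3 sketch of this part of what TrainPredictorLambda automates; the dataset name, S3 path, and role ARN are placeholders rather than values from the lab.

    import boto3

    forecast = boto3.client("forecast")

    # The same schema entered in the console steps above.
    SCHEMA = {
        "Attributes": [
            {"AttributeName": "item_id", "AttributeType": "string"},
            {"AttributeName": "timestamp", "AttributeType": "timestamp"},
            {"AttributeName": "target_value", "AttributeType": "float"},
            {"AttributeName": "region", "AttributeType": "string"},
        ]
    }

    # 1. Create the Dataset group (CUSTOM domain, as chosen above).
    group = forecast.create_dataset_group(
        DatasetGroupName="ebs_limits_group", Domain="CUSTOM"
    )

    # 2. Create the Dataset with a daily frequency and attach it.
    dataset = forecast.create_dataset(
        DatasetName="ebs_limits_dataset",       # placeholder name
        Domain="CUSTOM",
        DatasetType="TARGET_TIME_SERIES",
        DataFrequency="D",
        Schema=SCHEMA,
    )
    forecast.update_dataset_group(
        DatasetGroupArn=group["DatasetGroupArn"],
        DatasetArns=[dataset["DatasetArn"]],
    )

    # 3. Import the historical CSV from S3, using the yyyy-MM-dd
    #    timestamp format set in the console steps.
    forecast.create_dataset_import_job(
        DatasetImportJobName="initial_import",
        DatasetArn=dataset["DatasetArn"],
        DataSource={"S3Config": {
            "Path": "s3://YOUR_BUCKET/historical_data.csv",
            "RoleArn": "arn:aws:iam::123456789012:role/YOUR_FORECAST_ROLE",
        }},
        TimestampFormat="yyyy-MM-dd",
    )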

Up to this point we have seen how to create a Dataset group and a Dataset. Now let’s look at the Predictor, Forecast, and Forecast export that TrainPredictorLambda also automates, specifically in dry-run mode.

  1. Click on the Dataset group named ebs_limits_group.

  2. Click Predictors on the left side of the Forecast console. Here is where you could use the previously created Dataset import to train a new Predictor:

  3. Click on Train new predictor

    • Provide a unique name in the Predictor name field
    • For Algorithm selection select Manual
    • In the Algorithm dropdown, choose ARIMA
    • In the Forecast dimensions - optional dropdown, choose region
    • Click Cancel since this step is being automated by TrainPredictorLambda
  4. Click on Forecasts on the left side of the console. This is where the REAL magic happens, and you can find out what your projected EBS usage will be for the next 10 days! To perform this step manually:

    • Click Create a forecast
    • Provide a unique name in the Forecast name field.
    • In the Predictor dropdown, you would normally select the predictor you trained in the previous step. Click Cancel for now.

Finally, you can examine the final product of Forecast, which is called a Forecast export:

  1. Click on the Forecast named ebs_limits_forecast to examine its Forecast export(s):

  2. Click Create forecast export

    • Provide a unique name in the Export name field
    • In the S3 forecast export location field, you would normally provide the output S3 bucket/key to which you want to export your forecast.
    • Click Cancel for now. (A boto3 sketch of the Predictor, Forecast, and export calls follows this list.)
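Here is roughly how TrainPredictorLambda performs these three steps through boto3. The ARN, resource names, and export path below are placeholders; in the real function, each call waits for the previous resource to become ACTIVE, which is why the full run takes about 45 minutes.

    import boto3

    forecast = boto3.client("forecast")

    # Placeholder ARN -- the real value comes from the dataset group
    # created earlier.
    DATASET_GROUP_ARN = ("arn:aws:forecast:us-east-1:123456789012:"
                         "dataset-group/ebs_limits_group")

    # 1. Train a predictor with the ARIMA algorithm and a "region"
    #    forecast dimension, matching the console choices above.
    predictor = forecast.create_predictor(
        PredictorName="ebs_limits_predictor",   # placeholder name
        AlgorithmArn="arn:aws:forecast:::algorithm/ARIMA",
        ForecastHorizon=10,                     # the 10-day projection
        InputDataConfig={"DatasetGroupArn": DATASET_GROUP_ARN},
        FeaturizationConfig={
            "ForecastFrequency": "D",
            "ForecastDimensions": ["region"],
        },
    )

    # 2. Generate a forecast from the trained predictor.
    fc = forecast.create_forecast(
        ForecastName="ebs_limits_forecast",
        PredictorArn=predictor["PredictorArn"],
    )

    # 3. Export the forecast to the output S3 bucket (placeholders).
    forecast.create_forecast_export_job(
        ForecastExportJobName="ebs_limits_export",
        ForecastArn=fc["ForecastArn"],
        Destination={"S3Config": {
            "Path": "s3://YOUR_OUTPUT_BUCKET/forecasts/",
            "RoleArn": "arn:aws:iam::123456789012:role/YOUR_FORECAST_ROLE",
        }},
    )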

DailyEbsVolumeCountLambda

The DailyEbsVolumeCountLambda is invoked by a daily CloudWatch event, which we’ll configure later in this lab. The function is responsible for tallying your EBS volumes by type and region. It appends this count to the historical_data.csv file (or whatever file name you configured in the CloudFormation parameters).
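To make the tallying concrete, a simplified version of the counting logic might look like the sketch below. This is not the packaged daily_ebs_volume_count.py; the region list is a placeholder, and the real function appends its rows to the CSV in S3 rather than printing them.

    import boto3
    from collections import Counter
    from datetime import date

    def count_volumes(region):
        """Tally EBS volumes by type in one region (sketch)."""
        ec2 = boto3.client("ec2", region_name=region)
        counts = Counter()
        for page in ec2.get_paginator("describe_volumes").paginate():
            for vol in page["Volumes"]:
                counts[vol["VolumeType"]] += 1
        return counts

    def lambda_handler(event, context):
        today = date.today().isoformat()
        rows = []
        for region in ["us-east-1", "us-west-2"]:   # placeholder list
            for vol_type, count in count_volumes(region).items():
                # item_id, timestamp, target_value, region -- the same
                # columns as the dataset schema shown earlier.
                rows.append(f"{vol_type},{today},{count},{region}")
        # The real function appends these rows to historical_data.csv in S3.
        print("\n".join(rows))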

Add the Python 3.7 code to this Lambda function:

  1. Click on the Services dropdown at the very top of the AWS Management console
  2. Find the Lambda service and go to its page
  3. Click on the DailyEbsVolumeCountLambda from the list of functions
  4. Scroll down to the Function code section
  5. Find the dropdown Code entry type and select Upload a .zip file
  6. Click Upload and upload the following zip package: daily_ebs_volume_count.zip
  7. In the Handler field, change index.lambda_handler to daily_ebs_volume_count.lambda_handler. This points Lambda to the module you just uploaded
  8. At the top-left corner of the console click Save

Once you upload the zip and the code opens in the editor, you will notice the commented section at the top of the file, which gives a more detailed explanation of how the function works. Note that two additional modules ship with the zip: get_ebs_count_via_describe.py and get_ebs_count_via_ta.py.

You can import either module into daily_ebs_volume_count.py to get the daily EBS volume counts, depending on your AWS support plan. If you are on a Business or Enterprise support plan, the preferred import is:

    import get_ebs_count_via_ta as get_ebs_count

For any other support plan:

    import get_ebs_count_via_describe as get_ebs_count
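If you want one build of the function to work under either support plan, a defensive variant is to try the Trusted Advisor path and fall back at runtime. This is only a suggestion, not part of the shipped code, and get_counts below is a hypothetical shared entry point; check the module sources for the real function names.

    import botocore.exceptions

    import get_ebs_count_via_ta as ta
    import get_ebs_count_via_describe as describe

    def get_ebs_counts():
        # get_counts() is hypothetical -- substitute the real entry
        # point from the modules shipped in the zip.
        try:
            return ta.get_counts()
        except botocore.exceptions.ClientError:
            # The Trusted Advisor API raises an error without a Business
            # or Enterprise plan, so fall back to describe_volumes.
            return describe.get_counts()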

WeeklyForecastHandlerLambda

The WeeklyForecastHandlerLambda is responsible for processing historical_data.csv in Forecast on a weekly basis to get the next 10-day usage projection and drive the appropriate limit increase requests. It essentially automates the weekly Forecast Dataset imports and the subsequent Forecasts/exports to your output S3 bucket.
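Conceptually, the weekly run chains the same Forecast APIs shown earlier, waiting for each job to become ACTIVE before starting the next. The sketch below shows the API order only; the ARNs and paths are placeholders, the job names would need unique (for example, date-stamped) suffixes in practice, and a single Lambda invocation could not poll for the full job duration, so the real handler may sequence these steps differently.

    import time
    import boto3

    forecast = boto3.client("forecast")

    # Placeholders -- the real values come from the CloudFormation stack.
    DATASET_ARN = "arn:aws:forecast:us-east-1:123456789012:dataset/ebs"
    PREDICTOR_ARN = "arn:aws:forecast:us-east-1:123456789012:predictor/ebs"
    ROLE_ARN = "arn:aws:iam::123456789012:role/YOUR_FORECAST_ROLE"

    def wait_until_active(describe_call, **arn_kwarg):
        """Poll a Forecast describe_* call until the resource is ACTIVE."""
        while True:
            status = describe_call(**arn_kwarg)["Status"]
            if status == "ACTIVE":
                return
            if "FAILED" in status:
                raise RuntimeError(f"Forecast job ended in status {status}")
            time.sleep(30)

    def lambda_handler(event, context):
        # 1. Re-import the freshly appended historical_data.csv.
        job = forecast.create_dataset_import_job(
            DatasetImportJobName="weekly_import",
            DatasetArn=DATASET_ARN,
            DataSource={"S3Config": {
                "Path": "s3://YOUR_BUCKET/historical_data.csv",
                "RoleArn": ROLE_ARN,
            }},
            TimestampFormat="yyyy-MM-dd",
        )
        wait_until_active(forecast.describe_dataset_import_job,
                          DatasetImportJobArn=job["DatasetImportJobArn"])

        # 2. Generate a new forecast and export it to the output bucket.
        fc = forecast.create_forecast(ForecastName="weekly_forecast",
                                      PredictorArn=PREDICTOR_ARN)
        wait_until_active(forecast.describe_forecast,
                          ForecastArn=fc["ForecastArn"])
        forecast.create_forecast_export_job(
            ForecastExportJobName="weekly_export",
            ForecastArn=fc["ForecastArn"],
            Destination={"S3Config": {
                "Path": "s3://YOUR_OUTPUT_BUCKET/forecasts/",
                "RoleArn": ROLE_ARN,
            }},
        )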

Add the Python 3.7 code to this Lambda function:

  1. Click on the Functions breadcrumb at the top of the page
  2. Click on the WeeklyForecastHandlerLambda from the list of functions
  3. Copy the contents of weekly_forecast_handler.py, and paste them into the code editor in the AWS management console.
  4. Click Save at the top of the console.

The comments at the top of the code give a more detailed explanation of how this Lambda function fits into the lab workflow.

EbsLimitIncreaseRequestorLambda

The EbsLimitIncreaseRequestorLambda is responsible for parsing the final Forecast export sent to the output S3 bucket. In a production environment, it compares the forecast projection against the account’s current EBS limits, makes new limit increase requests accordingly, and finally generates a report to send to the SNS email subscribers. For this lab, since the solution was deployed in dry-run mode, the Lambda will NOT actually make limit increase requests; instead, it just generates the email report based on dummy data. You will see how this works later in the lab.
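As a rough illustration, the parsing-and-report portion of that flow could look like the following sketch, assuming an S3 event trigger. The limit table, topic ARN, and export column names are placeholders; the real function derives its configuration from the stack parameters.

    import csv
    import io
    import boto3

    s3 = boto3.client("s3")
    sns = boto3.client("sns")

    CURRENT_LIMITS = {"gp2": 500, "io1": 100}   # placeholder limits

    def lambda_handler(event, context):
        # The export lands in S3; the S3 event carries the bucket/key.
        record = event["Records"][0]["s3"]
        obj = s3.get_object(Bucket=record["bucket"]["name"],
                            Key=record["object"]["key"])
        body = io.StringIO(obj["Body"].read().decode("utf-8"))

        # Track the highest projected count per (volume type, region).
        peaks = {}
        for row in csv.DictReader(body):
            key = (row["item_id"], row["region"])
            value = float(row["p90"])   # column names are illustrative
            peaks[key] = max(peaks.get(key, 0.0), value)

        lines = []
        for (vol_type, region), peak in peaks.items():
            limit = CURRENT_LIMITS.get(vol_type, 0)
            if peak > limit:
                # In production this is where a limit increase request
                # would be made; in dry-run mode we only report it.
                lines.append(f"{vol_type} in {region}: projected "
                             f"{peak:.0f} exceeds limit {limit}")

        sns.publish(TopicArn="arn:aws:sns:us-east-1:123456789012:ebs-report",
                    Subject="EBS limits forecast report",
                    Message="\n".join(lines) or "No increases needed.")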

Add the Python 3.7 code to this Lambda function:

  1. Click on the Functions breadcrumb at the top of the page
  2. Click on the EbsLimitIncreaseRequestorLambda from the list of functions
  3. Copy the contents of ebs_limit_increase_requestor.py, and paste them into the code editor in the AWS management console.
  4. Click Save at the top of the console.

The comments at the top of the code give a more detailed explanation of how this Lambda function fits into the lab workflow.