Running a time forecasting model scenario


This topic describes how to run a time forecasting model scenario, which is part of Working-with-time-forecasting-models.

Before you begin

Before you can run a time forecasting model scenario, you must have performed the following tasks:

To run a time forecasting model scenario

On a time forecasting model scenario detail page, click Run forecast.

The following information about the forecast executed on each time series is displayed in the Execution summary:

  • System: The name of the system.
  • An indicator depicting the reliability of the forecast, represented by one of the following icons:

    • accuracy_good.png: The forecast is reliable.
    • accuracy_warning.png: The forecast might be inaccurate.
    • accuracy_poor.png: The forecast is not reliable.

    You can use this reliability indicator to determine the reliability of a prediction computed by a specific forecasting algorithm. It is strongly related to a performance index (the accuracy index) and to other features of the time series. It is computed by evaluating the following aspects of the forecast:

    • Accuracy index
    • Data length
    • Regime change: A regime change is a time range in which the data behaves differently from its past behavior. A typical case is a series affected by a sudden change, such as a step in the data. For example, consider a series expressed as a percentage whose overall capacity (the denominator in the percentage calculation) increases or decreases. This generates a step in the series even if the absolute usage remains constant.

      As an illustration, consider a server with a constant daily utilization of, say, 60%. If the number of CPUs is doubled, the utilization percentage halves, so the new regime is 30%. In a business driver scenario, a regime change can be the release of a new service (increasing the number of user transactions) or an enlarged customer base (a bulk addition of users accessing a service).

    The value of the Accuracy index generally determines the value of the reliability indicator. If the time series is short, or a regime change occurred in the validation set, the reliability of the forecast can be compromised. In this case, the reliability indicator can only be in a Warning or Poor state, depending on the Accuracy index.

    Note

    Unlike the Accuracy index, Data length and Regime changes do not depend on the forecasting algorithm, and rely only on input data.
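
    The evaluation described above can be sketched as a small function. This is a minimal illustration, not the product's actual algorithm: the cutoff values (`min_points`, `good_cutoff`, `warn_cutoff`) and the function name are hypothetical, chosen only to show how the three aspects combine into a Good/Warning/Poor rating.

    ```python
    def rate_reliability(accuracy_index, n_points, regime_change_in_validation,
                         min_points=30, good_cutoff=0.8, warn_cutoff=0.5):
        """Combine the three evaluated aspects into a Good/Warning/Poor rating.

        All thresholds are hypothetical placeholders for illustration.
        """
        # Start from the accuracy index, which generally drives the rating.
        if accuracy_index >= good_cutoff:
            rating = "Good"
        elif accuracy_index >= warn_cutoff:
            rating = "Warning"
        else:
            rating = "Poor"
        # A short series or a regime change in the validation set caps the
        # rating at Warning, as described above; it never improves the rating.
        if rating == "Good" and (n_points < min_points or regime_change_in_validation):
            rating = "Warning"
        return rating
    ```

    Note that data length and regime changes can only downgrade the rating; they depend on the input data alone, while the accuracy index depends on the chosen algorithm.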

  • Forecast end: The last time stamp of the forecast.
  • Value at end: The value of the forecast at Forecast end.
  • Forecasted peak value: The highest value of the forecast, which could be equal to the Value at end.
  • RMSE (Root Mean Square Error): The average expected error of the prediction. It is a reliability measure calculated on a specific forecast made using a certain forecasting algorithm (that is, for the same data, different prediction algorithms can lead to different reliability outputs).
    RMSE is computed on the data and can be seen as the expected error made by the forecast in predicting that data. This performance measure has the same unit as the data and is not biased by the length of the data.
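  The standard RMSE calculation can be sketched as follows; this is the textbook formula, not an excerpt from the product's code.

    ```python
    import math

    def rmse(actual, predicted):
        """Root Mean Square Error: the expected error of a forecast,
        expressed in the same unit as the data and not biased by the
        length of the series."""
        if len(actual) != len(predicted):
            raise ValueError("series must have the same length")
        squared_errors = [(a - p) ** 2 for a, p in zip(actual, predicted)]
        return math.sqrt(sum(squared_errors) / len(squared_errors))
    ```

  For example, `rmse([10, 12, 14], [11, 12, 13])` averages the squared errors (1, 0, 1) and takes the square root, giving roughly 0.82, in the same unit as the input data.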
  • An indicator depicting threshold violations for the forecast, represented by one of the following color-coded icons:
    • dot_green.png: The threshold is not violated in the forecast.
    • dot_orange.png: The forecast approaches the threshold.
    • dot_red.png: The threshold is violated in the forecast.
    • dot_grey.png: There are no thresholds set for this forecast.
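The color coding above can be sketched as a simple classification. This is an illustrative assumption, not the product's logic: in particular, the 90% "approach" margin (`approach_margin`) and the function name are hypothetical.

```python
def threshold_status(forecast_values, threshold=None, approach_margin=0.9):
    """Map a forecast against a threshold to a color-coded status.

    The 90% approach margin is a hypothetical cutoff for illustration.
    """
    if threshold is None:
        return "grey"    # no threshold set for this forecast
    peak = max(forecast_values)
    if peak > threshold:
        return "red"     # threshold violated in the forecast
    if peak >= approach_margin * threshold:
        return "orange"  # forecast approaches the threshold
    return "green"       # threshold not violated
```

Classifying on the forecasted peak, rather than the last value, means a violation anywhere in the forecast horizon is flagged even if the series later drops back below the threshold.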
