Managing the platform in the BMC AMI AI Manager console


This topic describes how to use the BMC AMI AI Manager console to manage your services. By using the console, you can configure features and check the deployment status of BMC AMI AI Services, large language models (LLMs), and their details in one place.


Before you begin

Before you sign in to the BMC AMI AI Manager console, you must have installed BMC AMI Services. For more information, see Installing.

You can sign in by using the administrator and z/OS PTP credentials, which are defined in the BMC-AMI-AI-Platform-Services.yml file.

For information about the credentials, contact your application developer.

To sign in to the BMC AMI AI Manager console

  1. Access the BMC AMI AI Manager console as follows:
    • If a load balancer or Application Gateway already exists, create a URL in the following format:
      https://domainName/admin, where domainName resolves to the load balancer or Application Gateway.
      For example, if the domain name is amiai.bmc.com, then the URL is https://amiai.bmc.com/admin.
    • If you installed BMC AMI AI Services on a stand-alone instance or on zCX/zLinux, create a URL in the following format: http://instanceIPaddress:8000/admin. For example, if the IP address is 192.168.1.10, then the URL is http://192.168.1.10:8000/admin.
  2. Enter your admin user name and password specified in the playbook file.
  3. Click Sign in.
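
To confirm that the console URL resolves and responds before you sign in, you can issue a plain HTTP request against it. The following Python sketch is illustrative only; the URL value and the use of the requests library are assumptions, not part of the product.

  # Minimal reachability check for the BMC AMI AI Manager console URL.
  # Adjust the URL to match your deployment (load balancer domain or
  # stand-alone instance address). Illustrative sketch only.
  import requests

  console_url = "https://amiai.bmc.com/admin"  # or "http://192.168.1.10:8000/admin"

  try:
      response = requests.get(console_url, timeout=10)
      print(f"Console responded with HTTP {response.status_code}")
  except requests.exceptions.RequestException as exc:
      print(f"Console is not reachable: {exc}")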

To select and provision the integration settings

  1. In a browser, sign in to the BMC AMI AI Manager console with your credentials.
  2. From the menu in the left pane, select AI Services Settings > Integrations Settings.
    The Integrations Settings window lists all the integrations available in the deployed BMC AMI AI Services.
  3. From the list in the integration card, select your preferred LLM. 
  4. Click Provision.
    After successful provisioning, an Integration provisioned successfully message is displayed, and the Provision button on the card changes to an Update button.
  5. To change the LLM to another recommended LLM in the list, click Update.

If this procedure fails, the user interface displays an error message with the reason for the failure. For more information, see Troubleshooting.

To view the available LLMs

  1. Sign in to the BMC AMI AI Manager console with your credentials.
  2. From the menu in the left pane, select AI Services Settings > LLM Settings.
    The LLM Settings window lists the LLMs available in the deployed BMC AMI AI Services.

To add a local model

  1. In the upper-right corner of the LLM Settings window, click Add local model.
  2. In the Add local large language model dialog box, select your model type.
  3. Select either a public API or a self-hosted model type, and then follow the steps for your selection:

    For a public API model type:

    1. Select the hosted platform. 
    2. Enter an OpenAI API access key.
    3. Click Test connection.
    4. If a success confirmation message appears, click Next>>. (If a failure message appears, verify that the provided API access key is valid.)

    For a self-hosted model type:

    1. Complete the following fields:

      Base URL
        protocol://hostName:port/v1, where protocol is http or https.
        The placeholders in the URL are defined as follows:
        • hostName—The IP address or host name used to access the on-premises LLM server.
        • port—The port on which the LLM server is exposed.
        • v1—The version of the OpenAI API specification.

      API access key
        (Optional) Key to access the LLM server API, if a key is configured on the LLM server.

    2. Click Test connection.
    3. If a success confirmation message appears, click Next>>. (If a failure message appears, verify that the provided details are valid. For an illustrative way to test the endpoint outside the console, see the sketch after this procedure.)

  4. On the Step 2: Add details and save tab, complete the following fields with the on-premises LLM server details:

    Maximum number of tokens
      Non-zero positive integer representing the maximum number of tokens configured for the model on the on-premises LLM server.

    Model
      None

    Version
      None

    Supported integrations
      None

    Display name
      (Optional) Name under which the model is displayed in the BMC AMI AI Manager console.

    Description
      Description of the LLM.

  5. Click Save.

A success confirmation message is displayed.

If this procedure fails, the user interface displays an error message with the reason for the failure. For more information, see Troubleshooting.
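
The console's Test connection button verifies connectivity for you. If you want to check an OpenAI-compatible endpoint outside the console, for example while troubleshooting a failed connection, the following Python sketch approximates such a test. The base URL, access key, and use of the requests library are illustrative assumptions; /v1/models is a standard endpoint of the OpenAI API specification.

  # Illustrative connectivity test for an OpenAI-compatible LLM server.
  # The base URL and access key below are placeholders; replace them with
  # your deployment's values. This is not the console's internal mechanism.
  import requests

  base_url = "http://llm-host.example.com:8080/v1"  # protocol://hostName:port/v1
  api_key = "my-access-key"                         # empty string if no key is configured

  headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}

  try:
      response = requests.get(f"{base_url}/models", headers=headers, timeout=10)
      response.raise_for_status()
      model_ids = [model.get("id") for model in response.json().get("data", [])]
      print(f"Connection OK; models available: {model_ids}")
  except requests.exceptions.RequestException as exc:
      print(f"Connection failed: {exc}")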

To download the AI Services log files

You can download log files for active services only.

  1. Sign in to the BMC AMI AI Manager console with your credentials.
  2. From the menu in the left pane, select AI Services Settings > AI Services Log Files.
  3. Select a service in the list to download its log file. Each log file contains 24 hours of log records.
  4. In the Log file date range dialog box, enter the date and click Download.

Log files are generated in the UTC time zone.
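
Because the log files use UTC, a day boundary in your local time zone might fall on a different UTC date. The following Python sketch, illustrative only, shows how to determine the UTC date that corresponds to your local time before you select a date in the dialog box.

  # Determine the UTC date that corresponds to the current local time.
  # Illustrative only; the console handles the actual date selection.
  from datetime import datetime, timezone

  local_now = datetime.now().astimezone()               # local time, tz-aware
  utc_date = local_now.astimezone(timezone.utc).date()  # matching UTC date

  print(f"Local time: {local_now.isoformat()}")
  print(f"UTC date to select: {utc_date}")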

To view the status of BMC AMI AI Services

  1. Sign in to the BMC AMI AI Manager console with your credentials.
  2. From the menu in the left pane, click AI Service Health.
  3. Click a service category name to view information about its status.

The AI Service Health window is displayed. It shows the health status of each service category.

The following table describes the service categories: 

BMC AMI AI Platform Services
  Status of the discovery service, the gateway service, and the security service

BMC AMI AI Integration Services
  Status of the available integration services

Large Language Models
  Status of the deployed LLMs

Each service category displays a summary of its services.

A green check mark confirms that all the services in that category are up and running.

To view the details of a service, expand its menu. The details include the following:

  • Name of the service
  • Status of the service (Active/Inactive)
  • Host name of the container running the service
  • Port number of the service

If a service is inactive, the service category status icon displays the number of services that are inactive:
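
If you want to track service availability from a script rather than from the console, you could poll each service's endpoint. The endpoint paths, ports, and response handling in the following Python sketch are hypothetical assumptions for illustration; consult your deployment for the endpoints that your services actually expose.

  # Hypothetical availability poll. The /health paths and ports below are
  # assumptions for illustration, not a documented BMC AMI AI Services API.
  import requests

  services = {
      "discovery service": "http://192.168.1.10:8000/health",  # hypothetical
      "gateway service": "http://192.168.1.10:8001/health",    # hypothetical
  }

  for name, url in services.items():
      try:
          status = "Active" if requests.get(url, timeout=5).ok else "Inactive"
      except requests.exceptions.RequestException:
          status = "Inactive"
      print(f"{name}: {status}")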