Managing the platform in the BMC AMI AI Manager console
Before you begin
Before you sign in to the BMC AMI AI Manager console, you must have installed BMC AMI AI Services. For more information, see Installing.
You can sign in by using the administrator and z/OS PTP credentials, which are defined in the BMC-AMI-AI-Platform-Services.yml file.
For information about the credentials, contact your application developer.
To sign in to the BMC AMI AI Manager console
- Access the BMC AMI AI Manager console as follows:
  - If a load balancer or Application Gateway already exists, create a URL in the following format: https://domainName/admin, where domainName directs to the load balancer or Application Gateway. For example, if the domain name is amiai.bmc.com, then the URL is https://amiai.bmc.com/admin.
  - If you installed BMC AMI AI Services on a stand-alone instance or zCX/zLinux, create a URL in the following format: http://instanceIPaddress:8000/admin. For example, if the IP address is 192.168.1.10, then the URL is http://192.168.1.10:8000/admin.
- Enter your admin user name and password specified in the playbook file.
- Click Sign in.
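The two URL forms above can also be expressed as a small helper. The following Python sketch is illustrative only, is not part of BMC AMI AI Services, and reuses the example hosts from this procedure:

```python
# Illustrative helper reproducing the two console URL forms described above;
# it is not part of BMC AMI AI Services. Hosts are the examples from this procedure.
def console_url(host: str, behind_gateway: bool) -> str:
    """Build the BMC AMI AI Manager console URL for a deployment type."""
    if behind_gateway:
        return f"https://{host}/admin"      # load balancer or Application Gateway
    return f"http://{host}:8000/admin"      # stand-alone instance or zCX/zLinux

print(console_url("amiai.bmc.com", True))    # https://amiai.bmc.com/admin
print(console_url("192.168.1.10", False))    # http://192.168.1.10:8000/admin
```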
To select and prepare the integration settings
- In a browser, sign in to BMC AMI AI Manager with your credentials.
- From the menu in the left pane, select AI Services Settings > Integrations Settings.
The Integrations Settings window lists all the integrations available in the deployed BMC AMI AI Services.
- From the list in the integration card, select your preferred LLM.
- Click Provision.
After successful provisioning, an Integration provisioned successfully message is displayed, and the Provision button on the card becomes an Update button.
- Click Update to change the LLM to any other recommended LLM in the list.
To export and save the integration's connection details, click Download Connection Details.
The Download Connection Details link is displayed only when CES is not required for the integration. Clicking this link downloads a file containing the following connection details:
Field | Description
---|---
integration_name | Name of the integration
integration_path | API path used to connect to the integration
integration_key | Integration key used for authentication
integration_id | Unique ID of the integration
If this procedure fails, the user interface displays an error message with the reason for the failure. For more information, see Troubleshooting.
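The exact format of the downloaded file is not specified in this procedure. Assuming it is a JSON file carrying the four fields above, a client script could consume it as in the following sketch; the file name, gateway host, and use of integration_key are illustrative assumptions:

```python
# Hypothetical consumer of the downloaded connection details. Assumes a
# JSON file with the four fields listed above; the file name, gateway host,
# and use of integration_key are illustrative assumptions.
import json

with open("connection_details.json") as f:   # hypothetical file name
    details = json.load(f)

gateway = "https://amiai.bmc.com"            # domain from the sign-in example
endpoint = f"{gateway}{details['integration_path']}"

print("Integration:", details["integration_name"], "-", details["integration_id"])
print("Endpoint:", endpoint)
# details["integration_key"] would be presented as the authentication
# credential in whatever scheme the integration expects.
```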
To add new integrations
- On the upper left of the Integrations Settings window, click the add integration icon.
- To create the integration, complete the following fields (an illustrative validation sketch follows this procedure):
Field | Description
---|---
Integration name | Integration name not exceeding 46 characters. Numbers, spaces, and hyphens are allowed.
Product family | Product family not exceeding 50 characters. Numbers, spaces, and hyphens are allowed.
Connected product | Connected product name not exceeding 58 characters. Numbers, spaces, and hyphens are allowed.
Description | Description not exceeding 255 characters. Numbers, spaces, and the special characters . , ' " are allowed.
Supported LLM | List of LLMs that are available for the integration
- Click Save.
A success confirmation message is displayed.
If this procedure fails, the user interface displays an error message with the reason for the failure. For more information, see Troubleshooting.
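As a concrete illustration of the limits in the field table above, the following Python sketch performs equivalent client-side checks. Whether letters are allowed in addition to the listed characters is an assumption; the console enforces its own validation rules.

```python
# Illustrative client-side checks that mirror the field limits in the table
# above. Whether letters are allowed in addition to the listed characters
# is an assumption; the console enforces its own rules.
import re

LIMITS = {
    "integration_name":  (46,  r"^[A-Za-z0-9 -]+$"),
    "product_family":    (50,  r"^[A-Za-z0-9 -]+$"),
    "connected_product": (58,  r"^[A-Za-z0-9 -]+$"),
    "description":       (255, "^[A-Za-z0-9 .,'\"-]+$"),
}

def check(field: str, value: str) -> bool:
    """Return True if value satisfies the assumed length and character rules."""
    max_len, pattern = LIMITS[field]
    return len(value) <= max_len and re.fullmatch(pattern, value) is not None

print(check("integration_name", "Code Insights - Db2"))   # True
print(check("integration_name", "x" * 47))                # False: too long
```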
To view the available LLMs
- Sign in to BMC AMI AI Management Console User Interface with your credentials.
- From the menu in the left pane, select AI Services Settings > LLM Settings.
To add a local model
- On the upper left of the LLM Settings window, click Add local model.
- In the Add local large language model dialog box, select your model type.
Select the public API or the self-hosted model type.
For the public API model type:
- Select the hosted platform.
- Enter an OpenAI API access key.
- Click Test connection.
- If a success confirmation message is displayed, click Next>>. (If a failure message is displayed, verify that the provided API access key is valid.)
For the self-hosted model type:
- Complete the following fields:
Field | Description
---|---
Base URL | URL in the format protocol://hostName:port/v1, where protocol is http or https
API access key | (Optional) Key to access the LLM server API if configured in the LLM server

The placeholders in the Base URL are defined as follows:
- hostName—IP address or host name of the machine used to access the on-premises LLM server
- port—Port where the LLM server is exposed
- v1—Version of the OpenAI API specification
- Click Test connection.
- If a success confirmation message is displayed, click Next>>. (If a failure message is displayed, verify that the provided details are valid.)
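If Test connection fails for a self-hosted model, you can verify the Base URL and key outside the console. The following Python sketch queries the standard /v1/models endpoint that OpenAI-compatible servers expose; the host, port, and key values are hypothetical, and this is not the console's own validation code:

```python
# Rough equivalent of the console's Test connection for a self-hosted model:
# query the standard /v1/models endpoint of an OpenAI-compatible server.
# Host, port, and key are hypothetical; this is not BMC's validation code.
import requests

base_url = "http://192.168.1.20:8080/v1"   # Base URL as entered in the dialog box
api_key = "my-optional-key"                # leave empty if the server has no key

headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
response = requests.get(f"{base_url}/models", headers=headers, timeout=10)

if response.ok:
    models = [m["id"] for m in response.json().get("data", [])]
    print("Connection OK; models exposed by the server:", models)
else:
    print(f"Connection failed: HTTP {response.status_code}")
```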
On the Step 2: Add details and save tab, complete the following fields with the on-premises LLM server details:
Field | Description
---|---
Maximum number of tokens | Non-zero positive integer representing the maximum number of tokens configured for the model on the on-premises LLM server
Model | Model selected from the Model details list
Version | None
Supported integrations | None
Display name | (Optional) Name under which the model is displayed in BMC AMI AI Management Console
Description | Description of the LLM
- Click Save.
A success confirmation message is displayed.
If this procedure fails, the user interface displays an error message with the reason for the failure. For more information, see Troubleshooting.
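To show how the saved settings map onto requests against the model, the following sketch sends a chat completion to an OpenAI-compatible server. The base URL, model name, and prompt are hypothetical; the max_tokens value must stay within the Maximum number of tokens configured above.

```python
# Sketch of how the saved settings map onto a request: the Model value is
# sent as "model", and "max_tokens" must not exceed the Maximum number of
# tokens configured above. All values here are hypothetical.
import requests

base_url = "http://192.168.1.20:8080/v1"
payload = {
    "model": "llama-3-8b-instruct",   # hypothetical entry from the Model details list
    "max_tokens": 512,                # within the configured maximum
    "messages": [{"role": "user", "content": "Hello"}],
}
response = requests.post(f"{base_url}/chat/completions", json=payload, timeout=60)
print(response.json()["choices"][0]["message"]["content"])
```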
To download the AI Services log files
You can download log files for active services only.
- Sign in to BMC AMI AI Management Console User Interface with your credentials.
- From the menu in the left pane, select AI Services Settings > AI Services Log Files.
- Select a service in the list to download its log file. Each log file contains 24 hours of log records.
- In the Log file date range dialog box, enter the date and click Download.
Log files are generated in the UTC time zone.
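Because the files are kept in UTC, the date you enter may not align with your local day. The following sketch illustrates the offset; the time zone used is only an example.

```python
# Log files are kept in UTC, so a local calendar day can span two UTC days.
# The offset below (UTC-4) is only an example.
from datetime import datetime, timezone, timedelta

local_tz = timezone(timedelta(hours=-4))
local_midnight = datetime(2024, 5, 1, 0, 0, tzinfo=local_tz)
print(local_midnight.astimezone(timezone.utc))  # 2024-05-01 04:00:00+00:00
```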
To view the status of BMC AMI AI Services
- Sign in to BMC AMI AI Management Console User Interface with your credentials.
- From the menu in the left pane, click AI Service Health.
The AI Service Health window is displayed. It shows the health status of each service category.
- Click a service category name to view information about its status.
The following table describes the service categories:
Service category | Description
---|---
BMC AMI AI Platform Services | Status of the discovery service, gateway service, and security service
BMC AMI AI Integration Services | Status of the available integration services
Large Language Models | Status of the deployed LLMs
Each service category displays a summary of its services.
A green check mark confirms that all the services in that category are up and running.
To view the following service details, expand a service menu:
- Name of the service
- Status of the service (Active/Inactive)
- Host name of the container running the service
- Port number of the service
If a service is inactive, the service category status icon displays the number of inactive services.