Managing the platform in the BMC AMI AI Manager console
Before you begin
Before you sign in to the BMC AMI AI Manager console, you must have installed BMC AMI AI Services. For more information, see Installing.
You can sign in by using the following credentials, which are defined in the BMC-AMI-AI-Platform-Services.yml file:
- Administrator credentials
- z/OS FTP credentials
For information about the credentials, contact your application developer.
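If you need to confirm which administrator account the playbook defines, you can inspect the playbook file directly. The following is a minimal sketch in Python; the key names admin_user and admin_password are assumptions and may not match the actual structure of your BMC-AMI-AI-Platform-Services.yml file.

```python
# Minimal sketch: inspect the administrator credentials defined in the
# BMC-AMI-AI-Platform-Services.yml playbook file.
# NOTE: the key names below (admin_user, admin_password) are assumptions;
# check your playbook for the actual field names.
import yaml  # requires the PyYAML package

with open("BMC-AMI-AI-Platform-Services.yml", "r") as f:
    playbook = yaml.safe_load(f)

# Print the top-level keys so you can locate the credential entries.
print("Playbook keys:", list(playbook.keys()))
print("Admin user:", playbook.get("admin_user", "<not found>"))
```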
To sign in to the BMC AMI AI Manager console
- Log in to the BMC AMI AI Manager console URL as follows (a connectivity check example follows this procedure):
  - If a load balancer or Application Gateway already exists, create a URL in the following format: https://domainName/admin, where domainName directs to the load balancer or Application Gateway. For example, if the domain name is amiai.bmc.com, then the URL is https://amiai.bmc.com/admin.
  - If you installed BMC AMI AI Services on a stand-alone instance or zCX/zLinux, create a URL in the following format: http://instanceIPaddress:8000/admin. For example, if the IP is 192.168.1.10, then the URL is http://192.168.1.10:8000/admin.
- Enter your admin user name and password specified in the playbook file.
- Click Sign in.
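Before you share the console URL with other administrators, you might want to confirm that it responds. The following is a minimal sketch using Python's standard library; the URL shown is the stand-alone example from the steps above and should be replaced with the URL you constructed for your environment.

```python
# Minimal sketch: verify that the BMC AMI AI Manager console URL responds.
# Replace the URL with the one you constructed for your environment
# (load balancer / Application Gateway or stand-alone instance).
import urllib.request

console_url = "http://192.168.1.10:8000/admin"  # example stand-alone URL

try:
    with urllib.request.urlopen(console_url, timeout=10) as response:
        print(f"Console reachable, HTTP status {response.status}")
except Exception as exc:
    print(f"Console not reachable: {exc}")
```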
To select and prepare the BMC feature integration
- In a browser, sign in to BMC AMI AI Manager with your credentials.
- From the menu in the left pane, select AI Services Settings > Feature Settings.
The Feature Settings window lists all the feature integrations available in the deployed BMC AMI AI Services.
- From the list in the feature card, select your preferred LLM.
- Click Provision.
After successful provisioning, a Feature provisioned successfully message is displayed, and the Provision button on the card becomes an Update button.
- Click Update to change the LLM to any other recommended LLM in the list.
If this procedure fails, the user interface displays an error message with the reason for the failure. For more information, see Troubleshooting.
To view the available LLMs
- Sign in to the BMC AMI AI Manager console with your credentials.
- From the menu in the left pane, select AI Services Settings > LLM Settings.
The Large language model (LLM) settings window is displayed. It lists all the LLMs available in the deployed BMC AMI AI Services.
To download the AI Services log files
You can download log files for active services only.
- Sign in to the BMC AMI AI Manager console with your credentials.
- From the menu in the left pane, select AI Services Settings > AI Services Log Files.
- Select a service in the list to download its log file. Each log file contains 24 hours of log records.
- In the Log file date range dialog box, enter the date and click Download.
Log files are generated in the UTC time zone.
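Because log files are generated in UTC, the date you enter in the dialog box can differ from your local calendar date. The following minimal sketch prints the current UTC date alongside your local date so you can pick the correct day:

```python
# Minimal sketch: determine the current date in UTC, since AI Services
# log files are generated in the UTC time zone and cover 24-hour periods.
from datetime import datetime, timezone

utc_now = datetime.now(timezone.utc)
print("Current UTC date:", utc_now.date().isoformat())
print("Local date:      ", datetime.now().astimezone().date().isoformat())
```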
To view the status of BMC AMI AI Services
- Sign in to the BMC AMI AI Manager console with your credentials.
- From the menu in the left pane, click AI Service Health.
The AI Service Health window is displayed. It shows the health status of each service category.
- Click a service category name to view information about its status.
The following table describes the service categories:
| Service category | Description |
|---|---|
| BMC AMI AI Platform Services | Status of the discovery service, gateway service, and security service |
| BMC AMI AI Integration Services | Status of the available integration services |
| Large Language Models | Status of the deployed LLMs |
Each service category displays a summary of its services.
A green check mark confirms that all the services in that category are up and running.
You can expand a service category to see the details of each service, which include the following:
- Name of the service
- Status of the service (Active/Inactive)
- Host name of the container running the service
- Port number of the service
If a service is inactive, the service category status icon displays the number of services that are down.
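If a service is reported as inactive, a basic connectivity check against the host name and port shown in the service details can help narrow down the cause. The following is a minimal sketch; the host name and port are placeholders that you should replace with the values from the AI Service Health window.

```python
# Minimal sketch: check whether a service's host and port (as shown in the
# expanded service details) accept TCP connections. The host and port below
# are placeholders; substitute the values from the AI Service Health window.
import socket

host = "ai-platform-container"  # placeholder host name
port = 8000                     # placeholder port number

try:
    with socket.create_connection((host, port), timeout=5):
        print(f"{host}:{port} is accepting connections")
except OSError as exc:
    print(f"{host}:{port} is not reachable: {exc}")
```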