BMC AMI Assistant chat settings


This section describes how to configure BMC AMI Assistant by specifying the Large Language Model (LLM) used to generate chat responses. Once configured, the selected model is used across all BMC AMI Assistant conversations.

Supported models

| Inference         | Supported model                         |
|-------------------|-----------------------------------------|
| BMC-provided LLM  | Meta-Llama-3.1-8B-instruct 4K Quantized |
| OpenAI            | gpt-4o, gpt-4o mini                     |
| OpenAI compatible | Meta-Llama-3.1-8B-instruct              |
Important

We support the following inference engines for BYOLLM (bring your own LLM):

  • OpenAI
  • OpenAI-compatible inference engines, such as Azure AI Foundry and AWS Bedrock (see the sketch after this note)
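To make "OpenAI compatible" concrete, the following sketch points the standard OpenAI Python client at a compatible endpoint by overriding base_url. The endpoint URL, API key, and deployed model name are placeholders, not values supplied by BMC AMI Platform; BMC AMI Assistant itself is configured through the UI, not through client code like this.

```python
# Minimal sketch of a chat call against a hypothetical OpenAI-compatible
# endpoint. The URL, key, and model name below are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-inference-host/v1",  # placeholder: your OpenAI-compatible endpoint
    api_key="YOUR_API_KEY",                     # placeholder: credential for that endpoint
)

response = client.chat.completions.create(
    model="Meta-Llama-3.1-8B-instruct",  # model name as deployed on your inference engine
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

Azure AI Foundry and AWS Bedrock each expose their own endpoint URL and credential scheme; the pattern of swapping base_url while keeping the OpenAI request shape is what makes an engine "OpenAI compatible."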
Important

If no LLM is selected, the BMC-provided Llama 3.1 model is automatically assigned as the LLM for BMC AMI Assistant. If you have already configured an LLM, it remains in place and is not replaced.

To select the LLM for BMC AMI Assistant chat

  1. Sign in to BMC AMI Platform using your credentials. 
  2. Click Platform manager.
  3. From the menu in the left pane, click BMC AMI AI Manager > BMC AMI Assistant chat settings.
  4. From the BMC AMI Assistant chat LLM drop-down, select the required LLM, and then click Save. Only the LLMs configured for BMC AMI Assistant chat settings are listed.
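If a BYOLLM endpoint does not behave as expected after you save, a quick smoke test is to list the models the endpoint exposes. This sketch reuses the same placeholder endpoint and key as above; it is a generic OpenAI-compatible check, not a BMC AMI Platform command.

```python
# Hypothetical connectivity check against an OpenAI-compatible endpoint;
# the URL and key are assumptions, not part of the BMC AMI Platform UI.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-inference-host/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                     # placeholder key
)

# GET /v1/models: lists the model IDs the server accepts in requests.
for model in client.models.list():
    print(model.id)
```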

