BMC AMI Assistant chat settings
This section describes how to configure BMC AMI Assistant to specify the Large Language Model (LLM) used for generating chat responses. Once configured, the selected model is used across all BMC AMI Assistant conversations.
Supported models
| Inference | Supported model |
|---|---|
| BMC provided LLM | Meta-Llama-3.1-8B-instruct 4K Quantized |
| Open AI | |
| Open AI Compatible | Meta-Llama3.1-8B-instruct |
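For the Open AI Compatible inference option, requests to the model follow the standard OpenAI-style chat completions format. The sketch below builds such a request payload for the Meta-Llama3.1-8B-instruct model listed above; the endpoint URL, API key, and `max_tokens` value are hypothetical placeholders, not values documented by BMC.

```python
import json

# Hypothetical values; substitute your own deployment's endpoint and key.
BASE_URL = "https://your-inference-host/v1/chat/completions"  # assumption
API_KEY = "YOUR_API_KEY"  # assumption


def build_chat_request(prompt: str,
                       model: str = "Meta-Llama3.1-8B-instruct") -> dict:
    """Build an OpenAI-compatible chat completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,  # assumption; tune for your deployment
    }


payload = build_chat_request("Explain this abend code.")
print(json.dumps(payload, indent=2))
```

The same payload shape works with any OpenAI-compatible inference server; only the base URL and authentication header differ between deployments.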
To select the LLM for BMC AMI Assistant chat
- Sign in to BMC AMI Platform using your credentials.
- Click Platform manager.
- From the menu in the left pane, click BMC AMI AI Manager > BMC AMI Assistant chat settings.
- From the BMC AMI Assistant chat LLM drop-down list, select the required LLM, and then click Save. Only the LLMs that are configured for BMC AMI Assistant chat settings are displayed.