Bringing your own LLM


This topic describes how to use Bring Your Own LLM (BYOLLM) to integrate a large language model (LLM) hosted in your environment into BMC AMI Platform, whether self-hosted via vLLM or provided through the OpenAI service; these are currently the only two supported options. You can map any integration to your added LLM and use it within BMC AMI Platform.

This feature enhances flexibility and customization. For more information, see Managing the Platform in the BMC AMI AI Manager Console.

BYOLLM process

The BYOLLM process consists of adding your LLM to the system via the BMC AMI Manager interface, and then integrating it into the workflow.

The following architecture diagram illustrates how the Bring Your Own LLM process works:

[Architecture diagram: Bring Your Own LLM process]

How BYOLLM works

When you connect to our services, the request is routed through the BMC AMI AI Adapter. The BMC AMI AI Adapter serves as middleware that handles communication between you and the LLM. 

LLM type detection

The BMC AMI AI Adapter detects whether the added LLM is BMC-provided, a public API, or self-hosted. This is an important step because it determines how the system connects to the LLM. If the LLM is BMC-provided, the system connects to the BMC-provided LLM. If the LLM is a public API, the system uses the public API to access it. If the LLM is self-hosted, the system establishes a direct connection to the self-hosted instance.
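The detection step can be sketched as follows. The configuration fields and provider labels below are illustrative assumptions for the sketch, not the adapter's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMConfig:
    name: str
    provider: str                   # "bmc", "openai", or "vllm" (illustrative labels)
    endpoint: Optional[str] = None  # base URL, used for self-hosted instances

def detect_llm_type(cfg: LLMConfig) -> str:
    """Classify an added LLM so the adapter knows how to connect to it."""
    if cfg.provider == "bmc":
        return "bmc-provided"   # route to the BMC-provided LLM
    if cfg.provider == "openai":
        return "public-api"     # access through the OpenAI public API
    if cfg.provider == "vllm":
        return "self-hosted"    # connect directly to the self-hosted instance
    raise ValueError(f"Unsupported provider: {cfg.provider!r}")
```

Each branch corresponds to one of the connection paths described above.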

Important

  • For self-hosted setups, BMC AMI Platform supports vLLM versions 0.5.5 through 0.6.4.post1. 
  • For public APIs, BMC AMI Platform supports OpenAI.
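As a quick sanity check before registering a self-hosted LLM, you could verify that your vLLM version falls within the supported range. The helper below is a sketch, not part of BMC AMI Platform; it handles the `.postN` suffix that appears in 0.6.4.post1:

```python
import re

def _key(version: str):
    """Turn a version like '0.6.4.post1' into a comparable tuple."""
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)(?:\.post(\d+))?", version)
    if not m:
        raise ValueError(f"Unrecognized version: {version}")
    major, minor, patch, post = m.groups()
    return (int(major), int(minor), int(patch), int(post or 0))

# Supported range per the note above
SUPPORTED_MIN = _key("0.5.5")
SUPPORTED_MAX = _key("0.6.4.post1")

def is_supported_vllm(version: str) -> bool:
    """Check whether a vLLM version is inside the supported range."""
    return SUPPORTED_MIN <= _key(version) <= SUPPORTED_MAX
```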

Query run and response

After the connection is established, the BMC AMI AI Adapter passes your query to the LLM. The LLM processes the query based on the model's capabilities and the input provided. After the query runs, the results are returned in the appropriate format.
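For vLLM and OpenAI targets, the query typically travels as an OpenAI-compatible chat completion request. The following is a minimal sketch of such a payload; the model name and the comment's endpoint path are illustrative, not BMC-specific:

```python
import json

def build_chat_request(model: str, query: str) -> dict:
    """Build an OpenAI-compatible chat completion payload (illustrative)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }

payload = build_chat_request("my-self-hosted-model", "Summarize this COBOL program.")
# The adapter would POST this JSON to the target's /v1/chat/completions endpoint
print(json.dumps(payload, indent=2))
```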

Important

You cannot currently change the default prompt, so the connected model always runs with the default prompt provided for the functionality. 

Key benefits of BYOLLM

  • Customizability—You can integrate your own LLM to tailor the service to your specific needs, ensuring better results based on your data and requirements. 
  • Scalability—Whether using a public API or a self-hosted solution, BYOLLM offers the flexibility to scale based on your resources and usage requirements. 
  • Seamless integration—The process is straightforward, so you can quickly set up and begin using your LLM within our platform with minimal configuration.

This two-step BYOLLM process ensures that you have full control over the LLM that you use while benefiting from our platform's powerful capabilities.


 


BMC AMI Platform 1.6