Bringing your own LLM
BYOLLM process
The BYOLLM process consists of two steps: adding your LLM to the system through the BMC AMI Manager interface, and then integrating it into the workflow.
The following architecture diagram illustrates how the Bring Your Own LLM process works:

How BYOLLM works
When you connect to our services, the request is routed through the BMC AMI AI Adapter, which serves as middleware that handles communication between your environment and the LLM.
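The middleware role described above can be sketched as follows. This is an illustrative model only: the `Adapter` class, its `handle` method, and the `EchoLLM` stand-in are assumed names for the sketch, not the actual BMC AMI AI Adapter API.

```python
class Adapter:
    """Illustrative middleware: sits between the client and the LLM."""

    def __init__(self, llm):
        # `llm` is any object exposing a complete(prompt) method.
        self.llm = llm

    def handle(self, request: dict) -> dict:
        # The adapter mediates the exchange: it receives the client
        # request, forwards the prompt to the configured LLM, and
        # wraps the reply in a uniform envelope.
        reply = self.llm.complete(request["prompt"])
        return {"status": "ok", "response": reply}


class EchoLLM:
    """Stand-in LLM used only to show the request flow end to end."""

    def complete(self, prompt: str) -> str:
        return prompt.upper()


adapter = Adapter(EchoLLM())
out = adapter.handle({"prompt": "hi"})
```

The point of the sketch is that the client never talks to the LLM directly; swapping `EchoLLM` for a different backend changes nothing on the client side.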
LLM type detection
The BMC AMI AI Adapter detects whether the added LLM is a BMC-provided, public API, or self-hosted LLM. This step is important because it determines how the system connects to the LLM. If the LLM is BMC-provided, the system connects to the BMC-provided LLM. If the LLM is a public API, the system uses that public API to access it. If the LLM is self-hosted, the system establishes a direct connection to the self-hosted instance.
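The three-way routing decision above can be sketched in a few lines. Everything here is an assumption for illustration: `LLMConfig`, `resolve_endpoint`, the type labels, and the placeholder BMC endpoint are invented names, not the adapter's real configuration schema.

```python
from dataclasses import dataclass


@dataclass
class LLMConfig:
    name: str
    llm_type: str       # assumed labels: "bmc", "public_api", or "self_hosted"
    endpoint: str = ""  # required for public_api and self_hosted models
    api_key: str = ""   # required for public_api models

# Placeholder for the BMC-provided service address (not a real URL).
BMC_PROVIDED_ENDPOINT = "https://example.invalid/bmc-llm"


def resolve_endpoint(config: LLMConfig) -> str:
    """Pick the connection target based on the detected LLM type."""
    if config.llm_type == "bmc":
        # BMC-provided model: connect to BMC's own service.
        return BMC_PROVIDED_ENDPOINT
    if config.llm_type == "public_api":
        # Public API model: route through the vendor's public endpoint.
        if not config.api_key:
            raise ValueError("public API models require an API key")
        return config.endpoint
    if config.llm_type == "self_hosted":
        # Self-hosted model: connect directly to the customer's instance.
        return config.endpoint
    raise ValueError(f"unknown LLM type: {config.llm_type}")
```

For example, a self-hosted configuration resolves straight to its own endpoint, while a BMC-provided one ignores the endpoint field entirely.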
Query run and response
After the connection is established, the BMC AMI AI Adapter passes your query to the LLM. The LLM processes the query based on the model's capabilities and the input provided. After the query runs, the results are returned in the appropriate format.
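The query-and-response step can be sketched like this. The function name, the response fields, and the injected `send` callable (a stand-in for the real HTTP client, so the sketch stays self-contained) are all assumptions, not the adapter's actual interface.

```python
def run_query(endpoint: str, prompt: str, send) -> dict:
    """Pass the query to the LLM at `endpoint` and normalize the reply.

    `send` is a callable standing in for the real transport; it takes
    (endpoint, payload) and returns the raw LLM reply as a dict.
    """
    raw = send(endpoint, {"prompt": prompt})
    # Return the result in a uniform format regardless of which
    # LLM type (BMC-provided, public API, or self-hosted) answered.
    return {
        "model_endpoint": endpoint,
        "answer": raw.get("text", ""),
        "finished": raw.get("done", True),
    }


# Usage with a stub transport standing in for the real LLM call:
def fake_send(endpoint, payload):
    return {"text": f"echo: {payload['prompt']}", "done": True}


result = run_query("http://10.0.0.5:8080", "hello", fake_send)
```

Normalizing the reply into one shape is what lets the rest of the workflow stay indifferent to which LLM produced it.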
Key benefits of BYOLLM
- Customizability—You can integrate your own LLM to tailor the service to your specific needs, ensuring better results based on your data and requirements.
- Scalability—Whether using a public API or a self-hosted solution, BYOLLM offers the flexibility to scale based on your resources and usage requirements.
- Seamless integration—The process is straightforward, so you can quickly set up and begin using your LLM within our platform with minimal configuration.

This two-step BYOLLM process ensures that you have full control over the LLM that you use while benefiting from our platform's powerful capabilities.