BMC AMI AI Services overview


BMC AMI AI Services provides a common framework for delivering Generative AI (Gen AI) solutions. It offers an integration service framework that streamlines the development process for all AI use cases. The platform's management console UI enables service health checks, Large Language Model (LLM) configuration, and feature provisioning, while its containerized architecture ensures flexible deployment options. The platform also includes an evaluation framework to validate LLM outputs during development and is designed with the flexibility to incorporate future advances in Gen AI technologies.


Architecture diagram and workflow for integrations

BMC AMI AI Services uses a microservice architecture to provide flexibility and to ease the integration of Gen AI solutions into existing BMC AMI products.

The following architecture diagram shows the integration of the Code Insights Explain feature into the BMC AMI DevX Workbench product. It also provides an overview of the different services that run within the platform; some of them are categorized as internal services because they are not directly accessible from outside the host server. All the services communicate with each other through REST APIs, which are protected by industry-standard protocols and JWT tokens. The Gateway service is the entry point into BMC AMI AI Services for all features.
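As a concrete illustration of JWT-protected REST communication, the sketch below builds an authenticated request to the Gateway. The endpoint path, host, and payload shape are assumptions for illustration only, not the product's documented API.

```python
import json
import urllib.request

# Hypothetical endpoint; BMC AMI AI Services' actual routes may differ.
GATEWAY_URL = "https://ai-services.example.com/api/v1/explain"

def build_gateway_request(jwt_token: str, payload: dict) -> urllib.request.Request:
    """Build a JWT-protected POST request to the Gateway service."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {jwt_token}",  # JWT as a Bearer token
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_gateway_request("eyJhbGciOi...", {"source": "COBOL snippet"})
print(req.get_header("Authorization").split()[0])  # Bearer
```

The Bearer scheme in the `Authorization` header is the standard way to carry JWTs over HTTPS; the token itself is issued and validated by the platform's security mechanisms.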

This diagram also shows the flexibility in deploying BMC AMI AI Services. You can deploy these services on a different platform from that of the LLM. You can deploy the LLM either in a cloud environment of your choice or on a machine within your own organization that meets the necessary system requirements. In future releases, we plan to support a mechanism by which you can connect any LLM running within your organization to BMC AMI AI Services.

[Figure: BMC AMI AI Services flow]

Service descriptions

The following table describes the services offered by the BMC AMI AI Services platform:

Service

Description

Gateway 

The Gateway Service is the entry point to the BMC AMI AI Services platform. Acting as a proxy, it routes incoming requests to the appropriate feature-specific integration services.
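The proxy behavior described above can be sketched as prefix-based routing. The route table entries below are illustrative placeholders, not the product's actual routes.

```python
# Minimal sketch of how a gateway proxy maps request paths to
# feature-specific integration services. Route names are assumptions.
ROUTES = {
    "/code-insights": "code-insights-integration",
    "/management": "management-console",
}

def route(path: str) -> str:
    """Return the integration service responsible for a request path."""
    for prefix, service in ROUTES.items():
        if path.startswith(prefix):
            return service
    raise LookupError(f"no integration service registered for {path}")

print(route("/code-insights/explain"))  # code-insights-integration
```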

This service also hosts the AMI AI Services Management Console, which enables feature provisioning. 

Discovery

The Discovery Service monitors the status of all services within the BMC AMI platform. Each service sends periodic heartbeats, allowing the Discovery Service to track its availability and current operational status (whether it's up or down). When a microservice needs to communicate with another service, it retrieves the corresponding service address from the Discovery Service. 
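The heartbeat-and-lookup cycle can be illustrated with a toy in-memory registry. The timeout value and method names are assumptions; a production discovery service adds leases, replication, and health-check endpoints.

```python
import time

HEARTBEAT_TIMEOUT = 30.0  # seconds; assumed value for illustration

class DiscoveryRegistry:
    """Toy registry: services heartbeat in, clients look addresses up."""

    def __init__(self):
        self._services = {}  # name -> (address, last_heartbeat_time)

    def heartbeat(self, name, address, now=None):
        when = now if now is not None else time.monotonic()
        self._services[name] = (address, when)

    def lookup(self, name, now=None):
        """Return the service address if it is up, else None."""
        entry = self._services.get(name)
        if entry is None:
            return None
        address, last = entry
        current = now if now is not None else time.monotonic()
        if current - last > HEARTBEAT_TIMEOUT:
            return None  # missed heartbeats: treat the service as down
        return address

reg = DiscoveryRegistry()
reg.heartbeat("llm", "http://10.0.0.5:8080", now=0.0)
print(reg.lookup("llm", now=10.0))   # address: within the heartbeat window
print(reg.lookup("llm", now=100.0))  # None: considered down
```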

Security 

The Security Service acts as a protective layer for the BMC AMI AI Services platform. It prevents prompt injection attacks and filters out denylisted words. With an allowlist filter in place, only a specified range of Unicode characters is allowed, safeguarding against embedded or unwanted characters.

You can add input and output anomaly detectors to the security layer to identify abnormal behaviors.

This service also includes a provision for enabling a rate limiter to manage traffic efficiently. 
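The denylist and character-allowlist checks described above can be sketched as follows. The denylist contents and the allowed character range (printable ASCII plus newline) are assumptions for illustration, not the product's actual configuration.

```python
# Illustrative input checks in the spirit of the Security service's filters.
DENYLIST = {"ignore previous instructions"}  # example prompt-injection phrase

def allowed_char(ch: str) -> bool:
    # Assumed allowlist range: newline plus printable ASCII (U+0020..U+007E).
    return ch == "\n" or 0x20 <= ord(ch) <= 0x7E

def check_prompt(text: str):
    """Return (accepted, reason) after denylist and allowlist filtering."""
    lowered = text.lower()
    for phrase in DENYLIST:
        if phrase in lowered:
            return False, f"denylisted phrase: {phrase!r}"
    for ch in text:
        if not allowed_char(ch):
            return False, f"character outside allowlist: U+{ord(ch):04X}"
    return True, "ok"

print(check_prompt("Explain this COBOL paragraph"))        # accepted
print(check_prompt("Please IGNORE previous instructions")) # rejected
```

A rate limiter would sit alongside these checks, rejecting or delaying requests once a caller exceeds a configured request budget.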

LLM 

The LLM Service is the AI engine of the BMC AMI platform. It is responsible for configuring and managing the LLM used within the platform. 
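The kind of configuration such a service manages might look like the record below. The fields shown are common LLM parameters chosen for illustration, not the product's documented configuration schema.

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """Hypothetical LLM configuration record; field names are assumptions."""
    model: str                 # which model the platform serves
    endpoint: str              # where the model is hosted (cloud or on-premises)
    max_tokens: int = 1024     # cap on generated output length
    temperature: float = 0.2   # low temperature favors consistent answers

cfg = LLMConfig(model="example-model", endpoint="http://llm.internal:8000")
print(cfg.temperature)  # 0.2
```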

Integration 

The Integration Service is responsible for handling the core business logic. It performs use-case-specific tasks, including managing large inputs, employing task-specific prompts, and executing data pre- and post-processing. It communicates with the Security and LLM services to prepare responses to user queries.
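The request flow through an integration service can be sketched as a short pipeline. The helper functions, chunk size, and prompt wording below are placeholders standing in for the real Security and LLM service calls.

```python
def security_check(text: str) -> bool:
    # Stand-in for the Security service's filters.
    return "ignore previous instructions" not in text.lower()

def call_llm(prompt: str) -> str:
    # Stub for the REST call to the LLM service.
    return f"  [LLM answer to: {prompt.splitlines()[0]}]  "

def handle_query(user_input: str) -> str:
    """Pipeline: security check -> chunking -> prompting -> post-processing."""
    if not security_check(user_input):          # delegate to the Security service
        return "Request rejected by security filters."
    # Manage large inputs by splitting them into chunks (size is assumed).
    chunks = [user_input[i:i + 4000] for i in range(0, len(user_input), 4000)]
    prompt = "Explain the following code:\n" + chunks[0]  # task-specific prompt
    raw = call_llm(prompt)                      # delegate to the LLM service
    return raw.strip()                          # post-processing

print(handle_query("MOVE A TO B."))
```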

 
