Deploying BMC AMI Ops Insight Explain Probable Cause


You must install BMC AMI Platform before deploying the BMC AMI Ops Insight Explain Probable Cause service.

Best practice

We recommend the Llama-3.1-8B-Instruct model for this integration.

Important

We support the following inference engines for bring-your-own-LLM (BYOLLM) integrations (see the example after this list):

  • OpenAI
  • vLLM
  • Triton
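
For example, to bring your own model through vLLM, you can serve it over vLLM's OpenAI-compatible API. The following command is a minimal sketch that assumes the recommended Llama-3.1-8B-Instruct model and a default port; adjust the model identifier, port, and GPU options for your environment:

     vllm serve meta-llama/Llama-3.1-8B-Instruct --port 8000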

To deploy BMC AMI Ops Insight Explain Probable Cause on Kubernetes

  1. Log in to the primary manager node of the Kubernetes cluster and navigate to /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00.
  2. Verify that the scripts/oi-rc.sh file is present.
  3. Run the oi-rc.sh script, as shown in the example after these steps.
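
For example, the steps above correspond to the following commands (a sketch; <extracted_dir> is the directory where the platform package was extracted, and you may need to run the script with sh if it is not executable):

     cd /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00
     ls scripts/oi-rc.sh
     ./scripts/oi-rc.sh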

To verify the deployment

  1. Verify that the service and pod are running in the bmcami-prod-amiai-services namespace by running the following commands (example output follows):

     kubectl get po --namespace bmcami-prod-amiai-services


     kubectl get svc --namespace bmcami-prod-amiai-services
     

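     Output similar to the following indicates a healthy deployment. The pod and service names shown here are illustrative placeholders; actual names, IP addresses, ports, and ages vary by environment:

     NAME                              READY   STATUS    RESTARTS   AGE
     oi-explain-probable-cause-xxxxx   1/1     Running   0          5m

     NAME                        TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)    AGE
     oi-explain-probable-cause   ClusterIP   10.96.x.x    <none>        8080/TCP   5m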

To use BMC AMI Ops Insight Explain Probable Cause, you must provision the service and download the connection details from the Platform Manager > BMC AMI AI Manager > AI services settings > Integrations settings page.

To learn how to integrate, see Integrations Settings.

 
