Integrating


You can integrate BMC AMI AI Services with the BMC AMI DevX Code Insights Explain and BMC AMI Ops Insights Root Cause Explain products.

BMC AMI DevX Code Insights Explain integration

The following table describes the installation process:

Task | Action | Reference
1 | Plan your deployment architecture. |
2 | Run the installation process. |
3 | Manage the product after installation. |

Best practice
We recommend the following models:

COBOL
Recommended model: Mixtral8x7B-instruct Quantized
Limitations: None

PL/I
Recommended model: Mixtral8x7B-instruct Quantized
Limitations: None

Assembler
Recommended models:
  • Mixtral8x7B-instruct Quantized
  • Granite-3.0-8B-Instruct
Limitations:
  • Irrelevant inline comments can reduce the quality of code explanations.
  • Content beyond column 72 is included in the code explanation.
  • Macros are not supported.

JCL
Recommended model: Mixtral8x7B-instruct Quantized
Limitations:
  • JCL support is currently limited because the contents of procs are not resolved. Proc support is under development; when it is released, the results should improve.
  • The results from the Llama 3 and Granite 3 models for JCL are unacceptable, so we do not recommend these models. We recommend Mixtral.
  • Responses for the COND parameter are inconsistent. We plan to provide better results in future maintenance.
  • Conditions (IF-ELSE) are sometimes not detected correctly or are explained incorrectly. We plan to provide better results in future maintenance.
  • The summary is quite short for longer JCL jobs. We plan to add the ability to resolve procs, which should improve summaries in future maintenance.

BMC AMI Ops Insights Root Cause Explain integration

BMC AMI Platform uses generative AI to solve challenging problems, bridge knowledge gaps, and accelerate innovation. Generative AI is currently activated for the Probable Cause Analysis (PCA) feature, where it explains the selected event classification path.

The following table describes the installation process:

Task | Action | Reference
1 | Plan your deployment architecture. |
2 | Run the installation process. |
3 | Manage the product after installation. |
4 | Configure the integration. |

Best practice
We recommend the Meta-Llama-3-8B-instruct 4K Quantized (GPU) and Mixtral8x7B-instruct Quantized models for this integration.
