System requirements


Before you install the product, make sure that your environment meets the hardware and software requirements.

Related topic

BMC AMI Platform services requirements

This section describes the different configuration options for installing and deploying the BMC AMI Platform.

Best practice
Use the recommended configuration option for the best performance and LLM output quality. However, the entry-level configuration might be appropriate if you cannot procure a GPU-enabled Linux computer.

| Configuration option | LLM | Hardware specifications for on-premises machines | AWS | Azure |
| --- | --- | --- | --- | --- |
| Recommended | Mixtral 8x7B Instruct (quantized) | GPU memory: 36 GB; GPU: NVIDIA A10G/A100 | g5.12xlarge (4 x A10G) | Standard_NC24ads_A100_v4 (1 x A100) |
| Mid-level | Meta-Llama-3-8B-Instruct 4K (quantized) | GPU memory: 24 GB; GPU: NVIDIA A10G/A100 | g5.4xlarge (1 x A10G) | Standard_NC24ads_A100_v4 (1 x A100) |
| Entry level | Meta-Llama-3-8B-Instruct 4K (quantized) | vCPUs: 32; memory: 64 GB | c6in.8xlarge | Standard_F32s_v2 |
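As a rough sanity check on the GPU memory figures above, you can estimate the weight footprint of a quantized model from its parameter count. The parameter counts below (about 47B for Mixtral 8x7B, 8B for Llama 3 8B) and the 4-bit quantization assumption are illustrative public figures, not values stated in this document:

```python
def quantized_weight_gb(params_billion: float, bits_per_param: int) -> float:
    """Approximate on-GPU weight footprint of a quantized model, in decimal GB."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# Mixtral 8x7B at 4-bit: roughly 23.5 GB of weights alone, which leaves
# headroom within the 36 GB recommendation for KV cache and runtime overhead.
mixtral_gb = quantized_weight_gb(47, 4)
# Llama 3 8B at 4-bit: roughly 4 GB of weights, well within 24 GB.
llama3_gb = quantized_weight_gb(8, 4)
```

The recommended GPU memory is deliberately larger than the weights themselves because inference also needs memory for the KV cache, activations, and the serving runtime.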

Software requirements

Before installing BMC AMI AI Services, make sure that the following software is installed on the computer where you plan to deploy the services:

  • Python (version 3 or later)
  • Terraform (version 5.63 or later) for automated installation on the AWS and Azure platforms. For more information, see Automated-and-manual-installations.
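A quick way to confirm both prerequisites before starting the installation is a small check script. This is a minimal sketch, not part of the product; it only verifies that the interpreter is Python 3 and that a `terraform` binary is on the PATH:

```python
import shutil
import sys

def check_prerequisites(min_python: tuple = (3,)) -> tuple:
    """Return (python_ok, terraform_path) for the deployment host.

    python_ok is True when the running interpreter meets min_python.
    terraform_path is the resolved path to the terraform CLI, or None
    if it is not installed or not on the PATH.
    """
    python_ok = sys.version_info >= min_python
    terraform_path = shutil.which("terraform")
    return python_ok, terraform_path

ok, terraform = check_prerequisites()
if not ok:
    print("Python 3 is required.")
if terraform is None:
    print("Terraform was not found on the PATH.")
```

To verify the Terraform release itself, run `terraform version` and compare it against the version your installation method requires.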

You must have a running instance of the following product and feature:

Network port

Configure access to the x86 server and make sure that port 8000 accepts external connections. BMC AMI DevX Workbench for Eclipse and BMC AMI DevX Workbench for VS Code connect to this port for the Code Insights Explain feature.
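To confirm that the firewall actually permits the connection, you can probe the port from a Workbench client machine. The hostname below is a placeholder for your own server address; this sketch simply attempts a TCP handshake:

```python
import socket

def port_reachable(host: str, port: int = 8000, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection performs the full TCP handshake, so success
        # means the port is open and not blocked by an intervening firewall.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder hostname):
# port_reachable("ami-platform.example.com", 8000)
```

If the check returns False, verify that the service is running on the server and that no host or network firewall is dropping traffic on port 8000.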

Product compatibility matrix

 
