System requirements
BMC AMI Platform services requirements
This section describes the different configuration options for installing and deploying the BMC AMI Platform.
| Configuration option | LLM | Hardware specifications for on-premises machines | AWS | Azure |
|---|---|---|---|---|
| Recommended | Mixtral 8x7B Instruct (quantized) | GPU memory: 36 GB; GPU: NVIDIA A10G/A100 | g5.12xlarge (4 x A10G) | Standard_NC24ads_A100_v4 (1 x A100) |
| Mid-level | Meta-Llama-3-8B-Instruct 4K (quantized) | GPU memory: 24 GB; GPU: NVIDIA A10G/A100 | g5.4xlarge (1 x A10G) | Standard_NC24ads_A100_v4 (1 x A100) |
| Entry level | Meta-Llama-3-8B-Instruct 4K (quantized) | vCPU count: 32; memory: 64 GB | c6in.8xlarge | Standard_F32s_v2 |
Software requirements
Before installing the BMC AMI AI Services, make sure that the following software is installed on the computer where you plan to deploy the services:
- Python 3 (version 3.0 or later)
- Terraform (version 5.63 or later), required for automated installation on the AWS and Azure platforms. For more information, see Automated-and-manual-installations.
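Before starting the installation, you can confirm that the prerequisite tools are on the PATH and meet the minimum versions. The following is a minimal sketch, not part of the product; the minimum version strings are taken from this document and may differ for your release.

```python
# Sketch of a prerequisite check; the minimum versions below come from
# this document and may differ for your release.
from shutil import which

def meets_minimum(found: str, minimum: str) -> bool:
    """Compare dotted version strings numerically, e.g. '3.11.4' >= '3.0'."""
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(found) >= to_tuple(minimum)

# Report whether each required tool is installed and on the PATH
for tool in ("python3", "terraform"):
    print(f"{tool}: {'found' if which(tool) else 'NOT FOUND'}")
```

A real installer would also parse the output of `python3 --version` and `terraform version` and pass the result to `meets_minimum`.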
You must have a running instance of the following product and feature:
- BMC AMI Common Enterprise Services (CES). For more information, see BMC AMI Web Products Installation and Configuration 24.01.
- BMC AMI DevX Code Insights Explain
Network port
Configure access to the x86 server and make sure that port 8000 is open to external connections. BMC AMI DevX Workbench for Eclipse and BMC AMI DevX Workbench for VS Code connect to this port for the Code Insights Explain feature.
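To verify that port 8000 is reachable from a client machine, you can attempt a TCP connection to it. This is a minimal sketch, assuming you substitute your own server's hostname for the hypothetical `HOST` value below.

```python
# Quick TCP reachability check for the AI services port.
# HOST is a hypothetical placeholder; replace it with your x86 server's address.
import socket

HOST = "ai-services.example.com"
PORT = 8000  # port used by the Code Insights Explain feature

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run `port_open(HOST, PORT)` from the machine where the Workbench clients are installed; a `False` result usually points to a firewall or security-group rule blocking the port.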
Product compatibility matrix