System requirements
BMC AMI Platform services requirements
This section describes the different configuration options for installing and deploying BMC AMI Platform.
Configuration option | Supported LLM | Hardware specifications for on-premises machines | AWS | Azure |
---|---|---|---|---|
Recommended | | GPU memory: 36 GB; GPU: NVIDIA 4 x A10G or 1 x A100 | g5.12xlarge (4 x A10G) | Standard_NC24ads_A100_v4 (1 x A100) |
Entry-level | | GPU memory: 24 GB; GPU: NVIDIA A10G or A100 | g5.4xlarge (1 x A10G) | Standard_NC24ads_A100_v4 (1 x A100) |
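For on-premises installations, the following sketch (not part of the BMC AMI documentation) is one way to check whether a machine's total GPU memory meets the entry-level (24 GB) or recommended (36 GB) figures in the table above. It assumes an NVIDIA driver with the nvidia-smi utility is already installed.

```python
# Minimal sketch: compare visible NVIDIA GPU memory against the sizing table above.
import subprocess

ENTRY_LEVEL_GB = 24
RECOMMENDED_GB = 36


def total_gpu_memory_gb() -> float:
    """Sum the memory of all visible NVIDIA GPUs, in gigabytes."""
    output = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each line is one GPU's memory in MiB.
    return sum(int(line) for line in output.split()) / 1024


if __name__ == "__main__":
    total = total_gpu_memory_gb()
    if total >= RECOMMENDED_GB:
        print(f"{total:.0f} GB GPU memory meets the recommended configuration")
    elif total >= ENTRY_LEVEL_GB:
        print(f"{total:.0f} GB GPU memory meets the entry-level configuration")
    else:
        print(f"{total:.0f} GB GPU memory is below the entry-level requirement")
```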
Software requirements
Before installing BMC AMI AI Services, make sure that the following software is installed on the computer where you plan to deploy the services:
- Python 3 (version 3.0 or later)
- Terraform (version 5.63 or later), for the automated installation on the AWS and Azure platforms. For more information, see Automated and manual installations. A version-check sketch follows this list.
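The following sketch (not part of the BMC AMI documentation) is one way to confirm the software prerequisites above on the deployment machine before you start. The version thresholds mirror the list above; adjust them if your release requires different values.

```python
# Minimal sketch: verify Python and Terraform are available at the required versions.
import shutil
import subprocess
import sys


def check_python(minimum=(3, 0)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= minimum


def check_terraform():
    """Return the installed Terraform version string, or None if not found."""
    if shutil.which("terraform") is None:
        return None
    # `terraform version` prints e.g. "Terraform v1.7.5" on its first line.
    output = subprocess.run(
        ["terraform", "version"], capture_output=True, text=True, check=True
    ).stdout
    return output.splitlines()[0].strip()


if __name__ == "__main__":
    print("Python OK" if check_python() else "Python version is too old")
    terraform = check_terraform()
    print(f"Terraform found: {terraform}" if terraform else "Terraform not found")
```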
You must have a running instance of the following products and features:
- BMC AMI Common Enterprise Services (CES). For more information, see BMC AMI Web Products Installation and Configuration 24.02.
- BMC AMI DevX Code Insights Explain
- BMC AMI Ops Insights
Network port
Configure access to the x86 server and make sure that port 8000 accepts external connections. BMC AMI DevX Workbench for Eclipse and BMC AMI DevX Workbench for VS Code connect to this port for the Code Insights Explain feature.
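The following sketch (not part of the BMC AMI documentation) is one way to confirm from a client machine, such as a workstation running BMC AMI DevX Workbench, that port 8000 on the AI services host is reachable. The hostname ai-services.example.com is a placeholder; substitute your x86 server's hostname or IP address.

```python
# Minimal sketch: test TCP connectivity to port 8000 on the AI services host.
import socket

HOST = "ai-services.example.com"  # placeholder; replace with your x86 server
PORT = 8000                       # port used by the Code Insights Explain feature


def port_is_open(host: str, port: int, timeout: float = 5.0) -> bool:
    """Attempt a TCP connection and report whether it succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    state = "reachable" if port_is_open(HOST, PORT) else "NOT reachable"
    print(f"Port {PORT} on {HOST} is {state}")
```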
Product compatibility matrix