Undeploying LLM
You can undeploy the BMC-provided LLM by using one of the following methods:
- Using the LLM library (UI)—For more information, see Revoking LLMs.
- Manual undeploy—For more information, see Undeploying LLM manually.
Undeploying LLM manually
To completely remove the LLM, you must first undeploy it by running the appropriate undeploy script and then delete it from the UI. For more information, see Deleting LLMs.
To undeploy the Llama model on Kubernetes
- Log in to the primary manager node of the Kubernetes cluster and go to the /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts directory.
- Run the undeploy_llama.sh script, as shown in the example after this list.
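The following is a minimal command-line sketch of these steps. It assumes the script has execute permission (if it does not, invoke it with bash); the final kubectl check is only illustrative, because the exact resource names created for the model are not listed here.

```
# Go to the scripts directory of the extracted platform package
cd /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts

# Run the undeploy script (use "bash undeploy_llama.sh" if it is not executable)
./undeploy_llama.sh

# Optional, illustrative check: confirm that no Llama-related pods remain
kubectl get pods --all-namespaces | grep -i llama
```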
To undeploy the Mixtral model on Kubernetes
- Log in to the primary manager node of the Kubernetes cluster and go to the /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts directory.
- Run the undeploy_mixtral.sh script, as shown in the example after this list.
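The same pattern applies to the Mixtral script; a minimal sketch, assuming the script is executable:

```
cd /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts
./undeploy_mixtral.sh
```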
To undeploy the Granite model on Kubernetes
- Log in to the primary manager node of the Kubernetes cluster and go to the /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts directory.
- Run the undeploy_granite.sh script, as shown in the example after this list.
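Likewise for the Granite script; a minimal sketch, assuming the script is executable:

```
cd /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts
./undeploy_granite.sh
```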