Undeploying LLM


You can undeploy the BMC-provided LLM by using one of the following methods:

Important

An LLM deployed through the UI must be undeployed by using the LLM library (UI), and an LLM deployed manually must be undeployed by using the manual method.

Undeploying LLM manually

To completely remove the LLM, you must undeploy it by using the script and then delete it from the UI. For more information, see Deleting LLMs.

To undeploy the Llama model on Kubernetes

  1. Log in to the primary manager node of the Kubernetes cluster and go to the /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts directory.
  2. Run the undeploy_llama.sh script.

To undeploy the Mixtral model on Kubernetes

  1. Log in to the primary manager node of the Kubernetes cluster and go to the /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts directory.
  2. Run the undeploy_mixtral.sh script.

To undeploy the Granite model on Kubernetes

  1. Log in to the primary manager node of the Kubernetes cluster and go to the /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts directory.
  2. Run the undeploy_granite.sh script.
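The three procedures above follow the same pattern. A minimal sketch for the Llama script, assuming <extracted_dir> is the directory where the platform archive was extracted and that the script needs execute permission:

```shell
# Sketch only: replace <extracted_dir> with your actual extraction directory.
cd /<extracted_dir>/BMC-AMI-PLATFORM-2.0.00/scripts

# Ensure the script is executable, then run it.
chmod +x undeploy_llama.sh
./undeploy_llama.sh

# For the other models, run undeploy_mixtral.sh or undeploy_granite.sh
# from the same directory.
```

Afterward, assuming kubectl access on the manager node, you can confirm that the model's pods are gone with a command such as `kubectl get pods`.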

 


BMC AMI Platform 2.0