Creating and managing prompts


 

A prompt is an input or query that guides the system to generate relevant responses or perform specific actions. Well-constructed prompts improve the accuracy of responses by targeting specific user needs.

BMC HelixGPT responds to user questions in US English by default unless the prompt instructions explicitly specify another language. Also, it translates only the summary extracted from the source before displaying answers to end users; it does not change the language of the source content.

Best practices for generating prompts

Use the following best practices for generating prompts in BMC HelixGPT:

  • Limit the topics to your domain. For example, create prompts specific to the IT or HR domain.
  • Use a friendly, helpful, and formal tone in responses.
  • In long prompts, instruct BMC HelixGPT to ask for confirmation when it collects entity information, such as name, date, and location.
  • Display knowledge article summaries as paragraphs, bullet points, or numbered lists.
  • Use headlines or sections for long answers.
  • Do not create jokes.

Before you begin

Make sure you have added the API Endpoint URL and API keys for the generative AI provider you plan to use.
For more information, see Provisioning and setting up the generative AI provider for your application.


To create prompts for a skill

You can create prompts for a skill when you select Prompts as the instruction type while creating a custom skill, as shown in the following screenshot:

25_2_Create_Prompt.png

Perform the following steps:

  1. In the BMC HelixGPT Manager, select the application and the skill for which you want to create a prompt. 
  2. (Optional) To edit the global prompt, click Edit global prompt and type your text, as shown in the following screenshot:
    25_2_Edit_GP.png
    The global prompt is always appended to the skill-specific prompts.
  3. To add a skill-specific prompt, select the skill and then click the Prompts tab as shown in the following screenshot: 
    25_2_AddPrompt.png
  4. Click Add prompt.
    The Add prompt dialog box is displayed.
  5. In the Add prompt dialog box, specify the following fields:

    General tab (25_2_General.png)
      • Name: Specify a name that uniquely identifies the prompt. For example: Employee_Payroll prompt.
      • Type: Select the type of prompt; for example, Knowledge. Learn about the different types of prompts in Prompts.
      • Skill name: The skill name is populated by default.
      • Starter prompt (one per Skill): If the skill has multiple prompts, select this check box to set the prompt as the first prompt that is sent to the AI provider.

    Model tab (25_2_Model.png)
      • Use model selected for skill: Turn on this toggle to use the same Provider and Model that are already selected for the skill; the values are populated automatically. Turn off the toggle to use a different Provider and Model than the ones selected for the skill.
      • Provider: The provider is populated by default. (25_2_Provider.png)
      • Model: The model is populated by default.

    Parameter setting tab (25_2_Parameter.png)
    The parameters set at the prompt level override the parameters set at the skill level.
      • Temperature: Affects the randomness of the results. A higher temperature leads to more varied and unpredictable output. The value ranges from 0.0 to 1.0, with a default value of 1.0.
      • Top-K: Selects results from the top k most likely tokens. Lower k values focus on only the most predictable choice of tokens. The value ranges from 0 to 40, with a default value of 0.
      • Top-P: Narrows choices to the most likely tokens. Lower values create a smaller, more focused selection. The value ranges from 0.0 to 1.0, with a default value of 1.0.

    Metadata tab (25_2_Metadata.png)
      • Metadata: Metadata is data relevant to the given prompt. It gives the service prompt information about the BMC Helix Digital Workplace Catalog service, which is used to submit the BMC Helix Digital Workplace request. For the catalog service, you can add the question type for that prompt in the metadata. The Live Chat prompt also has metadata about the prompt type.

    Prompt tab (25_2_Promptcode.png)
      • Prompt: Specify a detailed prompt to generate answers in a specific format. For example:

        Answer the question based on the context below. Summarize the answer but prefer clarity over brevity.
        {global_prompt}

        If instructional steps are required, use a numbered list.
        Create a final answer by using the references ("SOURCES") and the Conversation History.
        Don't try to make up an answer. Do not hallucinate.
        ALWAYS return a "SOURCES" part in your answer.

        Conversation History: {history}

        QUESTION: {input}
        =========
        SOURCES: {summaries}
        =========
        FINAL ANSWER:

     

  6. Click Create.

Repeat the steps to add multiple prompts. 
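The parameter ranges described on the Parameter setting tab can be expressed as a small validation sketch. The function name and payload shape below are illustrative assumptions, not the BMC HelixGPT API; only the documented ranges and defaults (Temperature 0.0–1.0, Top-K 0–40, Top-P 0.0–1.0) come from this topic.

```python
def build_generation_params(temperature=1.0, top_k=0, top_p=1.0):
    """Return a provider-agnostic sampling-parameter payload.

    Defaults match the documented defaults for the Parameter setting tab;
    out-of-range values are rejected. Hypothetical helper, for illustration.
    """
    if not 0.0 <= temperature <= 1.0:
        raise ValueError("temperature must be between 0.0 and 1.0")
    if not 0 <= top_k <= 40:
        raise ValueError("top_k must be between 0 and 40")
    if not 0.0 <= top_p <= 1.0:
        raise ValueError("top_p must be between 0.0 and 1.0")
    return {"temperature": temperature, "top_k": top_k, "top_p": top_p}
```

For example, calling `build_generation_params()` with no arguments returns the documented defaults, while `build_generation_params(top_k=50)` is rejected because it exceeds the 0–40 range.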


To version a prompt

Versioning prompts in BMC HelixGPT simplifies prompt management by enabling you to create, save, and switch between different versions for various use cases. This feature also retains changes for future reference.

Consider the following points when you version a prompt:

  • You can version only the custom prompts.
  • If you want to version an out-of-the-box prompt, create a copy of that prompt and then version the copied prompt. 
  • When you create a new version of an existing prompt, the system automatically increments the version number from the latest available one. For example, if you have 4 prompt versions and want to add a prompt version to v2, the new prompt version is v5, not v3.
  • By default, when a new prompt version is added to a skill, it automatically becomes the active version for that skill.
  • When you rename a prompt, the system also automatically renames all its prompt versions.
  • When you publish a service prompt such as Guest Wi-Fi, it is added only to the active version of the router prompt in the target skill. If deleted, the entry is removed only from the active version of the router prompt.
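The version-numbering rule above (the new version number is always one higher than the latest existing version, regardless of which version you start from) can be sketched as follows. The function name is illustrative, not part of the product:

```python
def next_prompt_version(existing_versions):
    """Return the number the next prompt version receives.

    Illustrates the documented rule: the system increments from the latest
    available version, so adding a version "to v2" when v1-v4 exist yields v5.
    """
    return max(existing_versions, default=0) + 1
```

So with four existing versions, `next_prompt_version([1, 2, 3, 4])` yields 5, not 3, matching the example in the list above.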

Perform the following steps:

  1. In BMC HelixGPT Manager, select the application and the skill for which you want to version a prompt.
  2. On the Prompts tab, from the list of prompts, select the prompt for which you want to create the version.
  3. Click Add Version.
    Enter the required details.
  4. Click Create.
    The following screenshot shows an example of options available for a prompt version:
    25_2_PromptVersion.png

To activate a prompt version

When you add a new version, it automatically becomes the active version. To change the active version:

  1. In HelixGPT Manager, select the application and the skill for which you want to activate the prompt version.
  2. On the Prompts tab, from the list of prompts, select the prompt for which you want to activate the prompt version.
  3. Select the appropriate version and click Activate as shown in the following screenshot:
    25_2_Activate.png

To copy a prompt version

You can copy a prompt version from one skill to another in any application. However, you cannot copy a prompt to an out-of-the-box (seeded) skill. A copied prompt version always starts at version 1, regardless of which version it was copied from.

  1. In HelixGPT Manager, select the application and the skill from which you want to copy a prompt version.
  2. Select the prompt version you want to copy and click Copy.
    The Copy prompt dialog box is displayed, as shown in the following screenshot:
    25_2_Copy.png
  3. Select a skill and click Copy.

To edit a prompt version

  1. Select the radio button next to the appropriate version.
  2. Click Edit. 
  3. Click Save.

While editing a version, you cannot edit the prompt name, type, or version number. 


To delete a prompt version

You can delete a specific version of the prompt. Deleting a version of a prompt removes it from all skills or applications where it’s used.

  1. Select the radio button next to the appropriate version.
  2. Click Delete.
    A confirmation dialog box as shown in the following screenshot appears to finalize the deletion:
    25_2_Del_version.png
  3. Click Yes.

To delete a prompt with all the versions

You can delete a complete prompt, which removes all versions associated with it.

  1. Click the Actions list next to the prompt you want to delete.
  2. Click Delete.
    A confirmation dialog box is displayed.
  3. Click Yes.

To link or unlink a prompt

You can reuse a prompt from a different skill in a new skill by linking it. Conversely, if you no longer want a prompt associated with a skill, you can unlink it; unlinking removes the prompt from the skill without deleting the prompt completely. Consider the following points when linking or unlinking a prompt:

  • You cannot link or unlink a prompt from an out-of-the-box skill.
  • When you unlink a service prompt from a skill, the entry is deleted only if the active router prompt is set as the starter prompt for that skill.

Before you begin

In BMC HelixGPT Manager, select the application and the skill from which you want to link or unlink a prompt.

  1. On the Prompts tab, click Link prompt.
    The Link prompt dialog box is displayed, as shown in the following screenshot:
    25_2_Link.png
  2. Select the prompt from the list that you want to link in your skill.
    You can also directly type the name of the prompt or application in the Search box to filter the long list and select the prompt you want to link.
  3. Click Link.

Perform the following steps to unlink the prompt:

  1. On the Prompts tab, click the Action button for the prompt you want to unlink.
  2. Click Unlink.
    The following screenshot shows the option to unlink a prompt:
    25_2_Unlink.png

To change the model for a prompt

By default, the prompts added to a skill use the model selected while creating the skill. As an administrator, you can change the model for individual prompts in a skill to improve their relevance and response time. 

  1. In the HelixGPT Manager, select the application and the skill.
  2. Go to the Prompts tab and edit the prompt for which you want to change the model.
    The Edit prompt dialog box is displayed:
    25_2_EditModel.png
  3. In the Prompt version editor, in the Model section, turn off the Use model selected for skill toggle to select a different Provider and Model.
    By default, the toggle is turned on so that the prompt uses the same provider and model as the skill.
  4. Select the Provider and Model from the respective drop-down lists.
  5. Click Save.

To add multiple Response prompts for a skill

To configure multiple response prompts, you must modify the Router prompt. Make the following changes when updating the Router prompt:

  1. The classificationType key must have the value response.
  2. The nextPromptType and serviceName keys must have the <response prompt name> as a value.

Example of translating a query into Spanish:

8. If the user input text is a query about translating to Spanish, then classify the input text as 'response' in the classification field of the result JSON.
The JSON format should be
{{"classificationType": "response", "nextPromptType": "Spanish Response", "services":
[{{ "serviceName": "Spanish Response", "confidenceScore": "1.0", "nextPromptType": "Spanish Response" }}], "userInputText": "...."}}
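The two Router prompt rules above can be checked against a classification result with a small sketch. The function name is illustrative and the sample JSON mirrors the Spanish Response example; this is not a BMC HelixGPT API:

```python
import json


def follows_response_rules(result, response_prompt_name):
    """Check the documented Router prompt rules for response prompts:
    classificationType must be "response", and nextPromptType and
    serviceName must carry the response prompt name."""
    services = result.get("services", [])
    return (
        result.get("classificationType") == "response"
        and result.get("nextPromptType") == response_prompt_name
        and all(
            s.get("serviceName") == response_prompt_name
            and s.get("nextPromptType") == response_prompt_name
            for s in services
        )
    )


# Sample classification result, mirroring the Spanish Response example.
raw = json.dumps({
    "classificationType": "response",
    "nextPromptType": "Spanish Response",
    "services": [{
        "serviceName": "Spanish Response",
        "confidenceScore": "1.0",
        "nextPromptType": "Spanish Response",
    }],
    "userInputText": "....",
})
result = json.loads(raw)
```

Here `follows_response_rules(result, "Spanish Response")` holds, while checking against any other prompt name fails.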

To configure options for the Response prompt

To configure options for the Response prompt, add a custom prompt of type Response to a custom skill and modify the code to include multiple options. See the example below:

Example of Response prompt to configure multiple options:

Return the response in the following format and ignore the input.
You must return these options: "example1", "example2", "example3"
input - {input}
Do not include any explanations,
only provide an RFC 8259-compliant JSON response following this format without deviation:
{{"output": "Hello",
   "options": ["example1", "example2", "example3"]
}} 

Consider the following when configuring options:

  • If the Response prompt is not the starter prompt, you must add an instruction in the Router prompt.
  • The nextPromptType and serviceName keys must have the <response prompt name> as a value.
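On the client side, the JSON that the example Response prompt asks the model to return can be parsed with standard JSON tooling. The raw string below is the literal output implied by the template once the escaped double curly brackets render as single braces; it is illustrative:

```python
import json

# Literal output implied by the example Response prompt above,
# after {{ and }} render as { and }.
raw = '{"output": "Hello", "options": ["example1", "example2", "example3"]}'
response = json.loads(raw)

# The "options" array carries the configured choices.
options = response["options"]
```

Parsing `raw` yields `"Hello"` as the output text and the three configured options as a list.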

To obtain JSON from the Response prompt

To obtain a JSON string from the Response prompt, enclose the code block within double curly brackets. See the example below.

Example of obtaining a JSON string from the Response prompt:

Here is an example input -
{input}
Give a 2-line explanation about the following response and you must return it with your explanation.
The response must be formatted in a Markdown `json` code block.
"{{  "name": "name",  "scope": "scope",  "guid": "guid",  "overlayGroupId": "overlayGroupId",  "type": "type",  "layout": "{{"outlets":[{{"name":"name","columns":[{{"children":["children"]}}]}}]}}",  "componentDefinitions":
[    {{      "resourceType": "resourceType",      "guid": "guid",      "type": "type",      "propertiesByName": {{        "availableOnDevices": "[\"desktop\",\"tablet\",\"mobile\"]",        "recordDefinitionName": "recordDefinitionName",        "enableFiltering": "enableFiltering",        "enableRowSelection": "enableRowSelection",        "enableFilterPresets": "enableFilterPresets"      }}}}]}}"


Related topics

Creating and managing skills

Generating prompts automatically from catalog services

 

 
