Updating the configuration parameters of a skill

After importing a skill, you can configure its parameters to align with your specific needs. If you don't update a parameter, its default value is used.

To update the configuration parameters of a skill

  1. Log in to BMC Helix Innovation Studio.
  2. Select Workspace > HelixGPT Manager.
  3. Select the Skill record definition, and click Edit data.
  4. On the Data editor (Skill) page, select a skill, and click Edit.
  5. In the Edit record pane, in the Configuration field, view and update the values for the following parameters:

    conversationBufferMemory

    Maintains a record of recent conversations, which is used whenever the assistant makes an LLM call.

    Example:

    User: Weather today
    AI: 78 Fahrenheit

    Default value: 30

    DataConnection_StripUrlsFromSource

    Controls whether URLs are removed from text data before ingestion into OpenSearch and embedding calculations.

    When you set the parameter to True, it:

    • Strips URLs from the text assigned to the text_field in the OpenSearch index so that URLs are not stored in the searchable text field. This helps with data consistency and privacy.
    • Removes URLs from the input text used for embedding calculations, improving embedding quality by removing unnecessary link information.

    degreeOfDiversity

    Defines the diversity of results returned by the maximal marginal relevance (MMR) algorithm: 1 for minimum diversity and 0 for maximum.

    Default value: 0.5

    disableDigression

    Ensures that the generated response stays focused on the given topic.

    When this parameter is set to true, the digression text is excluded from the service prompt. Note that digression text is included only if the skill uses a knowledge prompt.

    Digression text in the service prompt:

    Digressions occur only when a user asks a question. In all other cases, digression is false.
      Only if the user's response is a question that starts with "How" or "Why", you must respond with the following JSON, replacing only the "inputUserText" attribute.
      Don't change other attributes.


        {{
          "digression" : "TRUE",
          "nextPromptType" : "Knowledge",
          "inputUserText": "user entered text here"
        }}

    Default value: False

    enableRephrasePrompt

    Enables or disables automatic rephrasing of prompts by the chatbot.

    When set to true, the chatbot can rephrase user inputs to improve clarity.

    Important: For Gemini and Llama models, the default value of the enableRephrasePrompt parameter is true. However, we recommend setting the value to false if you want to initiate a new topic.

    Default value: True

    excludeDescriptionQuestions

    Use this parameter only when you configure a skill for the BMC Digital Workplace catalog.

    When set to true, the prompt does not generate any description questions.

    Default value: False

    ignoreUserInputForKbidSearch

     Use the ignoreUserInputForKbidSearch parameter to determine if the system should use the user's input when searching for a Knowledge Base (KB) article by its ID.

    Valid values:

    • True: The system ignores the user's input and strictly performs the search based on predefined configurations or parameters for the KB article ID.
       
    • False: The system includes the user's input in the search criteria. This allows for more dynamic and contextual searches based on what the user types.

    True

    initial_chunks_to_accumulate

    You can use this parameter only when the following conditions are true:

    • Streaming mode is on.
    • The supportCitations parameter is set to true.

    This parameter determines the text size that a skill can send in a response. By default, this parameter is not included in a skill configuration; you must add it manually. If it is not specified in the skill configuration, the default value 5 is applied.

    Best practices:

    • With the Google Gemini model, we recommend setting the value of this parameter to 2.
    • For OCI Llama 3.1, we recommend setting the value of this parameter to 14.
    • For other models, we recommend a maximum value of 5.

    Default value: 5
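    As a sketch, if your skill streams responses with citations enabled, the parameter can be added to the skill's Configuration JSON alongside supportCitations (the value 2 below follows the Google Gemini recommendation and is illustrative; adjust it for your model):

    ```json
    {
        "supportCitations": true,
        "initial_chunks_to_accumulate": 2
    }
    ```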

    knowledgeSearchScope

    Defines the scope of the search used to answer a user's question in a knowledge Q&A use case.

    Valid values:

    • ENTERPRISE: The search is performed on the enterprise knowledge base.
    • WORLD: The search is performed on the enterprise knowledge base and external articles online.

    Important: When you use the Gemini or Llama 3.1 model with the ENTERPRISE option, the search scope is limited to internal documents.

    Default value: ENTERPRISE

    langSmithProjectName

    The LangSmith project in which you can find traces.

    lowWaterMarkThreshold

    Specifies the minimum relevance threshold for the similarity_score_threshold search type. The value must be between 0 and 1.

    maxTokensLimit

    Defines the maximum number of tokens that can be included in the input and output of an AI model call. It directly impacts the amount of content (chunks) that can be passed to the model.

    Default value: 4000

    numberOfDocumentsToFetchForMMR

    Specifies the maximum number of similar and diverse documents to return based on the MMR algorithm.

    Default value: 20

    numberOfDocumentsToReturn

    Specifies the maximum number of documents to retrieve in response to a query. This parameter helps control the result set size, ensuring that only the most relevant documents are returned based on the query's ranking criteria.

    numberOfNearestNeighborsToSearch

    Defines the number of nearest neighbors to evaluate during a query in a vector search operation. It determines how many closely related items are considered for similarity ranking, influencing the accuracy and speed of the results.

    Default value: Twice the value of the numberOfDocumentsToReturn parameter

    processInitialUserText

    Processes the initial user input so that the user doesn't need to answer the same question again, and prevents the language model from asking a question that has already been answered.

    Default value: True

    promptGenerationMaxQuestions

    Specifies the maximum number of questions that a user can ask to resolve a query.

    Consider the following points when using the promptGenerationMaxQuestions parameter:

    • If the number of questions in the prompt is equal to or less than this value, the system generates a prompt that guides the user through the questions in Helix Virtual Agent at runtime.
    • If the number of questions exceeds this value, the system generates a prompt that displays a link in Helix Virtual Agent at runtime. This link directs the user to the Digital Workplace application, where they can submit their request.
    • For large services, you can change the value of this parameter. However, the token-based model has a limit on the number of prompts.

    Default value: 15

    promptGenerationWithMultiSelectQuestion

    Defines whether to generate a prompt with a question that has multi-select options.

    Default value: False

    retrieveUserContextProcessName

    The BMC Helix Innovation Studio process that is used to retrieve the user context variables.

    searchType

    Defines the type of search the retriever performs.

    Valid values:

    • similarity: Returns similar documents.
    • similarity_score_threshold: Returns similar documents only if they meet the minimum threshold specified by lowWaterMarkThreshold.
    • mmr: Returns similar and diverse documents based on maximal marginal relevance.

    Default value: similarity
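    The search-related parameters work together. As an illustrative sketch, an MMR configuration might combine searchType with the MMR parameters described above (the values shown are examples, not recommendations):

    ```json
    {
        "searchType": "mmr",
        "numberOfDocumentsToFetchForMMR": 20,
        "numberOfDocumentsToReturn": 8,
        "degreeOfDiversity": 0.5
    }
    ```

    With settings like these, MMR evaluates 20 fetched candidates and returns up to 8 documents, balancing similarity against diversity at 0.5.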

    supportCitations

    When this parameter is true, BMC HelixGPT extracts the most relevant resources and filters out less relevant ones, so that only the most relevant resources are displayed as references in the response.

    When this parameter is false, all resources extracted by BMC HelixGPT are displayed in the response.

    Default value: True

    traceOn

    Generates traces for the skill.

Example of configuring parameters for a skill

The following code is an example of configuring parameters for a skill:

{
    "knowledgeSearchScope": "ENTERPRISE",
    "lowWaterMarkThreshold": 0.2,
    "searchType": "similarity_score_threshold",
    "numberOfDocumentsToReturn": 8,
    "numberOfDocumentsToFetchForMMR": 20,
    "degreeOfDiversity": 0.5,
    "promptGenerationMaxQuestions": 15,
    "promptGenerationWithMultiSelectQuestion": false
}

Related topics

Skills

Creating prompts and skills

Updating prompts and skills

 
