Updating the configuration parameters of a skill
To update the configuration parameters of a skill
- Log in to BMC Helix Innovation Studio.
- Select Workspace > HelixGPT Manager.
- Select the Skill record definition, and click Edit data.
- On the Data editor (Skill) page, select a skill, and click Edit.
- In the Edit record pane, in the Configuration field, view and update the values for the following parameters:
Configuration parameter
Description
Default value
numberOfNearestNeighborsToSearch
This parameter defines the number of nearest neighbors to evaluate during a query in a vector search operation.
This parameter determines how many closely related items will be considered for similarity ranking, influencing the accuracy and speed of the results.
Default value: Twice the value of the numberOfDocumentsToReturn parameter
numberOfDocumentsToReturn
Use this parameter to specify the maximum number of documents to retrieve in response to a query.
This parameter helps control the result set size, ensuring that only the most relevant documents are returned based on the query's ranking criteria.
Default value: 5
ignoreUserInputForKbidSearch
Use the ignoreUserInputForKbidSearch parameter to determine if the system should use the user's input when searching for a Knowledge Base (KB) article by its ID.
Valid values:
- (Default) True: The system ignores the user's input and strictly performs the search based on predefined configurations or parameters for the KB article ID.
- False: The system includes the user's input in the search criteria. This allows for more dynamic and contextual searches based on what the user types.
Default value: True
DataConnection_StripUrlsFromSource
Use the DataConnection_StripUrlsFromSource parameter to control whether URLs are removed from text data before ingestion into OpenSearch and embedding calculations.
When you set the parameter to True, the system:
- Strips URLs from the text assigned to the text_field in the OpenSearch index so that URLs are not stored in the searchable text field. This helps with data consistency and privacy.
- Removes URLs from the input text used for embedding calculations, improving embedding quality by removing unnecessary link information.
disableDigression
Use this parameter to make sure that the generated response stays focused on the given topic.
When this parameter is set to true, the digression text is excluded from the service prompt.
However, the digression text is not included if a knowledge prompt is not used in the skill.
Default value: false
initial_chunks_to_accumulate
You can use this parameter only when the following conditions are true:
- The streaming mode is on.
- The supportCitations parameter is set to true.
This parameter helps you determine the size of the text that is sent in a skill's response. By default, this parameter is not included in a skill configuration. You must add it manually in the skill configuration.
If this parameter is not specified in the skill configuration, the default value 5 is applied.
Best practices:
- With the Google Gemini model, we recommend keeping the value of this parameter to 2.
- For OCI Llama 3.1, we recommend keeping the value of this parameter to 14.
- For other models, we recommend keeping the maximum value of this parameter to 5.
Default value: 5
excludeDescriptionQuestions
Use this parameter only when you configure a skill for the BMC Digital Workplace catalog.
When set to true, the prompt does not generate any description questions.
Default value: false
enableRephrasePrompt
Enables or disables the option for the chatbot to automatically rephrase prompts.
When set to true, the chatbot can rephrase user inputs to improve clarity.
Important:
For Gemini and LlaMA models, the default value for the enableRephrasePrompt parameter is set to true. However, we recommend setting the value to false if you want to initiate a new topic.
Default value: true
conversationBufferMemory
Maintains a record of recent conversations, which is used whenever the assistant makes an LLM call.
Example:
Chatbot: Weather today
AI: 78 Fahrenheit
Default value: 30
langSmithProjectName
The project in LangSmith where you can find traces.
retrieveUserContextProcessName
BMC Helix Innovation Studio process that is used to retrieve the user context variables.
supportCitations
When this parameter is true, BMC HelixGPT extracts the most relevant resource and filters out less relevant resources, so that only the most relevant resources are displayed as references in the response.
When this parameter is false, all resources extracted by BMC HelixGPT are displayed in the response.
Default value: true
traceOn
Generates traces for the skill.
degreeOfDiversity
Defines the diversity of results to be returned by MMR; 1 for minimum diversity and 0 for maximum.
Default value: 0.5
knowledgeSearchScope
Defines the scope of the search to answer a user's question in a knowledge Q&A use case.
Valid values:
- (Default) ENTERPRISE—The search will be performed on the enterprise knowledge base.
- WORLD—The search will be performed on the enterprise knowledge base and external articles online.
Important:
The search scope is limited to internal documents when you use the Gemini or Llama 3.1 model with the ENTERPRISE option.
Default value: ENTERPRISE
lowWaterMarkThreshold
Specifies the minimum relevance threshold for similarity_score_threshold.
The value must be between 0 and 1.
numberOfDocumentsToFetchForMMR
Specifies the maximum number of similar and diverse documents to return based on the MMR algorithm.
Default value: 20
processInitialUserText
Processes the initial user input so that the user does not need to answer the same question again, and prevents the language model from asking a question that has already been answered.
Default value: true
promptGenerationMaxQuestions
Specifies the maximum number of questions that a user can ask to resolve a query.
Consider the following points when using the promptGenerationMaxQuestions parameter:
- If the number of questions in the prompt is equal to or less than this value, the system will generate a prompt to guide the user through the questions in Helix Virtual Agent during runtime.
- If the number of questions exceeds this value, the system will generate a prompt displaying a link in the Helix Virtual Agent at runtime. This link will direct the user to the Digital Workplace application, where they can submit their request.
- For large services, you can change the value of this parameter. However, the token-based model has a limit on the number of prompts.
Default value: 15
promptGenerationWithMultiSelectQuestion
Defines whether to generate a prompt with a question that has multi-select options.
Default value: false
searchType
Defines the type of search the Retriever should perform.
Valid values:
- (Default) similarity—returns similar documents
- similarity_score_threshold—returns similar documents only if they meet the minimum threshold specified by lowWaterMarkThreshold
- mmr—returns similar and diverse documents based on maximum marginal relevance
Default value: similarity
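To see how the mmr search type combines the parameters above, the following Python sketch implements the maximal marginal relevance idea that searchType, degreeOfDiversity, numberOfDocumentsToFetchForMMR, and numberOfDocumentsToReturn control. This is an illustrative sketch, not BMC HelixGPT product code; the cosine_sim helper and the toy vectors are assumptions for the example.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def mmr(query_vec, doc_vecs, degree_of_diversity=0.5,
        number_of_documents_to_fetch=20, number_of_documents_to_return=5):
    """Return indices of documents picked by maximal marginal relevance.

    degree_of_diversity follows the table above: 1 means minimum diversity
    (pure relevance), 0 means maximum diversity.
    """
    # First narrow down to the top candidates by plain similarity
    # (the role of numberOfDocumentsToFetchForMMR).
    ranked = sorted(range(len(doc_vecs)),
                    key=lambda i: cosine_sim(query_vec, doc_vecs[i]),
                    reverse=True)[:number_of_documents_to_fetch]
    selected = []
    while ranked and len(selected) < number_of_documents_to_return:
        def score(i):
            relevance = cosine_sim(query_vec, doc_vecs[i])
            # Penalize similarity to documents already selected.
            redundancy = max((cosine_sim(doc_vecs[i], doc_vecs[j])
                              for j in selected), default=0.0)
            lam = degree_of_diversity
            return lam * relevance - (1 - lam) * redundancy
        best = max(ranked, key=score)
        ranked.remove(best)
        selected.append(best)
    return selected
```

With degree_of_diversity set to 1, the function returns the most relevant documents even if they are near-duplicates; with 0, it favors documents that differ from those already chosen.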
Example of configuring parameters for a skill
The following code is an example of configuring parameters for a skill:
{
  "knowledgeSearchScope": "ENTERPRISE",
  "lowWaterMarkThreshold": 0.2,
  "searchType": "similarity_score_threshold",
  "numberOfDocumentsToReturn": 8,
  "numberOfDocumentsToFetchForMMR": 20,
  "degreeOfDiversity": 0.5,
  "promptGenerationMaxQuestions": 15,
  "promptGenerationWithMultiSelectQuestion": false
}
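Because the configuration is plain JSON, you can sanity-check values before saving them in the Configuration field. The following Python sketch validates a configuration against the value ranges stated in the parameter table; validate_skill_config is a hypothetical helper for illustration, not a BMC HelixGPT API.

```python
import json

def validate_skill_config(raw: str) -> dict:
    """Parse a skill configuration string and check documented value ranges."""
    config = json.loads(raw)
    threshold = config.get("lowWaterMarkThreshold")
    if threshold is not None and not 0 <= threshold <= 1:
        raise ValueError("lowWaterMarkThreshold must be between 0 and 1")
    diversity = config.get("degreeOfDiversity")
    if diversity is not None and not 0 <= diversity <= 1:
        raise ValueError("degreeOfDiversity must be between 0 and 1")
    scope = config.get("knowledgeSearchScope", "ENTERPRISE")
    if scope not in ("ENTERPRISE", "WORLD"):
        raise ValueError("knowledgeSearchScope must be ENTERPRISE or WORLD")
    search_type = config.get("searchType", "similarity")
    if search_type not in ("similarity", "similarity_score_threshold", "mmr"):
        raise ValueError("unknown searchType: " + search_type)
    return config

example = """{
  "knowledgeSearchScope": "ENTERPRISE",
  "lowWaterMarkThreshold": 0.2,
  "searchType": "similarity_score_threshold",
  "numberOfDocumentsToReturn": 8
}"""
config = validate_skill_config(example)
```

A check like this catches typos such as a lowWaterMarkThreshold of 1.5 or a misspelled searchType before the skill runs with an invalid configuration.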