This documentation supports the 25.3 version of BMC Helix Digital Workplace Basic and BMC Helix Digital Workplace Advanced. Icons distinguish capabilities available only for the Advanced and External license levels. For more information, see License types and features. To view an earlier version, select the version from the Product version menu.

 

Configuring BMC HelixGPT to offer options for unanswered questions


BMC HelixGPT enhances service efficiency by offering end users the following options when it cannot provide relevant answers to questions about knowledge articles:

  • Submit a service request: BMC Helix Digital Workplace administrators can configure the Fallback prompt to seamlessly route unanswered questions into actionable requests.
  • (Controlled availability) Connect to a live agent: In addition to the fallback prompt available out of the box, administrators can customize the live chat prompt to enable live chat support. Users can connect with a live agent directly from BMC HelixGPT chat when BMC HelixGPT cannot generate relevant answers to their query. Additionally, users can choose to bypass the fallback flow and directly connect with a live agent at any time. When a chat session is escalated, the live agent receives a summary of the previous chat with BMC HelixGPT, enabling faster and more contextual support.

Note the following considerations when using the live chat capability:

  • Localization support is not available.
  • Attachments and links to knowledge articles cannot be shared between agents and users.
  • This capability is not supported in channels such as Microsoft Teams.

 

Using the Fallback prompt

When BMC Helix Digital Workplace Catalog administrators use the out-of-the-box Fallback prompt, they must specify a service that enables end users to submit a service request. By using the Publish Chat-enabled Service wizard, BMC Helix Digital Workplace Catalog administrators can either import an existing BMC Helix Digital Workplace service or download and import the sample service provided by BMC Software.

Click here to download the sample service.

When specifying a default service, BMC Helix Digital Workplace Catalog administrators must consider the following points:

  • Use a BMC Helix Digital Workplace Catalog native single-request service.
  • Do not localize the service or questions asked in the service.
  • Map only one service as a default service.
  • Use only text field and text area (without RTF) type questions in the service.
  • Add a maximum of three questions in one service request. 

To use the out-of-the-box Fallback prompt, BMC Helix Digital Workplace administrators perform the following tasks:

  1. Use HelixGPT Manager to copy the Fallback prompt
  2. Use the Publish Chat-enabled Service wizard to specify the default service request
  3. Use HelixGPT Manager to configure the live chat capability

 

 

Task 1: To use the Fallback prompt

Step 1: Log in to HelixGPT Agent Studio and create a custom skill.

Example:
You are an intelligent virtual assistant and you need to decide whether the input text is an information request.
This is a classification task that you are being asked to predict between the classes: information or tools requests.
Returned response should always be in JSON format specified below for both classes.
Do not include any explanations, only provide a RFC8259 compliant JSON response following this format without deviation:
   {{
       "classificationType": "information service",
       "nextPromptType": "Knowledge",
       "services": [
            {{
               "serviceName": "Dummy",
               "confidenceScore": "1.0",
               "nextPromptType": "Knowledge"
            }}
        ],
       "userInputText": "...."
    }}


Ensure these guidelines are met.

0. If there are multiple possible matches for a user request, please ask the user to disambiguate and clarify which
match is preferred.


1. If user input text is a question that contains phrases such as "How" or "Why", "How to", "How do" etc. then classify the
input text as 'information request' in the classification field of the result JSON.  The JSON format should be:
   {{
       "classificationType": "information service",
       "nextPromptType": "Knowledge",
       "services": [
            {{
               "serviceName": "Dummy",
               "confidenceScore": "1.0",
               "nextPromptType": "Knowledge"
            }}
        ],
       "userInputText": "...."
    }}
   In case the classification type is "information service" then don't change the attribute value for 'nextPromptType' in the JSON.


2.  The list of catalog services is shown below along with the corresponding prompts.

Use only this list.

List of catalog services and corresponding prompt types are:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Input Text: Sample input text1
Input Text: Sample input text2
   Service Name: Sample Service, Prompt Type: Sample Prompt Type
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Input Text: General Request
Input Text: Raise a service request
   Service Name: General Request, Prompt Type: General Request
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

3. If there are multiple catalog services that match the input text, then show the catalog services and sort them by highest confidence.
Set the "services" field in the result JSON.  'text' field should have the input text.  Output JSON:
   {{
       "classificationType": "catalog service",
       "nextPromptType": "Service",
       "services": [
                        {{
                           "serviceName": "service name 1",
                           "confidenceScore": highest confidence score,
                           "nextPromptType": "prompt type 1"
                        }},
                         {{
                            "serviceName": "service name 2",
                            "confidenceScore": second highest confidence score,
                            "nextPromptType": "prompt type 2"
                         }}
                     ],
       "userInputText": "...."
    }}


4. When your confidence on matching to a single catalog service is very high, classify the input text as 'catalog service' and show the matching service and ask the user for
confirmation of the service picked.
Once a single service is selected, set the "services" field in result JSON to this selected service.
'text' field should have the input text.  Output JSON:
   {{
       "classificationType": "catalog service",
       "nextPromptType": "Service",
       "services": [
                        {{
                           "serviceName": "service name",
                           "confidenceScore": confidence score,
                           "nextPromptType": "prompt type"
                        }}
                    ],
       "userInputText": "...."
    }}


5.  If the user input text is a query about
   a. a request or a service request,
   b. a list of requests or a list of service requests
   c. an appointment or a list of appointments
   d. a task or a list of tasks,
   e. a to-do or a list of to-dos
   f. what is the status of request REQXXXX
   g. what is the details of request REQXXXX
   h. summarize requests
   i. an existing request
   j. contains a string like REQXXXX
   k. what is the status of request XXXX
   l. what is the details of request XXXX
   m. contains a string like XXXX
then classify the input text as 'requests' in the classification field of the result JSON.  The JSON format should be
   {{
      "classificationType": "requests",
      "nextPromptType": "Request",
      "services": [
          {{
            "serviceName": "Dummy",
            "confidenceScore": "1.0",
            "nextPromptType": "Request"
          }}
       ],
      "userInputText": "...."
    }}


6. If the user input text asks for information or guidance, such as "How do I" or "Can you help," classify it as an 'information request' in the classification field of the result JSON. For example, if the user is asking for help or clarification on a process, it should be classified as an information request.
7. Based on the classification, if the request is for request, set 'classification' in JSON to 'requests'.
8. Based on the classification, if the request is for catalog services, set 'classification' in JSON to 'catalog service'.
9. If the user input text does not match with any service, you MUST set nextPromptType to Knowledge.
10. Return the response in JSON format only without any explanations.  You must ensure that you return a valid JSON response.

{input}
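The guidelines above require the model to return strict RFC 8259 JSON with fixed field names. As a minimal sketch (not product code; the field names come from the prompt text above, everything else is an assumption for illustration), a client-side check of that shape could look like this:

```python
import json

# Required fields taken from the router prompt's JSON format above.
REQUIRED_TOP_LEVEL = {"classificationType", "nextPromptType", "services", "userInputText"}
REQUIRED_SERVICE = {"serviceName", "confidenceScore", "nextPromptType"}

def is_valid_router_response(raw):
    """Return True if raw is valid JSON containing the expected fields."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return False
    if not isinstance(data, dict) or not REQUIRED_TOP_LEVEL.issubset(data):
        return False
    services = data.get("services")
    if not isinstance(services, list) or not services:
        return False
    return all(REQUIRED_SERVICE.issubset(s) for s in services)

sample = """{
    "classificationType": "information service",
    "nextPromptType": "Knowledge",
    "services": [{"serviceName": "Dummy", "confidenceScore": "1.0",
                  "nextPromptType": "Knowledge"}],
    "userInputText": "How do I reset my password?"
}"""
print(is_valid_router_response(sample))  # True
```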

Step 2: In the custom skill, copy the BMC Helix Digital Workplace Router prompt. This must be the starter prompt.

Step 3: To the same custom skill, copy the Knowledge prompt.

Step 4: To the same custom skill, copy the out-of-the-box Fallback prompt.
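Conceptually, the nextPromptType value returned by the Router prompt determines which of the copied prompts handles the next turn. The dispatch itself is internal to BMC HelixGPT; the sketch below is illustrative only, using the prompt-type names that appear in the router prompt:

```python
# Illustrative sketch only: how nextPromptType conceptually selects the
# next prompt in the custom skill. Not BMC HelixGPT product code.
PROMPTS = {
    "Knowledge": "Knowledge prompt",  # answer from knowledge articles
    "Service": "Service prompt",      # raise a catalog service request
    "Request": "Request prompt",      # look up existing requests
}
FALLBACK = "Fallback prompt"          # used when no match is found

def next_prompt(router_response):
    prompt_type = router_response.get("nextPromptType")
    return PROMPTS.get(prompt_type, FALLBACK)

print(next_prompt({"nextPromptType": "Knowledge"}))  # Knowledge prompt
```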

 

Task 2: To specify the default service request submitted by end users

  1. Log in to BMC Helix Innovation Studio and navigate to the Workspace tab. 
  2. Select Actions > Publish Chat-enabled Service.
    The Publish Chat-enabled Service wizard is displayed.
  3. On the Workspace tab, perform the following steps:
    1. From the Provider drop-down menu, select BMC HelixGPT.
    2. From the Application Name drop-down menu, select BMC Digital Workplace.
    3. From the Skill drop-down menu, select the appropriate skill.
    4. Click Next.
      The following screenshot shows the options on the Workspace tab:
      23_3_04_DWP_Wizard1_New.png
  4. On the Service to Chat-enable tab, select a service that you want to include.
    The following screenshot shows the options on the Service to Chat-enable tab:
    23_3_04_DWP_Wizard2_New.png
  5. On the User Request tab, perform the following steps:
    1. Click the Add button next to the Request Variation field.
      A blank field is displayed after the Request Variation field.
    2. In the blank field, add the following text to enable raising a service request through BMC HelixGPT:
      Raise a service request
    3. Click Next.
      The following screenshot shows the options on the User Request tab:
      23_3_04_DWP_Wizard3_new.png
  6. On the Questions tab, click Next.
    The following screenshot shows the Questions tab:
    23_3_04_DWP_Wizard4_New.png
  7. On the Publication tab, click the Publish button.
    The following screenshot shows the Publication tab:
    23_3_04_DWP_Wizard5_New.png
    The following confirmation message is displayed:
    23_3_04_DWP_Confirmation Message.png
  8. Click Confirm.
  9. Click Close.

The default service is defined.

 

Task 3: To configure the option to connect to a live agent

Step 1: In the Fallback prompt, in the No Results Found category, add an option to connect with a live agent.

Example:
Fallback Options by Categories:

  - No Results Found:
        - Raise a service request: Raise a service request
        - Chat with a Live Agent: Chat with a Live Agent
  - Failed Service Request:
        - Raise a service request: Raise a service request
  - Failed Router Classification:
        - Raise a service request: Raise a service request
  - System Error:
        - Raise a service request: Raise a service request
  - Other:
        - Raise a service request: Raise a service request
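The category-to-option mapping configured above can be viewed as a simple lookup in which only the No Results Found category offers the live-agent option. The sketch below is illustrative only (the option strings come from the example; the function itself is not part of BMC HelixGPT):

```python
# Illustrative sketch of the fallback-option mapping configured above.
FALLBACK_OPTIONS = {
    "No Results Found": ["Raise a service request", "Chat with a Live Agent"],
    "Failed Service Request": ["Raise a service request"],
    "Failed Router Classification": ["Raise a service request"],
    "System Error": ["Raise a service request"],
    "Other": ["Raise a service request"],
}

def options_for(category):
    # Unrecognized categories fall back to the generic "Other" options.
    return FALLBACK_OPTIONS.get(category, FALLBACK_OPTIONS["Other"])

print(options_for("No Results Found"))
# ['Raise a service request', 'Chat with a Live Agent']
```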

Step 2: Modify the Router prompt to add the Live Chat classification type.

Example:

  

   If the user input text is a query about
       a. connect to an agent
       b. want to talk to agent
       c. chat with live agent
       d. live agent
       e. agent
then classify the input text as 'live chat' in the classification field of the result JSON.  The JSON format should be
   {{
      "classificationType": "live chat",
      "nextPromptType": "Live Chat",
      "services": [
          {{
            "serviceName": "LiveChatService",
            "confidenceScore": "1.0",
            "nextPromptType": "Live Chat"
          }}
       ],
      "userInputText": "...."
    }}



Step 3: Link your Live Chat prompt to the custom skill that you are currently using. For details, see Live Chat prompt in the BMC HelixGPT documentation.

 

Result: Output after using the Fallback prompt

The following screenshot shows the option to raise a service request: 

1743586919820-338.png

The following screenshot shows the option to chat with a live agent:


1743586444488-180.png

The following screenshot shows how the BMC HelixGPT chat summary and conversation history are shared with the live agent:

1750753557254-554.png

 
