Tracing LLM calls with LangSmith


Use LangSmith to debug and monitor the LLM calls made by BMC HelixGPT. LangSmith provides visibility into the LLM calls and helps you detect errors. To use LangSmith, you must have a LangSmith account and an API key. The API key is used to log traces and run evaluations. In BMC HelixGPT Manager, set the value of the LangSmithAPIKey parameter to your LangSmith API key.

Before you begin

  1. Sign up for a personal or service account for LangSmith.
  2. Generate a personal or service API key.

To configure the LangSmith API Key for BMC HelixGPT

  1. Log in to BMC Helix Innovation Studio.
  2. Open HelixGPT Manager.
  3. On the Records tab, select GlobalSetting, and click Edit data.
  4. Select the LangSmithAPIKey record, and click Edit.
  5. In the Value field, specify the LangSmith API key that you generated, and click Save.
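
Outside of BMC HelixGPT Manager, LangSmith tracing is conventionally enabled through environment variables recognized by the LangSmith SDK. The following sketch shows those standard variable names; the key value is a placeholder, and how BMC HelixGPT consumes the LangSmithAPIKey parameter internally is not documented here, so treat this as an illustration of the general mechanism only:

```python
import os

# Conventional environment variables used by the LangSmith SDK.
# The key below is a placeholder -- substitute the key you generated.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "lsv2_pt_your_key_here"
os.environ["LANGCHAIN_PROJECT"] = "DemoDebug"

def tracing_enabled() -> bool:
    """Report whether LangSmith tracing would be active in this process."""
    return (
        os.environ.get("LANGCHAIN_TRACING_V2") == "true"
        and bool(os.environ.get("LANGCHAIN_API_KEY"))
    )

print(tracing_enabled())
```

With both variables set, traced runs are grouped under the project named in LANGCHAIN_PROJECT.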

To enable call tracing for a skill in BMC HelixGPT

  1. Log in to BMC Helix Innovation Studio.
  2. Open HelixGPT Manager.
  3. On the Records tab, select Skill, and click Edit data.
  4. In the data editor, select the skill for which you want to trace calls, and click Edit.
  5. Specify the following attributes in the Configuration field:
    • traceOn: Set to true to enable call tracing.
    • langSmithProjectName: Specify the LangSmith project name.
      The following code is an example of the attributes configured for LangSmith:

      {
         "traceOn": true,
         "langSmithProjectName": "DemoDebug"
      }

  6. Click Save.
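
Because the Configuration field takes raw JSON, a malformed value can silently break tracing. The following sketch is not part of BMC HelixGPT; it only uses the two attribute names from the example above to check a configuration string before you paste it in:

```python
import json

def validate_trace_config(raw: str) -> dict:
    """Parse a skill Configuration value and check the LangSmith attributes."""
    config = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(config.get("traceOn"), bool):
        raise ValueError("traceOn must be a JSON boolean (true/false)")
    if config["traceOn"] and not config.get("langSmithProjectName"):
        raise ValueError("langSmithProjectName is required when traceOn is true")
    return config

config = validate_trace_config(
    '{"traceOn": true, "langSmithProjectName": "DemoDebug"}'
)
print(config["langSmithProjectName"])
```

A common mistake this catches is quoting the boolean (`"traceOn": "true"`), which is a string in JSON, not `true`.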

Results

Data for each LangSmith project is visible on the LangSmith page. You can view the input query, the output sent by BMC HelixGPT, and any errors that occurred while the responses were generated. You can analyze the errors and take appropriate action to resolve them. The following image shows a sample of the LangSmith page:

23303_LangSmith.png


 
