Tracing LLM calls with LangSmith
Before you begin
- Sign up for a personal or service account for LangSmith.
- Generate a personal or service API key.
To configure the LangSmith API Key for BMC HelixGPT
- Log in to BMC Helix Innovation Studio and open HelixGPT Agent Studio.
- On the Records tab, select GlobalSetting, and click Edit data.
- Select the LangSmithAPIKey record, and click Edit.
- In the Value field, specify the LangSmith API key that you generated, and click Save.
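The record-based configuration above is specific to BMC HelixGPT. For comparison, when tracing calls from your own code, the LangSmith SDK conventionally reads the key from environment variables. A minimal sketch, assuming the standard LangSmith variable names; the key value here is a placeholder, not a real key:

```python
import os

# Assumption: standard LangSmith environment variables used outside BMC HelixGPT.
os.environ["LANGCHAIN_TRACING_V2"] = "true"              # enable tracing
os.environ["LANGCHAIN_API_KEY"] = "lsv2_pt_example_key"  # placeholder personal/service key
os.environ["LANGCHAIN_PROJECT"] = "DemoDebug"            # optional: target project name

print(os.environ["LANGCHAIN_PROJECT"])
```

With these variables set, LangChain-based applications send trace data to the named project automatically.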
To enable call tracing for a skill in BMC HelixGPT
- Log in to BMC Helix Innovation Studio.
- From the Application launcher, select HelixGPT Agent Studio.
- On the Skills tab, select the application and skill.
- Click the Configure tab.
- Set the traceOn parameter to true, and specify the project name in the langSmithProjectName parameter.
The following code is an example of the parameters configured for LangSmith:

{
  "traceOn": true,
  "langSmithProjectName": "DemoDebug"
}

- Click Apply.
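Because the parameters are entered as JSON, a malformed value (for example, traceOn set to the string "true" instead of a boolean) can prevent tracing from being enabled. A small sketch that checks a parameter snippet parses and has the expected types; this check is illustrative and not part of BMC HelixGPT:

```python
import json

# The same parameter snippet shown above.
params_text = '''
{
  "traceOn": true,
  "langSmithProjectName": "DemoDebug"
}
'''

params = json.loads(params_text)  # raises json.JSONDecodeError on malformed JSON

# traceOn must be a real JSON boolean, not the string "true".
assert isinstance(params["traceOn"], bool)
# The project name must be a non-empty string.
assert isinstance(params["langSmithProjectName"], str) and params["langSmithProjectName"]

print("parameters look valid:", params)
```

Running a check like this before pasting the JSON into the Configure tab catches quoting and typing mistakes early.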
Results
Data for each LangSmith project is visible on the LangSmith page. You can view the input query, the output that BMC HelixGPT returned, and any errors that occurred while generating responses. You can analyze the errors and take appropriate action to resolve them. The following image shows a sample of the LangSmith page:
