Deriving insights from logs
Use the Explorer page in BMC Helix Log Analytics to analyze logs and get to the root cause of the issue that you are troubleshooting.
Collect logs by using collection policies and analyze them from the Explorer page. Get to the root cause of an issue by using out-of-the-box options such as queries, time range, and fields.
BMC Helix Log Analytics uses the OpenSearch platform and OpenSearch Dashboards for processing and analyzing logs. You can analyze logs graphically or by focusing on specific fields in them from the Explorer > Discover tab.
You can perform the following actions on the Discover tab:
Analyze logs
Search for and filter logs
View the index pattern
On the Discover tab, you can view log records according to your permissions. Administrators can restrict access to log records by using collection policies. If a log is anomalous, its record also displays the severity of the log anomaly.
(Only for users who have migrated from OpenSearch to ClickHouse) Maintain access to historical OpenSearch logs
After migrating to ClickHouse, you can still access and review historical logs stored in OpenSearch, ensuring continuity in log analysis without data loss or the need to ingest logs again. Click Explorer (Pre-Upgrade) to view the OpenSearch logs.
Explorer (Pre-Upgrade) is not visible if you have not migrated to ClickHouse.
The following video (2:53) illustrates how to analyze and visualize logs:
Log analysis involves searching, analyzing, and visualizing machine data generated by your IT systems and technology infrastructure to gain operational insights. Traditional data analytics tools are simply not built to handle the variety and volume of rapidly proliferating machine data.
In the Explorer > Discover tab, use the log messages to analyze logs.
After logs are collected through collection policies, they appear as log messages on the Discover tab. You can analyze these messages by performing the following actions:
Searching for relevant logs: Use the search and filter options to locate relevant logs. You can refine the search results to perform analysis on the right logs.
Extracting fields: Extract fields from the log messages to search for relevant log messages more effectively.
Enriching logs: Add meaningful information to log messages so that operators can resolve issues faster. For example, add host details such as the location of the host.
Detecting anomalies: Anomalies are rare patterns or abnormalities that indicate a deviation from the normal behavior of system performance. BMC Helix Log Analytics provides automated, machine learning (ML)-based detection of abnormal or rare log patterns. You can analyze anomalous logs to debug application errors and ensure optimum performance, and proactively find errors before they become a problem.
Searching for and filtering logs
Use the Discover tab to search for specific log messages and filter them.
Searching for alphanumeric strings in logs
On the Discover tab, use the following methods to search for a specific alphanumeric string:
In the Search field, enter the string that you are looking for in a log field, in the format field_name:"search string". For example, to search for all logs where status 501 is reported, enter status:501.
Click Add Filter and select a field. Operators are available according to the data type of the field that you selected. Enter the string and save the filter. For example, loglevel.keyword is error.
Important
To obtain the search results, search with complete keywords. If you search with partial keywords, search results are not displayed. For example, if you are searching for the Apache Logs bmc_tag, add the following search criteria: bmc_tag contains Apache Logs.
The Search field is case-sensitive. To obtain the search results, make sure that you search with the right character case.
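The field searches described above map onto OpenSearch's query_string query when you call the search API directly. The following sketch only builds the request body; the index name and the combination with a time-range filter are assumptions for illustration, not the product's internal implementation:

```python
# Sketch: express a Discover-style search such as status:501 as an
# OpenSearch request body. The time-range filter on @timestamp mirrors
# the time picker on the Discover tab.
import json

def build_search_body(query, gte="now-15m", lte="now"):
    """Build an OpenSearch body for a query_string search restricted
    to a time range on the @timestamp field."""
    return {
        "query": {
            "bool": {
                "must": [{"query_string": {"query": query}}],
                "filter": [
                    {"range": {"@timestamp": {"gte": gte, "lte": lte}}}
                ],
            }
        }
    }

body = build_search_body('status:501 AND loglevel.keyword:error')
print(json.dumps(body, indent=2))
```

You would POST this body to an index's _search endpoint; the query string itself uses the same field_name:"search string" syntax as the Search field.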
Search results for sample logs for different queries
Sample logs:
Thread-MainThread - Starting log processing service... Thread-MainThread - Initializing tenant configurations Thread-MainThread - Fetch properties for tenant=bmc Thread-MainThread - Initialized tenant configurations for tenant=bmc Thread-MainThread - Fetch properties for tenant=hp Thread-MainThread - Initialized tenant configurations for tenant=hp Thread-MainThread - Consumed CPU utilization exceeded threshold. Thread-MainThread - Tenant configurations Initialization done. Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert Thread-KafkaConsumer - Initializing kafka consumer... Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started. Thread-MainThread - [ProcessStart=The process of log processing service started. Thread-MainThread - [ProcessStart=The job process of log processing service has started. Thread-MainThread - [ProcessStart=The job process of log processing service has finished. Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.
Query
Description
Search result
message:Thread-MainThread
All records where the message field contains "Thread-MainThread".
Thread-MainThread - Starting log processing service... Thread-MainThread - Initializing tenant configurations Thread-MainThread - Fetch properties for tenant=bmc Thread-MainThread - Initialized tenant configurations for tenant=bmc Thread-MainThread - Fetch properties for tenant=hp Thread-MainThread - Consumed CPU utilization exceeded threshold. Thread-MainThread - Initialized tenant configurations for tenant=hp Thread-MainThread - Tenant configurations Initialization done. Thread-MainThread - [ProcessStart=The process of log processing service started. Thread-MainThread - [ProcessStart=The job process of log processing service has started. Thread-MainThread - [ProcessStart=The job process of log processing service has finished. Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.
message:Thread
All records where the message field contains "Thread".
Thread-MainThread - Starting log processing service... Thread-MainThread - Initializing tenant configurations Thread-MainThread - Fetch properties for tenant=bmc Thread-MainThread - Initialized tenant configurations for tenant=bmc Thread-MainThread - Fetch properties for tenant=hp Thread-MainThread - Initialized tenant configurations for tenant=hp Thread-MainThread - Consumed CPU utilization exceeded threshold. Thread-MainThread - Tenant configurations Initialization done. Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert Thread-KafkaConsumer - Initializing kafka consumer... Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started. Thread-MainThread - [ProcessStart=The process of log processing service started. Thread-MainThread - [ProcessStart=The job process of log processing service has started. Thread-MainThread - [ProcessStart=The job process of log processing service has finished. Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.
message:Thread*
All records where the message field contains the word Thread followed by other characters (excluding space).
Thread-MainThread - Starting log processing service... Thread-MainThread - Initializing tenant configurations Thread-MainThread - Fetch properties for tenant=bmc Thread-MainThread - Initialized tenant configurations for tenant=bmc Thread-MainThread - Fetch properties for tenant=hp Thread-MainThread - Initialized tenant configurations for tenant=hp Thread-MainThread - Consumed CPU utilization exceeded threshold. Thread-MainThread - Tenant configurations Initialization done. Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert Thread-KafkaConsumer - Initializing kafka consumer... Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started. Thread-MainThread - [ProcessStart=The process of log processing service started. Thread-MainThread - [ProcessStart=The job process of log processing service has started. Thread-MainThread - [ProcessStart=The job process of log processing service has finished. Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.
message:Starting kafka consumer
All records where the message field contains any of the following words: Starting, Kafka, or consumer. Here, space is treated as the OR operator in the search.
Thread-MainThread - Consumed CPU utilization exceeded threshold.
message:"Starting kafka consumer"
All records where the message field contains the complete string "Starting Kafka consumer".
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert.
message:"Starting kafka con**"
All records where the message field contains the complete string "Starting kafka con**". Here, the * characters are not treated as wildcard characters because they are inside quotation marks.
No results.
message:*ProcessStart=The AND message:process AND message:started*
All records where the message field contains all the following strings in a log message:
*ProcessStart=The
process
started*
Here, you are filtering log messages that contain all the strings that you mention in the query by using the AND operator.
Thread-MainThread - [ProcessStart=The process of log processing service started. Thread-MainThread - [ProcessStart=The job process of log processing service has started.
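The quoted-phrase and wildcard rules in the table can be approximated in plain Python. This is a rough sketch of the matching semantics only, not the actual OpenSearch analyzer, which also tokenizes and, depending on the field mapping, stems words (which is why unquoted terms can match more records than a literal comparison suggests). The sample list is a small assumed subset of the logs above:

```python
# Rough approximation of the Discover search rules:
# - quoted text is matched as a verbatim phrase (* is literal inside quotes)
# - unquoted terms are ORed, and * matches non-space characters
import re

SAMPLE = [
    "Thread-MainThread - Starting log processing service...",
    "Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert",
    "Thread-MainThread - [ProcessStart=The process of log processing service started.",
    "Thread-MainThread - Consumed CPU utilization exceeded threshold.",
]

def matches(message, term):
    """Approximate one search term against one message."""
    if term.startswith('"') and term.endswith('"'):
        # Quoted phrase: must appear verbatim, case-insensitively.
        return term.strip('"').lower() in message.lower()
    # Unquoted term: * expands to non-space characters, and the term
    # must cover a whole whitespace-delimited token.
    pattern = re.escape(term).replace(r"\*", r"\S*")
    return re.search(rf"(?<!\S){pattern}(?!\S)", message, re.IGNORECASE) is not None

def search(query):
    """Space-separated unquoted terms act as OR; quoted text is a phrase."""
    terms = re.findall(r'"[^"]*"|\S+', query)
    return [m for m in SAMPLE if any(matches(m, t) for t in terms)]

print(search('"Starting kafka consumer"'))  # only the Kafka consumer line
print(search('Thread*'))                    # every sample line
```

The AND operator from the last table row is omitted here to keep the sketch short; it would simply require every term to match instead of any.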
Filtering search results by time range and date
Use the following options to filter logs according to the time range and date:
Specify days or hours for which you want to search results. For example, search results for last 15 minutes or last 7 days.
Set a specific (absolute) date and time. For example, search results from Jul 18, 2022 18:00 hours to Jul 19, 2022 18:00 hours.
Supported time formats
The log generation time is saved in the @timestamp field. The time of the collected logs must be in the ISO 8601 ZULU format, for example, 2022-02-20T12:21:32.756Z. If the log generation time is specified in any other format, it is saved in the @@timestamp field, and the log collection time is saved in the @timestamp field. The log collection time is available in the Greenwich Mean Time (GMT) time zone.
If you are collecting logs by using external agents such as Logstash and Filebeat, the Epoch time format is supported. However, if you are collecting logs by using the Docker, Windows, or Linux connector, the Epoch time format is not supported.
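As a worked example of the formats above, an Epoch timestamp (in milliseconds, as agents such as Logstash and Filebeat commonly emit) can be rendered as the ISO 8601 Zulu string that the @timestamp field expects. This is an illustrative sketch, not part of the product:

```python
# Convert epoch milliseconds to the ISO 8601 Zulu format expected by
# @timestamp, for example 2022-02-20T12:21:32.756Z.
from datetime import datetime, timezone

def epoch_ms_to_zulu(epoch_ms):
    """Render epoch milliseconds as an ISO 8601 UTC (Zulu) string."""
    sec, ms = divmod(epoch_ms, 1000)  # keep milliseconds exact (no float)
    dt = datetime.fromtimestamp(sec, tz=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{ms:03d}Z"

print(epoch_ms_to_zulu(1645359692756))  # → 2022-02-20T12:21:32.756Z
```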
Filtering search results based on fields
You can filter logs based on their fields. The fields identified in logs are displayed in the Available fields section. Click a field and select a value to filter logs based on the field.
Example
If you want to filter all logs that have a specific IP address, perform the following steps:
In the Available fields section, click the ipAddress field.
Select an IP Address and click Filter.
Tip
If you see the ? sign instead of the data type icon of a field, refresh the index on the Stack Management > Index pattern > Index pattern name page. Alternatively, wait for around 5 minutes until the index is refreshed automatically, and then refresh your browser.
Saving searches
Save the search query that you have created by using the search field, available fields, and time period fields. In the future, access the saved search to get similar results.
Click Save.
Enter a name.
To access the saved search, click Open.
Overview of index patterns
An index pattern refers to the configuration that defines how logs and data are ingested, organized, and indexed within the system for efficient log analysis. A default index pattern is already created in BMC Helix Log Analytics, and all logs are collected under it. You can neither delete this pattern nor create a new one.
By default, the rollover duration of the default index pattern is 3 days. Therefore, you can see data from the last 3 days on the Explorer page.
The following index patterns are also available:
logarc_*
logml-*
The logarc_* index pattern
After you enable log archiving, a new index pattern in the format logarc_* is added to the Discover page. All logs collected after archiving is enabled for your tenant are shown in the new index pattern. Data collected before archiving was enabled continues to appear in the earlier index pattern. Archived and restored data is available only in the new index pattern. Therefore, to analyze logs that are collected after archiving is enabled, use the logarc_* index pattern.
The logml-* index pattern
After an anomaly or rare pattern is detected in logs, it is reported in a new index pattern whose format is logml-*.
Learn more
Read the following blog to learn how logs help you understand the health of your environment, identify issues, and track their root cause: Observability with logs to accelerate MTTR.