Deriving insights from logs


Use the Explorer page in BMC Helix Log Analytics to analyze logs and get to the root cause of the issue that you are troubleshooting. 

Collect logs by using collection policies and analyze them from the Explorer page. Get to the root cause of an issue by using out-of-the-box options such as queries, time range, and fields.

BMC Helix Log Analytics uses the OpenSearch platform and OpenSearch Dashboards for processing and analyzing logs. You can analyze logs graphically or by focusing on specific fields in them from the Explorer > Discover tab. You can also export or download logs as CSV files for deeper analysis and insights.

On the Discover tab, you can view log records according to your permissions. Administrators can restrict access to log records by using collection policies. If anomalous logs are detected, the log records also display the severity of the anomaly.

The following video (2:53) illustrates how to analyze and visualize logs:

https://www.youtube.com/watch?v=fggAxALVs0w

Overview of the Discover tab

In BMC Helix Log Analytics, use the Explorer > Discover tab to derive insights from logs. The following figure displays the Discover tab on the Explorer page:

explorer_page.png

You can perform the following actions on the Discover tab:

  • Analyze logs
  • Search for and filter logs
  • Export logs
  • View the index pattern 

Analyzing logs

Log analysis involves searching, analyzing, and visualizing machine data generated by your IT systems and technology infrastructure to gain operational insights. Traditional data analytics tools are simply not built to handle the variety and volume of rapidly proliferating machine data.

In the Explorer > Discover tab, analyze logs by examining the log messages.

explorer_logs.png

After logs are collected through collection policies, they appear as log messages on the Discover tab. You can analyze these messages by performing the following actions:

  • Searching for relevant logs
    Use the search and filter options to locate relevant logs. You can refine the search results to perform analysis on the right logs.
  • Extracting fields
    Extract fields from the log messages to effectively search for relevant log messages.
  • Enriching logs
    Add meaningful information to log messages so that operators can resolve issues faster. For example, add host details such as the location of the host.
  • Detecting anomalies
    Anomalies are rare patterns or abnormalities that indicate a deviation from the normal behavior of system performance. BMC Helix Log Analytics provides automated analysis with machine learning (ML)-based anomaly detection of abnormal or rare log patterns. You can analyze anomalous logs to debug application errors and ensure optimum performance. You can proactively find concerns or errors before they become a problem.

Searching for and filtering logs

Use the Discover tab to search for specific log messages and filter them.

explorer_add_filter.png

Searching for alphanumeric strings in logs

On the Discover tab, use the following methods to search for a specific alphanumeric string:

  • In the Search field, enter the string that you are looking for in a log field. The format is: field_name:"search string".
    For example, to search for all logs in which status 501 is reported, enter status:501.
  • Click Add Filter and select a field. Operators are available according to the data type of the field that you selected. Enter the string and save the filter.
    For example, loglevel.keyword is error.

Important

  • To obtain the search results, search with complete keywords. If you search with partial keywords, search results are not displayed. For example, if you are searching for the Apache Logs bmc_tag, add the following search criteria: bmc_tag contains Apache Logs.
  • The Search field is case-sensitive. To obtain the search results, make sure that you search with the right character case.
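
A search like this can also be expressed as an OpenSearch query_string query against the underlying index. The following Python sketch is a general OpenSearch example, not a documented BMC Helix Log Analytics API; the endpoint URL, index pattern, and credentials are placeholders:

import requests

# Placeholder endpoint, index pattern, and credentials; substitute your own values.
OPENSEARCH_URL = "https://opensearch.example.com:9200"
INDEX = "log-*"

# The same syntax as the Search field: field_name:"search string"
body = {"query": {"query_string": {"query": "status:501"}}}

response = requests.post(
    f"{OPENSEARCH_URL}/{INDEX}/_search",
    json=body,
    auth=("user", "password"),  # placeholder credentials
    timeout=30,
)
for hit in response.json()["hits"]["hits"]:
    print(hit["_source"].get("message"))
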
Search results for sample logs for different queries

Sample logs:

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Tenant configurations Initialization done.
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

Query: message:Thread-MainThread

Description: All records where the message field contains "Thread-MainThread".

Search result:

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Tenant configurations Initialization done.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

Query: message:Thread

Description: All records where the message field contains "Thread".

Search result:

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Tenant configurations Initialization done.
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

Query: message:Thread*

Description: All records where the message field contains the word "Thread" followed by other characters (excluding spaces).

Search result:

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Tenant configurations Initialization done.
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

Query: message:Starting kafka consumer

Description: All records where the message field contains any of the following words: Starting, kafka, or consumer. Here, a space is treated as the OR operator in the search.

Search result:

Thread-MainThread - Starting log processing service...
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.

Query: message:Starting Kafka con*

Description: All records where the message field contains any of the following: Starting, Kafka, or a word that starts with con.

Search result:

Thread-MainThread - Starting log processing service...
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - Consumed CPU utilization exceeded threshold.

Query: message:"Starting kafka consumer"

Description: All records where the message field contains the complete string "Starting kafka consumer".

Search result:

Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert

Query: message:"Starting kafka con*"

Description: All records where the message field contains the complete string "Starting kafka con*". Inside quotation marks, the * character is not treated as a wildcard character.

Search result:

No results.

Query: message:*ProcessStart=The AND message:process AND message:started*

Description: All records where the message field contains all the following strings in a log message:

  • *ProcessStart=The
  • process
  • started*

Here, the AND operator filters log messages that contain all the strings mentioned in the query.

Search result:

Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.

Filtering search results by time range and date

Use the following options to filter logs according to the time range and date:

  • Specify the number of days or hours for which you want search results.
    For example, search results for the last 15 minutes or the last 7 days.
  • Set a specific date and time (an absolute time range).
    For example, search results from Jul 18, 2022 18:00 to Jul 19, 2022 18:00.
    Abs_Rel_TimeRange.png
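
In OpenSearch query DSL, such time-picker selections correspond to range filters on the @timestamp field. The following minimal Python sketch mirrors the two examples above; the values are illustrative:

# Relative window: the last 15 minutes (OpenSearch date math).
relative_filter = {"range": {"@timestamp": {"gte": "now-15m"}}}

# Absolute window: Jul 18, 2022 18:00 to Jul 19, 2022 18:00 (UTC).
absolute_filter = {
    "range": {
        "@timestamp": {
            "gte": "2022-07-18T18:00:00Z",
            "lte": "2022-07-19T18:00:00Z",
        }
    }
}

# Either clause can be combined with a search query inside a bool filter.
query = {"query": {"bool": {"filter": [relative_filter]}}}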

Supported time formats

The log generation time is saved in the @timestamp field. The time of the collected logs must be in the ISO 8601 Zulu format, for example, 2022-02-20T12:21:32.756Z. If the log generation time is specified in any other format, it is saved in the @@timestamp field, and the log collection time is saved in the @timestamp field. The log collection time is available in the Greenwich Mean Time (GMT) time zone.

If you are collecting logs by using external agents such as Logstash and Filebeat, the Epoch time format is supported. However, if you are collecting logs by using the Docker, Windows, or Linux connector, the Epoch time format is not supported.
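
For illustration, the following Python sketch produces both representations, an ISO 8601 Zulu timestamp and an epoch value:

from datetime import datetime, timezone

now = datetime.now(timezone.utc)

# ISO 8601 Zulu format expected in the @timestamp field,
# for example, 2022-02-20T12:21:32.756Z
iso_zulu = now.strftime("%Y-%m-%dT%H:%M:%S.") + f"{now.microsecond // 1000:03d}Z"
print(iso_zulu)

# Epoch time (seconds since 1970-01-01 UTC); supported only when logs are
# collected through external agents such as Logstash and Filebeat
print(int(now.timestamp()))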

Filtering search results based on fields

You can filter logs based on their fields. The fields identified in logs are displayed in the Available fields section. Click a field and select a value to filter logs based on the field.

explorer_available_fields.png

Example

If you want to filter all logs that have a specific IP address, perform the following steps:

  1. In the Available fields section, click the ipAddress field.
  2. Select an IP address and click Filter.
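
In OpenSearch query DSL, a field filter of this kind can be expressed as an exact-match term clause. The following is a minimal sketch; the field name and IP address are illustrative placeholders:

# A bool query whose filter clause matches an exact field value.
query = {
    "query": {
        "bool": {
            "filter": [{"term": {"ipAddress": "10.0.0.1"}}]
        }
    }
}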

Tip

If a sign appears instead of the data type icon of a field, refresh the index on the Stack Management > Index pattern > Index pattern name page. Alternatively, wait about 5 minutes for the index to refresh automatically, and then refresh your browser.

Saving searches

Save the search query that you have created by using the Search field, available fields, and time period fields. Later, open the saved search to get the same results.

  1. Click Save.
  2. Enter a name.
    DiscoverPage2.png
  3. To access the saved search, click Open.

Exporting logs as reports

Use the Explorer page to export logs in the CSV format. Use the exported log reports for enhanced data analysis, effective troubleshooting, and improved collaboration by sharing the reports with other stakeholders.

The export logs feature is supported only on OpenSearch 2.x.

You can perform the following actions from the Explorer page:

  • Generate and download reports
  • View previously generated reports

To generate a report and download it in the CSV format

  1. In BMC Helix Log Analytics, navigate to the Explorer tab.
  2. Perform one of the following actions:

    • If you have a saved search, click Open and click the saved search.
    • If you don't have a saved search, click Save.

    For more information about saving the search, see Saving searches earlier in this topic.

  3. Click Reporting.
  4. In the GENERATE AND DOWNLOAD menu, click Generate CSV.
    The report is saved in the CSV format in the local directory that you specify.
    You can export a maximum of 10000 logs per report.
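
If you prefer to script an export instead of using the UI, the following Python sketch is one way to pull up to 10,000 matching logs from an OpenSearch endpoint and write them to a CSV file. This is a general OpenSearch example, not a documented BMC Helix Log Analytics API; the endpoint, index pattern, query, and credentials are placeholders:

import csv
import requests

# Placeholder endpoint, index pattern, and credentials; substitute your own values.
OPENSEARCH_URL = "https://opensearch.example.com:9200"
INDEX = "log-*"

body = {
    "size": 10000,  # matches the 10,000-logs-per-report limit noted above
    "query": {"query_string": {"query": "loglevel:error"}},  # illustrative query
}
response = requests.post(f"{OPENSEARCH_URL}/{INDEX}/_search",
                         json=body, auth=("user", "password"), timeout=60)

with open("log_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["@timestamp", "message"])
    for hit in response.json()["hits"]["hits"]:
        source = hit["_source"]
        writer.writerow([source.get("@timestamp", ""), source.get("message", "")])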

To view previously generated reports

  1. In BMC Helix Log Analytics, navigate to the Explorer tab.
  2. Click Reporting.
  3. In the GENERATE AND DOWNLOAD menu, click View reports to view a list of generated reports.
    You can download the generated reports in the CSV format.

Configuring report definitions

Customize log reports by using report definitions, where you specify parameters such as the log data sources and time ranges for gathering logs. You can also use report definitions to schedule reports to run later or on a recurring cadence for consistent insights and improved operational efficiency.

On the Reporting page, use the Report definitions section to select a report and edit or delete it.

To configure report definitions:

  1. In BMC Helix Log Analytics, navigate to the Explorer tab.
  2. Select Reporting > View reports.
    The Reports page is displayed.
  3. Under the Report definitions section, click Create to create a report definition for generating customized and automated reports.
    The Create report definition page is displayed.
  4. In the Report settings section, enter the following parameters:
    1. In the Name field, enter the report name.
    2. (Optional) In the Description field, enter the report description.
    3. From the Select saved search list, select a saved search for which you want to generate the report.
    4. In the Time range field, select the time range for generating reports. You can also choose a custom date range.
      The default time range is the last 30 minutes.
    5. In the Report trigger field, choose whether to generate the report immediately or on a schedule, according to your requirements.

Overview of index patterns

An index pattern refers to the configuration that defines how logs and data are ingested, organized, and indexed within the system for efficient log analysis. A default index pattern is already created in BMC Helix Log Analytics. All logs are collected under this index pattern. You can neither delete it nor create a new pattern.

By default, the rollover duration of the default index pattern is 3 days. Therefore, you can see data from the last 3 days on the Explorer page.

The following index patterns are also available:

  • logarc_*
  • logml-*

The logarc_* index pattern

After you enable log archiving, a new index pattern in the format logarc_* is added to the Discover page. All the logs collected after archiving is enabled for your tenant are shown in the new index pattern. Data collected before archiving was enabled continues to appear in the earlier index pattern. The archived and restored data is available only in the new index pattern. Therefore, to analyze logs that are collected after archiving is enabled, use the logarc_* index pattern.

The logml-* index pattern

After an anomaly or rare pattern is detected in logs, it is reported in a new index pattern whose format is logml-*.

 

Learn more

Read the following blog to learn how logs help you understand the health of your environment, identify issues, and track their root cause: Observability with logs to accelerate MTTR.

 

Tip: For faster searching, add an asterisk to the end of your partial query. Example: cert*