Deriving insights from logs

Get to the root cause of an issue and help ensure system uptime by using BMC Helix Log Analytics. Take advantage of out-of-the-box options such as queries, time range, and fields to quickly find information about an issue in the log details.

Search and analyze logs on the Explorer > Discover tab. The following figure highlights the options on the Discover page that help you get to the root cause of an issue:


Example

You observe multiple log entries with a 501 status in your Single Sign-On application logs. A 501 status indicates that the server could not fulfill those requests, and you want to keep the service available. Use the search options to narrow down the results and find the root cause. For quick reference, add the filtered logs to dashboards.

The following video (2:53) illustrates how to analyze and visualize logs:


Watch the YouTube video about analyzing logs in BMC Helix Log Analytics.


Index pattern overview

By default, an index pattern is created for you. All the logs are collected under this index pattern. You can neither delete this index pattern nor create a new one.

After archiving is enabled, a new index pattern in the format logarc_* is added to the Discover page. All logs collected after archiving is enabled for your tenant are shown in the new index pattern; data collected before archiving was enabled continues to appear in the earlier index pattern. Archived and restored data is available only in the new index pattern. Therefore, to analyze logs collected after archiving is enabled, use the logarc_* index pattern.

After an anomaly or a rare pattern is detected in logs, it is reported in a new index pattern in the format logml-*.
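Both index pattern names end in a wildcard. As a rough illustration of how such patterns select indexes (the index names below are hypothetical; real names depend on your tenant), Python's fnmatch can mimic the wildcard matching:

```python
from fnmatch import fnmatch

# Hypothetical index names for illustration only.
indexes = [
    "log_2022.07.18",
    "logarc_2022.07.19",
    "logarc_2022.07.20",
    "logml-anomaly-2022.07.20",
]

# logarc_* selects archived-era indexes; logml-* selects anomaly indexes.
archived = [name for name in indexes if fnmatch(name, "logarc_*")]
anomalies = [name for name in indexes if fnmatch(name, "logml-*")]

print(archived)
print(anomalies)
```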

Options to search for specific information

Use the following options to search for a specific alphanumeric string:

  • Search field: Enter the string that you are looking for in a field. The format is field_name:"search string". For example, to search for all logs where status 501 is reported, enter status:501.
  • Filter: Click Add Filter and select a field. Operators are available based on the data type of the selected field. Enter the string and save the filter. For example, loglevel.keyword is error.
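Conceptually, a field:value search keeps only the records whose field matches the given value. A minimal sketch in Python, using hypothetical records represented as dictionaries (this is an illustration, not the product's actual query engine):

```python
def filter_logs(records, field, value):
    """Return records whose `field` equals `value`, like a status:501 search."""
    return [r for r in records if str(r.get(field)) == str(value)]

# Hypothetical log records for illustration.
records = [
    {"status": 501, "message": "Request failed"},
    {"status": 200, "message": "OK"},
    {"status": 501, "message": "Request failed"},
]

matches = filter_logs(records, "status", 501)
print(len(matches))  # 2
```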

Sample logs:

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Tenant configurations Initialization done.
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

The following examples show search queries, a description of what each query matches, and the resulting records from the sample logs:
message:Thread-MainThread

All records where the message field contains "Thread-MainThread".

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Tenant configurations Initialization done.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

message:Thread

All records where the message field contains "Thread".

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Tenant configurations Initialization done.
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

message:Thread*

All records where the message field contains the word Thread followed by other characters (excluding space).

Thread-MainThread - Starting log processing service...
Thread-MainThread - Initializing tenant configurations
Thread-MainThread - Fetch properties for tenant=bmc
Thread-MainThread - Initialized tenant configurations for tenant=bmc
Thread-MainThread - Fetch properties for tenant=hp
Thread-MainThread - Initialized tenant configurations for tenant=hp
Thread-MainThread - Consumed CPU utilization exceeded threshold.
Thread-MainThread - Tenant configurations Initialization done.
Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert
Thread-KafkaConsumer - Initializing kafka consumer...
Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.
Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
Thread-MainThread - [ProcessStart=The job process of log processing service has finished.
Thread-MainThread - [ProcessStart=The job process of log processing service has terminated.

message:Starting kafka consumer

All records where the message field contains any of the following words: Starting, kafka, or consumer. Here, a space is treated as the OR operator in the search.

Thread-MainThread - Starting log processing service.

Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert.

Thread-KafkaConsumer - Initializing kafka consumer...

Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.

message:Starting Kafka con*

All records where the message field contains any of the following: Starting, Kafka, or a word that starts with con.

Thread-MainThread - Starting log processing service.

Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert.

Thread-KafkaConsumer - Initializing kafka consumer...

Thread-KafkaConsumer - Kafka consumer alert_kakfa_consumer started.

Thread-MainThread - Consumed CPU utilization exceeded threshold.

message:"Starting kafka consumer"

All records where the message field contains the complete string "Starting kafka consumer".

Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert.

message:"Starting kafka con*"

All records where the message field contains the complete string "Starting kafka con*". Here, the * character is not treated as a wildcard character.

No results.

message:*ProcessStart=The AND message:process AND message:started*

All records where the message field contains all the following strings in a log message:

  • *ProcessStart=The
  • process
  • started*

Here, you are filtering log messages that contain all the strings that you mention in the query by using the AND operator.  

Thread-MainThread - [ProcessStart=The process of log processing service started.
Thread-MainThread - [ProcessStart=The job process of log processing service has started.
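The semantics described above (a space acts as OR, a quoted string is a phrase, and * is a wildcard) can be sketched roughly in Python. This is only an illustration of the behavior, not the product's actual query engine; in particular it uses substring matching rather than true term matching:

```python
import re

def matches(message, query):
    """Rough emulation of the search semantics described above:
    a quoted query is a phrase match; otherwise space-separated
    terms are ORed, and * matches a run of non-space characters."""
    if query.startswith('"') and query.endswith('"'):
        return query.strip('"').lower() in message.lower()
    for term in query.split():
        pattern = re.escape(term).replace(r"\*", r"\S*")
        if re.search(pattern, message, re.IGNORECASE):
            return True
    return False

# A subset of the sample logs from this page.
logs = [
    "Thread-MainThread - Starting log processing service...",
    "Thread-KafkaConsumer - Starting kafka consumer alert_kakfa_consumer for topic=alert",
    "Thread-KafkaConsumer - Initializing kafka consumer...",
    "Thread-MainThread - Consumed CPU utilization exceeded threshold.",
]

print(sum(matches(m, "Starting kafka consumer") for m in logs))    # space = OR
print(sum(matches(m, '"Starting kafka consumer"') for m in logs))  # phrase match
```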

Options to filter search results by time range and date

You can use the following options to set a date range and narrow down your search results:

  • Specify the number of days or hours for which you want search results. For example, search results for the last 15 minutes or the last 7 days.
  • Set a specific (absolute) date and time. For example, search results from Jul 18, 2022 18:00 hours to Jul 19, 2022 18:00 hours.
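Both options resolve to the same thing: an absolute window of time. A small Python sketch that converts the relative range ("last 15 minutes") and the absolute range from the example above into concrete timestamps:

```python
from datetime import datetime, timedelta, timezone

# Relative range: "last 15 minutes" expressed as an absolute window.
now = datetime.now(timezone.utc)
start = now - timedelta(minutes=15)
print(start.isoformat(), "to", now.isoformat())

# Absolute range from the example above: Jul 18, 2022 18:00 to Jul 19, 2022 18:00.
abs_start = datetime(2022, 7, 18, 18, 0, tzinfo=timezone.utc)
abs_end = datetime(2022, 7, 19, 18, 0, tzinfo=timezone.utc)
print((abs_end - abs_start).total_seconds() / 3600)  # 24.0 hours
```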

Supported time formats

The log generation time is saved in the @timestamp field. The time of the collected logs must be in the ISO 8601 UTC ("Zulu") format; for example, 2022-02-20T12:21:32.756Z. If the log generation time is specified in any other format, it is saved in the @@timestamp field, and the log collection time is saved in the @timestamp field. The log collection time is in the Greenwich Mean Time (GMT) time zone.

If you are collecting logs by using an external agent such as Logstash or Filebeat, the Epoch time format is supported. However, if you are collecting logs by using the Docker, Windows, or Linux connector, the Epoch time format is not supported.
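For reference, the following Python sketch parses the ISO 8601 Zulu format shown above and converts it to an epoch-milliseconds value. This is hypothetical helper code for working with the two formats, not part of the product:

```python
from datetime import datetime, timezone

# ISO 8601 "Zulu" (UTC) format, as stored in @timestamp.
ts = "2022-02-20T12:21:32.756Z"
parsed = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
print(parsed.year, parsed.tzinfo)

# Epoch milliseconds (the format supported for Logstash/Filebeat),
# representing the same instant.
epoch_ms = round(parsed.timestamp() * 1000)
print(datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc).isoformat())
```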

Fields available to filter logs

The fields identified in the logs are displayed in the Available fields section. Click a field and select a value to filter logs based on that field. For example, click the ipAddress field and select an IP address to search for all logs where ipAddress has the selected value. To add a field as a column in the search results, hover over the field name and click the + symbol.

Tip

If you see a '?' sign in place of a field's data type icon, refresh the index on the index pattern page (Stack Management > Index pattern > index pattern name), or wait about 5 minutes for the index to refresh automatically and then refresh your browser.

After using these options to analyze logs, you see the filtered records that require immediate attention:


To save the search

Save the search query that you have created by using the search field, available fields, and time period. In the future, access the saved search to get similar results.

  1. Click Save.
  2. Enter a name.
  3. To access the saved search, click Open.

To add the saved search to a visualization

  1. Select Visualize > Create new visualization.
  2. Select the type of visualization that you want to use.
    For example, a line chart.
  3. Select the search that you have saved.
  4. Apply additional filters to the data and save the visualization.
  5. To add the visualization to a dashboard:
    1. Click Dashboard.
    2. Create a new dashboard or edit an existing one.
    3. Click Add and select the visualization.

Where to go from here

Visualizing logs

Learn more

Read the following blog to learn how logs help you understand the health of your environment, identify issues, and track their root cause: Observability with logs to accelerate MTTR.
