Deriving insights by analyzing logs


This end-to-end use case describes how to derive insights about issues by searching and analyzing logs.

User personas

The tasks in this end-to-end use case involve the following personas:

  • Tenant administrator: To configure log ingestion and log preparation
  • Operator: To filter data and create a dashboard

About BMC Helix Log Analytics

BMC Helix Log Analytics enables you to collect logs from various sources and search them to find relevant information. You can also apply a structure to your unstructured logs to make them easier to analyze.

BMC Helix Log Analytics is an add-on service for BMC Helix Operations Management and is hosted as a SaaS service on BMC Helix Portal. BMC Helix Log Analytics leverages the Elasticsearch, Logstash, and Kibana (ELK) stack to analyze and visualize logs. 

BMC Helix Log Analytics enables you to collect logs by using Logstash and Beats (Filebeat and Winlogbeat are the supported Beats); the collected logs are stored in Elasticsearch. BMC Helix Log Analytics provides a UI to search the logs. You can save your searches and view them in dashboards. Dashboards of all users in your organization are available to you.

The following figure shows how logs are collected and made available to you for analyzing.

The following video (3:32) illustrates the configurations required to send logs to Elasticsearch.

The following video (2:53) illustrates how to use BMC Helix Log Analytics.

Collecting and analyzing logs

To save logs to Elasticsearch, configure Logstash and Beats. When they start communicating with each other and data is saved to Elasticsearch, you can use BMC Helix Log Analytics to analyze and visualize logs.

Step 1: Prepare to configure log ingestion
  • Copy the API key of your BMC Helix Operations Management tenant and paste it in a text file. In BMC Helix Operations Management, go to Administration > Repository and click Copy API key.
  • Download and install Beats on the computers from where you want to collect logs.
  • Download and install Logstash
Step 2: Configure Logstash

For detailed information about the files used in the configurations, see the Logstash documentation.

  1. Configure Logstash to accept data from Beats
    1. From the Logstash installation folder, open the config\logstash-sample.conf file.
      If you are configuring Logstash by using RPM on Linux operating systems, copy the /etc/logstash/logstash-sample.conf file to the /etc/logstash/conf.d folder and then open it.
    2. In the input plugin, enter the port number through which Beats sends data to Logstash.

          port => <port number (for example, 5044)>

      Note: Ensure that the port is open on the computer where Logstash is installed.
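
      Assembled, the input section of the logstash-sample.conf file might look like this minimal sketch (port 5044 is the default Beats port and is an assumption; use the port that you opened):

          input {
            beats {
              port => 5044
            }
          }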

  2. Configure Logstash to send the collected logs to the REST endpoint by entering the following details in the output plugin in the config\logstash-sample.conf file.
    In Linux environments, after updating the logstash-sample.conf file, move it to the /etc/logstash/conf.d folder.

          url => "https://<DNS Name that is provided by BMC>/log-service/api/v1.0/logs"
          headers => {
            "Content-Type" => "application/json"
            "Authorization" => "apiKey <API key of tenant>"
          }
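
      Assembled, the output section might look like this sketch (the http output plugin is an assumption based on the url and headers settings above; replace the placeholders with your values):

          output {
            http {
              url => "https://<DNS Name that is provided by BMC>/log-service/api/v1.0/logs"
              http_method => "post"
              format => "json"
              headers => {
                "Content-Type" => "application/json"
                "Authorization" => "apiKey <API key of tenant>"
              }
            }
          }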
  3. (Optional) Add a structure to the logs in a field:value pattern by using the grok plugin in the config\logstash-sample.conf file.

    input {
      file {
        type => "apachelog"
        path => ["C:/logs/apachelogs.log"]
        start_position => "beginning"
      }
    }
    filter {
      if [type] == "apachelog" {
        grok {
          match => {"message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-)'}
        }
      }
    }
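
    For example, a hypothetical Apache access log line such as the following (the values are illustrative):

        127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326

    matches the grok pattern and is parsed into fields such as clientip=127.0.0.1, verb=GET, request=/apache_pb.gif, response=200, and bytes=2326.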
  4. Start Logstash by running the bin/logstash command.
    For example, on Windows: bin/logstash -f config/logstash-sample.conf
    Note: If you have enabled a firewall in your environment, open the outbound HTTPS port 443.

Step 3: Configure Beats
  1. Configure Beats to communicate with Logstash by updating the filebeat.yml and winlogbeat.yml files, which are available in the Beats installation folder. Comment out the output.elasticsearch plugin and uncomment the output.logstash plugin.

  2. To send data to Logstash, add the Logstash host and communication port:

    hosts: <logstash computer>:<port to communicate with logstash>
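
    After the change, the output section of the Beats configuration file might look like this sketch (port 5044 is an assumption; use the port that you set in the Logstash input plugin):

        # output.elasticsearch:
        #   hosts: ["localhost:9200"]
        output.logstash:
          hosts: ["<logstash computer>:5044"]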
  3. In the input type settings, change the value of enabled to true.

  4. Configure log sources by adding their paths to the filebeat.yml and winlogbeat.yml files, and then start Beats.

    type: log
    enabled: true
    paths:
      - <path of log source. For example, C:\Program Files\Apache\Logs or /var/log/messages>

  5. To ensure that you collect only meaningful logs, use the include_lines setting to collect only the lines that match the patterns you specify.
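
    A minimal sketch of a filebeat.yml input that keeps only error and warning lines (the include_lines patterns and the path are illustrative assumptions):

        filebeat.inputs:
        - type: log
          enabled: true
          paths:
            - /var/log/messages
          include_lines: ['^ERR', '^WARN']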

Step 4: Verify log collection

Navigate to BMC Helix Portal > BMC Helix Log Analytics > Discover.

Note: A default index pattern is created. Do not delete it. If you create a new index pattern, ensure that the name begins with - log-xx_r14_v1. The xx value is available in the matches suggested to you.

No data in the Discover tab?

This issue can occur because of any of the following reasons:

  • Logstash has not received data.
    To verify whether Logstash has received data, add the following to the output plugin in the logstash-sample.conf file:
    stdout { codec => json }
    If no data is received, check the Beats configuration and the network connectivity of the Logstash computer.
  • Logstash is unable to connect to BMC Helix Log Analytics. To verify, check if the following error is present in the Logstash logs - Could not fetch URL.
    To resolve this issue, in the output plugin of the logstash-sample.conf file, verify the following:
    • The API key is correct.
    • The URL to connect to BMC Helix Log Analytics is correct.

UI taking time to load?

There is a known issue in Kibana where the UI sometimes takes a few seconds to load.

Troubleshooting tips

  • If you get error code 422 in the Logstash logs, you have exceeded the daily data ingestion limit:
    • For trial users - 1 GB
    • For subscribed users - 100 GB
  • Ensure that a single log record contains less than 100 unique fields.
  • If you get the Create Index Pattern page when you click the Discover tab, create an index pattern and ensure that the name begins with - log-xx_r14_v1*. The xx value is available in the matches suggested to you.
Step 5: Filter data to analyze
  1. To view the collected logs, go to BMC Helix Portal > BMC Helix Log Analytics > Discover.
    All the collected logs are displayed here.
  2. To filter data, select the required fields from the Available fields list.
    If you see a '?' sign in place of a field's data type icon, refresh the field list on the index pattern page (Management > Index pattern > index pattern name).
  3. Select a date range to filter data.
  4. (Optional) You can also use Kibana queries to filter data. For more information, see the Kibana documentation.
  5. To save the filter criteria for later use, save it as a saved search.
    You can use the saved search to create a visualization.
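
    For example, a hypothetical Kibana Query Language (KQL) filter that uses fields extracted by the grok pattern configured earlier (the field names and values are assumptions):

        response : 404 and verb : "GET"

    This restricts the Discover view to unsuccessful GET requests before you save the search.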
Step 6: Create a dashboard

If you have search criteria that you expect to use repeatedly, you can save it and add it to a dashboard as a visualization.

  1. Click Visualize > Create new visualization.
  2. Select the type of visualization that you want to use.
    For example, an area chart.
  3. Select the search that you have saved.
  4. Apply additional filters to the data and save the visualization.
  5. To add the visualization to a dashboard:
    1. Click Dashboard.
    2. Create a new dashboard or edit an existing one.
    3. Click Add and select the visualization.
