This documentation supports releases of BMC Helix Log Analytics up to December 31, 2021.

Collecting logs by using Logstash and Filebeat

Collect logs from various sources and search them to find relevant information. You can also apply a structure to your unstructured logs to make them easier to analyze.

Filebeat and Winlogbeat are the supported Beats. BMC Helix Log Analytics provides a UI to search the logs. You can save your searches and view them in dashboards. Dashboards of all users in your organization are available to you.

The following figure shows how logs are collected and made available to you for analysis.

The following video (3:32) illustrates the configurations required to collect logs.


https://youtu.be/5dXTTZcvth0

Before you begin

To prepare to configure log collection

  • Copy the API key of your BMC Helix Operations Management tenant and paste it in a text file. In BMC Helix Operations Management, go to Administration > Repository and click Copy API key.
  • Download and install Beats on the computers from where you want to collect logs.
  • Download and install Logstash.

To configure Logstash

For detailed information about the files used in the configurations, see the Logstash documentation.

  1. Configure Logstash to accept data from Beats
    1. From the Logstash installation folder, open the config\logstash-sample.conf file.
      If you are configuring Logstash by using RPM on Linux operating systems, copy the /etc/logstash/logstash-sample.conf file to the /etc/logstash/conf.d folder and then open it.
    2. In the input plugin, enter the port number on which Logstash listens for data from Beats.

      input {
        beats {
          port => <port number (for example, 5044)>
        }
      }


      Note: Ensure that the port is open on the computer where Logstash is installed.

  2. Configure Logstash to send the collected logs to the REST endpoint by entering the following details in the output plugin of the config\logstash-sample.conf file.
    In Linux environments, after updating the logstash-sample.conf file, move it to the /etc/logstash/conf.d folder.

    output {
      http {
        url => "https://<DNS Name that is provided by BMC>/log-service/api/v1.0/logs"
        http_method => "post"
        content_type => "application/json"
        format => "json_batch"
        retry_failed => false
        http_compression => true
        headers => {
          "Content-Type" => "application/json"
          "Authorization" => "apiKey <API key of tenant>"
        }
      }
    }
  3. (Optional) Add a structure (field:value pairs) to the logs by using the grok plugin in the config\logstash-sample.conf file.

    input {
      file {
        type => "apachelog"
        path => ["C:/logs/apachelogs.log"]
        start_position => "beginning"
      }
    }
    filter {
      if [type] == "apachelog" {
        grok {
          match => { "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-)' }
        }
      }
    }
  4. Start Logstash by running the bin/logstash command with the configuration file.
    For example, on Windows: bin/logstash -f config/logstash-sample.conf
    Note: If you have enabled a firewall in your environment, open the outbound HTTPS port 443.
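The http output in step 2 posts events to the REST endpoint as a JSON array (format => json_batch) with an apiKey authorization header. As a rough illustration of what such a request contains, the following Python sketch builds (but does not send) an equivalent POST. The endpoint URL and API key are the same placeholders as in the configuration, and the sample event fields are hypothetical:

```python
import json
import urllib.request

# Placeholders, exactly as in the Logstash configuration above.
API_KEY = "<API key of tenant>"
ENDPOINT = "https://<DNS Name that is provided by BMC>/log-service/api/v1.0/logs"

# json_batch sends an array of events in a single request body.
events = [{"message": "sample log line", "host": "web01"}]
body = json.dumps(events).encode("utf-8")

req = urllib.request.Request(
    ENDPOINT,
    data=body,
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Authorization": f"apiKey {API_KEY}",
    },
)
# urllib.request.urlopen(req)  # not executed here; the endpoint is a placeholder
print(req.get_header("Authorization"))
```

If the endpoint rejects such a request, the Authorization header and URL are the first things to compare against this shape (see also the troubleshooting table below).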
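The grok pattern in step 3 can be sanity-checked outside Logstash. This Python sketch uses a simplified regular expression (not the exact patterns Logstash compiles for IPORHOST, HTTPDATE, and so on) to extract the same fields from a sample Apache access-log line:

```python
import re

# Simplified stand-in for the grok pattern above (illustrative only;
# Logstash's built-in IPORHOST and HTTPDATE patterns are more permissive).
APACHE_RE = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+)(?: HTTP/(?P<httpversion>[\d.]+))?" '
    r'(?P<response>\d+) (?P<bytes>\d+|-)'
)

line = '127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
fields = APACHE_RE.match(line).groupdict()
print(fields["clientip"], fields["verb"], fields["response"])
# → 127.0.0.1 GET 200
```

Each named group corresponds to a field name in the grok pattern, which is what lets you search the resulting logs by field:value.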

To configure Beats

  1. Configure Beats to communicate with Logstash by updating the filebeat.yml and winlogbeat.yml files, available in the Beats installation folder. Comment out the output.elasticsearch plugin and uncomment the output.logstash plugin.

  2. To send data to Logstash, add the Logstash communication port:


    output.logstash:
      hosts: ["<logstash computer>:<port to communicate with Logstash>"]
  3. In the inputs section, change the value of enabled to true.

  4. Configure log sources by adding the path to the filebeat.yml and winlogbeat.yml files and start Beats.

    - type: log
      enabled: true
      paths:
        - <path of log source. For example, C:\Program Files\Apache\Logs or /var/log/messages>
  5. To ensure that you collect only meaningful logs, use the include_lines setting.
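For example, Filebeat's include_lines setting keeps only the lines that match the listed regular expressions. A minimal sketch (the path and patterns are placeholders, not values required by BMC Helix Log Analytics):

```yaml
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /var/log/messages
    # Keep only lines that match these regular expressions.
    include_lines: ['ERROR', 'WARN']
```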

To verify log collection

Navigate to BMC Helix Log Analytics > Discover.

Note: A default index pattern is created. Do not delete it. If you create a new index pattern, ensure that the name begins with log-xx_r14_v1. The xx value is available in the matches suggested to you.

To troubleshoot log collection

The following table lists possible scenarios that you might encounter while collecting logs and the steps that you can take to troubleshoot them:

Scenario: No data in the Discover tab

This issue can occur because of any of the following reasons:

  • Logstash has not received data.
    To verify whether Logstash has received data, add the following to the output plugin in the logstash-sample.conf file:
    stdout { codec => json }
    If no data is received, check the Beats configuration and the network connectivity of the Logstash computer.
  • Logstash is unable to connect to BMC Helix Log Analytics. To verify, check whether the Logstash logs contain the error Could not fetch URL.
    To resolve this issue, in the output plugin of the logstash-sample.conf file, verify the following:
    • The API key is correct.
    • The URL to connect to BMC Helix Log Analytics is correct.

Scenario: The UI takes time to load

There is a known issue in Kibana that sometimes causes the UI to take a few seconds to load.

Scenario: Error code 422

If you get error code 422 in the Logstash logs, you have exceeded the daily limit for data ingestion. For information, see BMC Helix Subscriber Information.

Scenario: The Create Index Pattern page appears when you click the Discover tab

Create an index pattern and ensure that the name begins with log-xx_r14_v1*. The xx value is available in the matches suggested to you.

Scenario: Unable to view the ingested data, and the field type is not shown in the Discover tab

Go to Management > Index Pattern > default index pattern > Refresh.



Related topic

Knowledge article to create syslog server
