Collecting logs by using Logstash and Filebeat

Collect logs from various sources and search them to find relevant information. You can also apply a structure to your unstructured logs to make them easier to analyze.

Filebeat and Winlogbeat are the supported Beats. BMC Helix Log Analytics provides a UI to search the logs. You can save your searches and view them in dashboards. Dashboards of all users in your organization are available to you.

The following figure shows how logs are collected and made available to you for analysis.

The following video (3:32) illustrates the configurations required to collect logs.


https://youtu.be/5dXTTZcvth0

Before you begin

To prepare to configure log collection

  • Copy the API key of your BMC Helix Operations Management tenant and paste it in a text file. In BMC Helix Operations Management, go to Administration > Repository and click Copy API key.
  • Download and install Beats on the computers from which you want to collect logs.
  • Download and install Logstash.

To configure Logstash

For detailed information about the files used in the configurations, see the Logstash documentation.

  1. Configure Logstash to accept data from Beats
    1. From the Logstash installation folder, open the config\logstash-sample.conf file.
      If you are configuring Logstash by using RPM on Linux operating systems, copy the /etc/logstash/logstash-sample.conf file to the /etc/logstash/conf.d folder and then open it.
    2. In the input plugin, enter the port number that Beats use to send data to Logstash.
      input {
        beats {
          port => <port number (for example, 5044)>
        }
      }

      Note: Ensure that the port is open on the computer where Logstash is installed.

  2. Configure Logstash to send the collected logs to the REST endpoint by adding the following details to the output plugin in the config\logstash-sample.conf file. An optional console output for debugging is shown after the plugin example.
    In Linux environments, after updating the logstash-sample.conf file, move it to the /etc/logstash/conf.d folder.
    output {
      http {
        url => "https://<Tenant URL provided by BMC>/log-service/api/v1.0/logs"
        http_method => "post"
        content_type => "application/json"
        format => "json_batch"
        retry_failed => false
        http_compression => true
        headers => {
          "Content-Type" => "application/json"
          "Authorization" => "apiKey <API key of tenant>"
        }
      }
    }
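    While you are validating the pipeline, you can optionally echo events to the Logstash console in addition to sending them to the REST endpoint; the troubleshooting section later in this topic uses the same technique. A minimal sketch, which you can remove after you confirm that logs reach BMC Helix Log Analytics:

    output {
      # Keep the http output configured in this step, and add the following line:
      # a debugging aid that prints each event as JSON to the Logstash console.
      stdout { codec => json }
    }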

  3. (Optional) Add a structure (field:value pairs) to the logs by using the grok plugin in the config\logstash-sample.conf file. The following example parses Apache access logs; sample extracted fields are shown after the example.

    input {
      file {
        type => "apachelog"
        path => ["C:/logs/apachelogs.log"]
        start_position => "beginning"
      }
    }

    filter {
      if [type] == "apachelog" {
        grok {
          match => { "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-)' }
        }
      }
    }
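    For example, given a typical Apache access log line such as the following (a hypothetical sample), the grok pattern above extracts fields similar to these:

    Sample log line:
      192.168.1.10 - frank [10/Oct/2022:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326

    Extracted fields:
      clientip: 192.168.1.10
      ident: -
      auth: frank
      timestamp: 10/Oct/2022:13:55:36 +0000
      verb: GET
      request: /index.html
      httpversion: 1.1
      response: 200
      bytes: 2326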

  4. (Optional) If you want to convert the time zone of the collected logs, use the date filter, as shown in the following example; a fuller sketch with the surrounding filter block follows the example.

    date {
      match => ["tmpstamp", "MMM dd, yyyy hh:mm:ss a"]
      target => "@timestamp"
      locale => "en_US"
      timezone => "UTC"
    }
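    The date filter goes inside the filter plugin. The following sketch assumes that an earlier filter (for example, grok) has already stored the raw event time, such as Sep 21, 2022 05:26:10 AM, in a field named tmpstamp; the field name and time format are examples only:

    filter {
      # Assumes a previous filter captured the raw event time into the tmpstamp field,
      # for example "Sep 21, 2022 05:26:10 AM"
      date {
        match => ["tmpstamp", "MMM dd, yyyy hh:mm:ss a"]
        target => "@timestamp"
        locale => "en_US"
        timezone => "UTC"
      }
    }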

  5. Start Logstash by running the bin/logstash command with your configuration file.
    For example, on Windows: bin/logstash -f config/logstash-sample.conf
    Note: If you have enabled a firewall in your environment, open the outbound HTTPS port 443.
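    Tip: To check the configuration file for syntax errors before starting the pipeline, you can run Logstash with the --config.test_and_exit flag, for example: bin/logstash -f config/logstash-sample.conf --config.test_and_exit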

To configure Beats

  1. Configure Beats to communicate with Logstash by updating the filebeat.yml and winlogbeat.yml files, available in the Beats installation folder. Comment out the output.elasticsearch section and uncomment the output.logstash section.

  2. To send data to Logstash, add the Logstash host and communication port:
    output.logstash:
      hosts: ["<logstash computer>:<port to communicate with Logstash>"]

  3. In the type setting of the input configuration, change the value of enabled to true.

  4. Configure log sources by adding their paths to the filebeat.yml and winlogbeat.yml files, and then start Beats. A fuller filebeat.yml sketch follows this snippet.

    type: log
    enabled: true
    paths:
        - <path of log source. For example, C:\Program Files\Apache\Logs or /var/log/messages>
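    For reference, a minimal filebeat.yml sketch that combines the settings from steps 1 through 4 might look like the following; the log path, host name, and port are placeholders only:

    # Minimal filebeat.yml sketch (placeholder path, host, and port)
    filebeat.inputs:
      - type: log
        enabled: true
        paths:
          - /var/log/messages

    # output.elasticsearch remains commented out, as described in step 1
    output.logstash:
      hosts: ["<logstash computer>:5044"]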

  5. To ensure that you collect only meaningful logs, use the include settings, as shown in the sketch after this step.
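    A minimal sketch, assuming that the Filebeat include_lines setting is what you use to keep only the lines that match given patterns; the patterns shown are examples only:

    filebeat.inputs:
      - type: log
        enabled: true
        paths:
          - /var/log/messages
        # Keep only log lines that contain ERROR or WARN; all other lines are dropped
        include_lines: ['ERROR', 'WARN']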

    Note

    The following settings in the .yml files will be ineffective:

    • Elasticsearch template setting
    • Dashboards (these settings are for Kibana Dashboards)
    • Kibana (these settings are for dashboards loaded via the Kibana API)


To verify log collection

Navigate to BMC Helix Log Analytics > Discover.

Note: A default index pattern is created. You cannot create a new index pattern or delete the existing one.

To troubleshoot log collection

The following scenarios describe issues that you might run into while collecting logs and the steps that you can perform to troubleshoot them:

Scenario: No data in the Discover tab
Possible actions: This issue can occur because of any of the following reasons:

  • Logstash has not received data.
    To verify whether Logstash has received data, add the following to the output plugin in the Logstash logstash-sample.conf file:
    stdout { codec => json }
    If no data is received, check the Beats configuration and the network connectivity of the Logstash computer.
  • Logstash is unable to connect to BMC Helix Log Analytics. To verify, check whether the following error is present in the Logstash logs: Could not fetch URL.
    To resolve this issue, in the output plugin of the logstash-sample.conf file, verify the following:
    • The API key is correct.
    • The URL to connect to BMC Helix Log Analytics is correct.

Scenario: UI taking time to load
Possible actions: There is a known issue in Kibana where the UI sometimes takes a few seconds to load.

Scenario: Error code 422
Possible actions: If you get error code 422 in the Logstash logs, you have exceeded the daily limit to ingest data. For more information, see BMC Helix Subscriber Information.

Scenario: Unable to view ingested data and the field type is not shown in the Discover tab
Possible actions: Go to Stack Management > Index Pattern > default index pattern, and click Refresh.



Related topic

Knowledge article to create syslog server

