Deriving insight by analyzing logs
Objective
To derive insight about issues by searching and analyzing logs.
User personas
The tasks in this end-to-end use case involve the following personas:
- Tenant administrator: To configure log ingestion and prepare the logs
- Operator: To filter data and create a dashboard
About BMC Helix Log Analytics
BMC Helix Log Analytics enables you to collect logs from various sources and search them to find relevant information. You can also apply a structure to your unstructured logs to make them easier to analyze.
BMC Helix Log Analytics is an add-on service for BMC Helix Operations Management and is hosted as a SaaS service on BMC Helix Portal. BMC Helix Log Analytics leverages the Elasticsearch, Logstash, and Kibana (ELK) stack to analyze and visualize logs.
BMC Helix Log Analytics enables you to collect logs by using Logstash and Beats (Filebeat and Winlogbeat are the supported Beats); the collected logs are stored in Elasticsearch. BMC Helix Log Analytics provides a UI to search the logs. You can save your searches and view them in dashboards. Dashboards created by all users in your organization are available to you.
The following figure shows how logs are collected and made available to you for analysis.
The following video (3:32) illustrates the configurations required to send logs to Elasticsearch.
The following video (2:53) illustrates how to use BMC Helix Log Analytics.
Collecting and analyzing logs
To save logs to Elasticsearch, configure Logstash and Beats. When they start communicating with each other and data is saved to Elasticsearch, you can use BMC Helix Log Analytics to analyze and visualize logs.
Step 1: Prepare to configure log ingestion
- Copy the API key of your BMC Helix Operations Management tenant and paste it in a text file. In BMC Helix Operations Management, go to Administration > Repository and click Copy API key.
- Download and install Beats on the computers from which you want to collect logs.
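For example, on a Debian-based Linux computer, installing Filebeat might look like the following sketch. The download URL pattern and the version number are assumptions; use the Beats release that matches your environment.
    # Download the Filebeat package (version shown is an assumption) and install it
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-8.12.2-amd64.deb
    sudo dpkg -i filebeat-8.12.2-amd64.deb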
Step 2: Configure Logstash
For detailed information about the files used in the configurations, see Logstash documentation.
- Configure Logstash to accept data from Beats:
- From the Logstash installation folder, open the config\logstash-sample.conf file.
  If you are configuring Logstash by using RPM on Linux operating systems, copy the /etc/logstash/logstash-sample.conf file to the /etc/logstash/conf.d folder and then open it.
- In the input plugin, enter the port number that Beats use to send data to Logstash:
    input {
      beats {
        port => <port number (for example, 5044)>
      }
    }
  Note: Ensure that the port is open on the computer where Logstash is installed.
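For example, on a Linux computer that uses firewalld, you might open the Beats port as follows. The port number 5044 and the use of firewalld are assumptions; use the firewall tooling that applies to your environment.
    # Open the assumed Beats port (5044) and reload the firewall rules
    sudo firewall-cmd --permanent --add-port=5044/tcp
    sudo firewall-cmd --reload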
- From the Logstash installation folder, open the config\logstash-sample.conf file and configure Logstash to send the collected logs to the REST endpoint by entering the following details in the output plugin.
  In Linux environments, after updating the logstash-sample.conf file, move it to the /etc/logstash/conf.d folder.
    output {
      http {
        url => "https://<DNS name that is provided by BMC>/log-service/api/v1.0/logs"
        http_method => "post"
        content_type => "application/json"
        format => "json_batch"
        retry_failed => false
        http_compression => true
        headers => {
          "Content-Type" => "application/json"
          "Authorization" => "apiKey <API key of tenant>"
        }
      }
    }
- (Optional) Add a structure to the logs (field:value pattern) by using the grok plugin in the config\logstash-sample.conf file.
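For example, a grok filter that splits an assumed timestamp, level, and message layout into separate fields might look like the following sketch. The pattern is an assumption and must be adjusted to your actual log format.
    filter {
      grok {
        # Assumed layout: "<ISO8601 timestamp> <log level> <free text>"
        match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}" }
      }
    }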
- Start Logstash by running the bin/logstash command with the configuration file.
  For example, for Windows: bin/logstash -f config/logstash-sample.conf.
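You can optionally validate the configuration file before starting Logstash; this check is a suggestion and not a required step.
    # Test the pipeline configuration and exit without starting Logstash
    bin/logstash --config.test_and_exit -f config/logstash-sample.conf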
Note: If you have enabled a firewall in your environment, open the outbound HTTPS port 443.
Step 3: Configure Beats
- Configure Beats to communicate with Logstash by updating the filebeat.yml or winlogbeat.yml file, available in the Beats installation folder. Comment out the output.elasticsearch plugin and uncomment the output.logstash plugin.
  To send data to Logstash, add the Logstash host and communication port:
    output.logstash:
      hosts: ["<logstash computer>:<port to communicate with Logstash>"]
- In the type plugin, change the value of enabled to true. Configure the log sources by adding their paths to the filebeat.yml or winlogbeat.yml file, and then start Beats. A consolidated sketch follows this list.
    type: log
    enabled: true
    paths:
      - <path of log source. For example, C:\Program Files\Apache\Logs or /var/log/messages>
- To ensure that you collect meaningful logs only, use the include_lines setting in the Beats configuration file.
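As a reference, a minimal filebeat.yml that combines these settings might look like the following sketch. The host name, port, log path, and include_lines patterns are assumptions; replace them with values from your environment.
    filebeat.inputs:
      - type: log
        enabled: true
        paths:
          - /var/log/messages                  # assumed log source path
        include_lines: ['ERROR', 'WARN']       # optional: collect only matching lines

    output.logstash:
      hosts: ["logstash-host.example.com:5044"]   # assumed Logstash host and port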
Step 4: Verify log collection
Navigate to BMC Helix Portal > BMC Helix Log Analytics > Discover.
Note: A default index pattern is created. Do not delete it. If you create a new index pattern, ensure that the name begins with log-xx_r14_v1. The xx value is available in the matches suggested to you.
Step 5: Filter data to analyze
To view the collected logs, go to BMC Helix Portal > BMC Helix Log Analytics > Discover. All the collected logs are displayed here.
- To filter data, select the required fields from the Available fields list.
  If you see a '?' sign in place of the data type icon of a field, refresh the field list on the index pattern page (Management > Index pattern > index pattern name).
- Select a date range to filter data.
(Optional) You can also use Kibana queries to filter data. For more information, see Kibana documentation.
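For example, a Kibana Query Language (KQL) query like the following sketch narrows the results to error entries from a specific host. The field names log_level and host.name are assumptions and depend on the structure of your logs.
    log_level : "ERROR" and host.name : "web-server-01"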
To save the filter criteria for later use, save it as a saved search.
You can use the saved search to create a visualization.
For information about log retention and storage limit, see BMC Helix Subscriber Information.
Step 6: Create a dashboard
If you have search criteria that you think you might use repeatedly, you can save them and add them to a dashboard as a visualization.
- Click Visualize > Create new visualization.
- Select the type of visualization that you want to use; for example, an area chart.
- Select the search that you have saved.
- Apply additional filters to the data and save the visualization.
- To add the visualization to a dashboard:
- Click Dashboard.
- You can create a new dashboard or edit an existing one.
- Click Add and select the visualization.
Step 7: Create an alert
When you see an error in the collected logs and want to be notified if it occurs again, configure an alert. When the error occurs again, an event is generated in BMC Helix Operations Management. You can configure a maximum of 20 alerts.
On the Alerts > Create an alert page, click Create new alert and do the following:
- Specify a unique name.
- Configure the query on the basis of which the alert is generated.
- Configure the conditions that define when the event is triggered.
- Select the severity level of the alert that is generated.
- Enter the message of the event and its additional details.
- Save the alert.
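As an illustration, an alert definition might use values like the following. The field names, threshold, and severity in this sketch are assumptions, not prescribed values.
    Name:      Repeated connection timeouts
    Query:     log_level : "ERROR" and log_message : *timeout*
    Condition: more than 5 matching log entries in 15 minutes
    Severity:  Major
    Message:   Connection timeouts detected in the application logs; investigate the affected service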