Deriving insight by analyzing logs
Objective
To derive insight about issues by searching and analyzing logs.
User personas
The tasks in this end-to-end use case involve the following personas:
- Tenant administrator: To configure log ingestion and log preparation
- Operator: To filter data and create dashboards
About BMC Helix Log Analytics
BMC Helix Log Analytics enables you to collect logs from various sources and search them to find relevant information. You can also apply a structure to your unstructured logs to make them easier to analyze.
BMC Helix Log Analytics is an add-on service for BMC Helix Operations Management and is hosted as a SaaS service on BMC Helix Portal. BMC Helix Log Analytics leverages the Elasticsearch, Logstash, and Kibana (ELK) stack to analyze and visualize logs.
BMC Helix Log Analytics collects logs by using Logstash and Beats (Filebeat and Winlogbeat are the supported Beats) and stores them in Elasticsearch. BMC Helix Log Analytics provides a UI to search the logs. You can save your searches and view them in dashboards. Dashboards created by all users in your organization are available to you.
The following figure shows how logs are collected and made available to you for analysis.
The following video (3:32) illustrates the configurations required to send logs to Elasticsearch.
The following video (2:53) illustrates how to use BMC Helix Log Analytics.
Collecting and analyzing logs
To save logs to Elasticsearch, configure Logstash and Beats. When they start communicating with each other and data is saved to Elasticsearch, you can use BMC Helix Log Analytics to analyze and visualize logs.
- Copy the API key of your BMC Helix Operations Management tenant: in BMC Helix Operations Management, go to Administration > Repository, click Copy API key, and paste the key in a text file.
- Download and install Beats on the computers from where you want to collect logs.
- Download and install Logstash.
For detailed information about the files used in the configurations, see the Logstash documentation.
- Configure Logstash to accept data from Beats:
  - From the Logstash installation folder, open the config\logstash-sample.conf file.
    If you are configuring Logstash by using RPM on a Linux operating system, copy the /etc/logstash/logstash-sample.conf file to the /etc/logstash/conf.d folder and then open it.
  - In the input plugin, enter the port number on which Beats sends data to Logstash:

    ```
    input {
      beats {
        port => <port number; for example, 5044>
      }
    }
    ```

    Note: Ensure that the port is open on the computer where Logstash is installed.
- Configure Logstash to send the collected logs to the REST endpoint by adding the following details to the output plugin in the config\logstash-sample.conf file.
  In Linux environments, after updating the logstash-sample.conf file, move it to the /etc/logstash/conf.d folder.

  ```
  output {
    http {
      url => "https://<DNS name that is provided by BMC>/log-service/api/v1.0/logs"
      http_method => "post"
      content_type => "application/json"
      format => "json_batch"
      retry_failed => false
      http_compression => true
      headers => {
        "Content-Type" => "application/json"
        "Authorization" => "apiKey <API key of tenant>"
      }
    }
  }
  ```
- (Optional) Add a structure to the logs (a field:value pattern) by using the grok plugin in the config\logstash-sample.conf file.
- Start Logstash by running the bin/logstash command. For example, on Windows: bin/logstash -f config/logstash-sample.conf
  Note: If you have enabled a firewall in your environment, open the outbound HTTPS port 443.
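Putting the preceding Logstash steps together, a complete logstash-sample.conf might look like the following sketch. The grok pattern and the field names timestamp, level, and log_message are illustrative assumptions, not part of the product configuration; replace the placeholders with your own port, DNS name, and API key.

```
input {
  beats {
    # Port on which Beats sends data to Logstash (example value)
    port => 5044
  }
}

filter {
  # Illustrative grok pattern: parses lines such as
  # "2023-05-01 12:00:00 ERROR Disk full" into field:value pairs
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
}

output {
  # REST endpoint of BMC Helix Log Analytics
  http {
    url => "https://<DNS name that is provided by BMC>/log-service/api/v1.0/logs"
    http_method => "post"
    content_type => "application/json"
    format => "json_batch"
    retry_failed => false
    http_compression => true
    headers => {
      "Content-Type" => "application/json"
      "Authorization" => "apiKey <API key of tenant>"
    }
  }
}
```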
- Configure Beats to communicate with Logstash by updating the filebeat.yml and winlogbeat.yml files, available in the Beats installation folder. Comment out the output.elasticsearch section and uncomment the output.logstash section. To send data to Logstash, add the Logstash communication port:

  ```yaml
  output.logstash:
    hosts: ["<logstash computer>:<port to communicate with Logstash>"]
  ```

- In the input configuration of type log, change the value of enabled to true. Configure log sources by adding their paths to the filebeat.yml and winlogbeat.yml files, and then start Beats:

  ```yaml
  - type: log
    enabled: true
    paths:
      - <path of the log source; for example, C:\Program Files\Apache\Logs or /var/log/messages>
  ```
To ensure that you collect only meaningful logs, use the include_lines setting.
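One way to restrict collection to meaningful lines is Filebeat's include_lines setting, which keeps only lines matching a list of regular expressions. The following sketch is illustrative; the path and patterns are assumptions for your environment:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/messages
  # Keep only lines that match these regular expressions (illustrative patterns)
  include_lines: ['ERR', 'WARN']
```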
Navigate to BMC Helix Portal > BMC Helix Log Analytics > Discover.
Note: A default index pattern is created. Do not delete it. If you create a new index pattern, ensure that the name begins with log-xx_r14_v1. The xx value is available in the matches suggested to you.
No data in the Discover tab?
This issue can occur for any of the following reasons:
- Logstash has not received data.
  To verify whether Logstash has received data, add the following to the output plugin in the Logstash logstash-sample.conf file:

  ```
  stdout { codec => json }
  ```

  If no data is received, check the Beats configuration and the network connectivity of the Logstash computer.
- Logstash is unable to connect to BMC Helix Log Analytics.
  To verify, check whether the Logstash logs contain the Could not fetch URL error. To resolve this issue, in the output plugin of the logstash-sample.conf file, verify the following:
  - The API key is correct.
  - The URL to connect to BMC Helix Log Analytics is correct.
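For debugging, the stdout output can sit alongside the http output in the same output block, so that every event is also echoed to the Logstash console while still being forwarded; a sketch:

```
output {
  # Echo events to the Logstash console to confirm that data arrives from Beats
  stdout { codec => json }

  # The http output to BMC Helix Log Analytics remains as configured earlier
  http {
    url => "https://<DNS name that is provided by BMC>/log-service/api/v1.0/logs"
    http_method => "post"
    content_type => "application/json"
    format => "json_batch"
    retry_failed => false
    http_compression => true
    headers => {
      "Content-Type" => "application/json"
      "Authorization" => "apiKey <API key of tenant>"
    }
  }
}
```

Remove the stdout output after you confirm that data is flowing.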
UI taking time to load?
There is a known issue in Kibana where the UI sometimes takes a few seconds to load.
Troubleshooting tips
- If you get error code 422 in the Logstash logs, you have exceeded the daily data ingestion limit:
- For trial users - 1 GB
- For subscribed users - 100 GB
- Ensure that a single log record contains less than 100 unique fields.
- If you get the Create Index Pattern page when you click the Discover tab, create an index pattern and ensure that the name begins with log-xx_r14_v1*. The xx value is available in the matches suggested to you.
- If you are unable to view ingested data and the field type is not shown in the Discover tab, go to Management > Index Pattern > default index pattern and click the Refresh icon.
- To view the collected logs, go to BMC Helix Portal > BMC Helix Log Analytics > Discover.
All the collected logs are displayed here.
- To filter data, select the required fields from the Available fields list.
  If you see a '?' sign in place of a field's data type icon, refresh the field list on the index pattern page (Management > Index pattern > index pattern name).
- Select a date range to filter data.
- (Optional) You can also use Kibana queries to filter data. For more information, see the Kibana documentation.
- To save the filter criteria for later use, save them as a saved search.
You can use the saved search to create a visualization.
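The Kibana queries mentioned in the filtering steps above use Kibana Query Language. As an illustrative example (the field names level and host.name are assumptions about your indexed fields), the following expression matches error records from a specific host:

```
level : "ERROR" and host.name : "web01"
```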
For information about log retention and storage limits, see BMC Helix Subscriber Information.
If you have search criteria that you expect to reuse, you can save them and add them to a dashboard as a visualization.
- Click Visualize > Create new visualization.
- Select the type of visualization that you want to use; for example, an area chart.
- Select the search that you have saved.
- Apply additional filters to the data and save the visualization.
- To add the visualization to a dashboard:
- Click Dashboard.
- Create a new dashboard or edit an existing one.
- Click Add and select the visualization.
When you see an error in the collected logs and want to be notified if it occurs again, configure an alert. When the error occurs again, an event is generated in BMC Helix Operations Management. You can configure a maximum of 20 alerts.
On the Alerts > Create an alert page, click Create new alert and do the following:
- Specify a unique name.
- Configure the query based on which the alert will be generated.
- Configure the conditions that define when the event is triggered.
  Best practice: Run the query for the last 5 minutes, because the alert creation frequency in BMC Helix Log Analytics is 5 minutes.
- Select the severity level of the alert that is generated.
- Enter the message of the event and its additional details.
- Save the alert.