Collecting logs by using Fluentd
Log parsing
Logs are parsed before collection, and the parsed logs are displayed in the Discover tab in BMC Helix Log Analytics. A log expression tells the parser what information is present in the logs. You can also use the expression to filter logs for collection.
Let's look at an example to help understand parsing. Here are the expression and date format for Apache. These expressions are provided for all supported formats (wherever required) when you configure a log collection.
Expression (Apache): /^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>(?:[^\"]|\\.)*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>(?:[^\"]|\\.)*)" "(?<agent>(?:[^\"]|\\.)*)")?$/
Time Format: %d/%b/%Y:%H:%M:%S %z
Log entry: 192.168.0.1 - - [28/Feb/2013:12:00:00 +0900] "GET / HTTP/1.1" 200 777 "-" "Opera/12.0"
Parsed as:
time:
1362020400 (28/Feb/2013:12:00:00 +0900)
record:
{
"user" : nil,
"method" : "GET",
"code" : 200,
"size" : 777,
"host" : "192.168.0.1",
"path" : "/",
"referer": nil,
"agent" : "Opera/12.0"
}
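For reference, here is how this expression could be wired into a plain Fluentd tail source with a regexp parser. This is a minimal sketch; the path, pos_file, and tag values are illustrative assumptions, not values used by the connector:
<source>
  @type tail
  # illustrative file location and tag, not connector defaults
  path /var/log/apache2/access.log
  pos_file /var/log/fluentd/access.log.pos
  tag apache.access
  <parse>
    @type regexp
    expression /^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>(?:[^\"]|\\.)*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>(?:[^\"]|\\.)*)" "(?<agent>(?:[^\"]|\\.)*)")?$/
    time_format %d/%b/%Y:%H:%M:%S %z
  </parse>
</source>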
To parse logs with different expressions, you can either update the default expressions or use a custom format.
For more information, see the Fluentd documentation.
Log filtering
After the logs are parsed, you can filter the logs to include relevant log data and exclude data that you do not require. For example, you set up the following grep configurations:
- The value of the message field contains cool.
- The value of the hostname field matches web<INTEGER>.example.com.
- The value of the message field does NOT contain uncool.
Sample logs:
The following logs are collected:
{"message":"It's cool outside today", "hostname":"web001.example.com"}
{"message":"That's not cool", "hostname":"web1337.example.com"}
The following logs are excluded:
{"message":"I am cool but you are uncool", "hostname":"db001.example.com"}
{"hostname":"web001.example.com"}
{"message":"It's cool outside today"}
Before you begin
Perform the following actions before collecting logs:
- Download and install the connector.
- If the logs are located outside the container where the connector is downloaded and installed, create mount points for such directories. For more information, see Downloading and installing connectors for third-party integrations.
To collect logs from files
- In BMC Helix Developer Tools, open the Available integrations tab if it is not open by default.
- On the Collect Logs from File tile, click Configure.
On this tile, the build number of the product and the name of the entity that made the product available for configuration are displayed. In this example, the product is made available by BMC.
- Enter the integration name.
- Select the connector that you downloaded and saved.
If no connector is available, click Add Connector to download and install a connector.
- In the Customize Entity Configuration section, click Configure.
- In the Log Collection File Path field, enter the locations from which to collect logs.
Enter only directory paths or an absolute file name with the path. Separate multiple entries with a comma. Ensure that all log files have the same format. Supported formats are available in the Format field.
- (Optional) If you have entered a path with multiple folders and you want to exclude some folders from collection, remove those folders in the Exclude Paths field.
For example, you have entered the log collection path as /opt/bmc/connectors/<connector_name>/logs/applicationLogs, and this folder contains the folders app1, app2, and app3. The app1, app2, and app3 folders are shown in the Exclude Paths field. To prevent log collection from the app3 folder, remove app3 from the field.
- From the Format field, select the format present in your logs and perform the appropriate steps to filter the logs.
The following table lists the available formats and the steps to use them:

Format: Apache, Apache Error, Nginx, and Regexp
For these formats, the expression and supported date format are displayed in the Expression and Time Format fields. Update the expression or date format based on the expression and date format present in your log files.

Format: Java multiline
Date format and firstline format expressions are displayed in the Format Firstline and Format 1 fields.
To parse the following sample logs:
2021-09-07 14:19:17 INFO [main] Generating some log messages 0
2021-09-07 14:19:17 INFO [main] Sleeping for 1 second.
2021-09-07 14:19:17 INFO [main] Generating some log messages 1
modify the default multiline expression. Note the location of the square brackets: in these logs the level appears before the bracketed thread name, so the level and thread captures swap places as well:
Default: /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) \[(?<thread>.*)\] (?<level>[^\s]+)(?<message>.*)/
Updated: /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) (?<level>[^\s]+) \[(?<thread>.*)\](?<message>.*)/
To verify the expression, visit Rubular or Fluentular.
Format: Json
Logs are parsed as JSON; no format expression is required.

Format: CSV
Enter the field names (separated by commas) that you want to assign to the values in the CSV file, in the order in which they appear in the file.
For example, a CSV file contains the following values:
2013/02/28 12:00:00,192.168.0.1,111,user1
2013/02/28 12:00:00,192.168.0.1,112,user2
2013/02/28 12:00:00,192.168.0.1,113,user3
For this example, enter time,host,req_ID,user.
Each line is then parsed into a record with the keys time, host, req_ID, and user.
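As a reference for how such a mapping looks in plain Fluentd, here is a minimal csv parse section; the time_key and time_format lines are assumptions inferred from the sample values:
<parse>
  @type csv
  # field names assigned to the comma-separated values, in order
  keys time,host,req_ID,user
  # treat the first column as the event timestamp (inferred from the sample)
  time_key time
  time_format %Y/%m/%d %H:%M:%S
</parse>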
Format: Custom
Use the Custom option in the following scenarios:
- The out-of-the-box expression is different from the log expression in your files.
- The expression in your log files contains multiple key-value pairs.
- The required log format is not listed in the field.
To use the Custom option, enter the format in the Type field and configure the expression in the form of parameter names and parameter values. The following expressions are supported:
- regexp
- apache2
- apache_error
- nginx
- csv
- json
- multiline
Example: For the multiline format type, enter the following values:
Parameter Name: format_firstline; Parameter Value: /\d{4}-\d{1,2}-\d{1,2}/
Parameter Name: format1; Parameter Value: /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) \[(?<thread>.*)\] (?<level>[^\s]+)(?<message>.*)/
To verify the expression, visit Rubular or Fluentular.
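Assuming the connector generates a standard Fluentd parse section from these values, the multiline example above would translate into a sketch like the following (not the connector's literal output):
<parse>
  @type multiline
  # a new log event starts with a date such as 2021-09-07
  format_firstline /\d{4}-\d{1,2}-\d{1,2}/
  # capture time, thread, level, and message from the first line
  format1 /^(?<time>\d{4}-\d{1,2}-\d{1,2} \d{1,2}:\d{1,2}:\d{1,2}) \[(?<thread>.*)\] (?<level>[^\s]+)(?<message>.*)/
</parse>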
- In the Tags field, enter the tags to identify logs of the specified files.
- Save the entity configuration and then save the integration.
You can view the added configuration by clicking Integrations and then Configured Integrations.
You can also see the logs being received, the connector status, and other details on the integration tile.
Troubleshooting tip
Scenario: The configured integration shows a disconnected state.
Solution: The server on which the connector is installed is down, or it is not sending heartbeats. Go to your virtual machine and ensure that the Docker container is up and running (for example, by running the docker ps command).
To verify log collection configuration
To verify whether log collection has started, click Log Explorer > Discover. If the log format expression that you have configured is correct, the /opt/bmc/connectors/<connector_name>/logs/fluent.log file shows Response: 201.
To verify whether the parameters are correctly populated in the Fluentd pipeline, go to /opt/bmc/connectors/<connector_name>/<integration_name>/pipeline and open the file_log_pipeline.conf file by running the cat file_log_pipeline.conf command.
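The exact contents of file_log_pipeline.conf are generated by the connector, but if it follows standard Fluentd syntax, you can expect a file-tailing source of roughly this shape; every value below is an illustrative assumption:
<source>
  @type tail
  # the paths entered in the Log Collection File Path field
  path /opt/bmc/connectors/<connector_name>/logs/applicationLogs/*.log
  tag file.log
  <parse>
    # the format selected in the Format field, for example apache2
    @type apache2
  </parse>
</source>
Check that the path, tag, and parse parameters match the values that you configured in the integration.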