Working with logs


All installation logs are available in the helix-on-prem-deployment-manager/logs directory. BMC Helix Platform uses Elasticsearch, Fluentd, and Kibana (EFK) for logging.

  • Elasticsearch: An open-source search engine and object store that provides distributed REST APIs for logs (see the example query after this list).
  • Fluentd: A data collector that gathers logs from the application nodes and sends them to the Elasticsearch service.
  • Kibana: A web interface for Elasticsearch.
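
For example, to confirm that Fluentd is shipping logs into Elasticsearch, you can list the logstash-* indices through the Elasticsearch REST API. The following command is a minimal sketch: it assumes that the Elasticsearch REST API is reachable on port 9200 of the master node, which might differ in your deployment or might be reachable only from inside the cluster.

    curl 'http://<<masternodeip>>:9200/_cat/indices/logstash-*?v'

The output typically lists one index per day (for example, logstash-YYYY.MM.DD) together with its document count and size.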

To start collecting logs with EFK, perform the following tasks:

  1. Create the default index pattern.
  2. Create search queries to refine your log search.

You can also export the logs to a CSV file. If you need to contact BMC Support about an issue, export the logs to a CSV file and send it to BMC Support.


To create the default index pattern

  1. Go to the Kibana home page.
    Use the following URL to access Kibana:
    http://<<masternodeip>>:5601/
  2. On the left menu, go to Management > Stack Management.
  3. Go to Kibana > Index Patterns and then click Create index pattern.
  4. Define an index pattern name.
    The names of the indices that Fluentd creates start with the word logstash, so you can use logstash-* as the index pattern.
  5. Click Next step.
  6. In the Time field, select @timestamp. Kibana uses this field to filter the logs by time.
  7. Click Create index pattern.
    In the Analytics > Discover tab, you can adjust the time filter to see all the Fluentd logs.
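
To confirm that the new index pattern matches data, you can also query Elasticsearch directly. The following command is a minimal sketch that makes the same assumption as the earlier example, namely that the Elasticsearch REST API is reachable on port 9200 of the master node:

    curl 'http://<<masternodeip>>:9200/logstash-*/_count?pretty'

The count value in the response should be greater than zero after Fluentd starts shipping logs.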


To refine the search by creating search queries

You can create and save search queries to refine your log search. Search queries are useful when you want to download logs to a CSV file. To create a search query, perform the following tasks:

  1. On the Kibana home page, navigate to Analytics > Discover to view your logs.
  2. Filter the logs or search for the logs that you want.
    For example, to view the nginx logs for a Kubernetes pod based on the pod name, use the following search query:
    kubernetes.pod_name : nginx
    Additional example queries follow this procedure.
  3. Save your search.
    The saved search appears at the top of the page.
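
The following additional Kibana Query Language (KQL) examples can help refine a search. The field names kubernetes.namespace_name, kubernetes.container_name, and log are assumptions based on the metadata that Fluentd typically adds to Kubernetes log records, and the namespace and message text are only placeholders; check the field list on the Discover page for the exact field names in your deployment.

    kubernetes.namespace_name : "default" and log : *error*
    kubernetes.container_name : nginx and not log : *timeout*

You can save these queries in the same way and reuse them when you export logs to a CSV file.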


To export logs to a CSV file

  1. Go to the Kibana home page.
    Use the following URL to access Kibana:
    http://<<masternodeip>>:5601/
  2. Navigate to Analytics > Discover to view your logs.
  3. If required, filter the logs or search for the logs that you want.
    For example, to view the nginx logs for a Kubernetes pod based on the pod name, use the following search query:
    kubernetes.pod_name : nginx
  4. Save your search.
  5. Click Share > CSV Reports.
  6. Click Generate CSV.
    The CSV file is generated under Stack Management > Alerts and Insights > Reporting.
  7. To download the file, in the Actions column, click the Download report icon.
    For more information, see the Kibana documentation.

Tip: For faster searching, add an asterisk to the end of your partial query. Example: cert*
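
For example, to match all pods whose names start with cert (the pod name prefix here is only a hypothetical illustration), you can use the following query on the Discover page:

    kubernetes.pod_name : cert*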