Connecting BMC AMI Datastream to Logstash
To run Logstash
Decide which platform to use to run Logstash. Although you can run Logstash on z/OS, we recommend running it on a distributed platform. On z/OS, Logstash runs on the z Application Assist Processor (zAAP), the z Integrated Information Processor (zIIP), or the Integrated Facility for Linux (IFL), and its intensive use of these specialized processors might make them unavailable for other eligible workloads.
You can run Logstash on any of the following platforms:
- On an environment that is supported by a standard Logstash distribution, such as Linux on x86-64. In this case, follow the instructions published for your platform.
- On Linux on IBM Z by using any of the following methods:
  - Download the Logstash source file and build it on your computer. For more information about IBM scripts to build open-source software (OSS) packages on z/Linux, see Scripts. For more information about building Logstash on z/Linux, see Building Logstash.
  - Run Logstash in a Docker container by using any of the following methods:
    - Follow the standard practice for running OSS packages by building your own container image with Docker. For more information about IBM compose files for OSS packages, see Dockerfile examples.
    - Use docker-compose. For more information, see s390x container logging.
    - Use the Dockerfiles provided by IBM to build a container image and run it on zCX instead of z/Linux. For more information, see Dockerfile examples.
- On IBM z/OS by using any of the following methods:
- Run Logstash in an IBM zCX container. Build the container image by using the Docker procedure described earlier.
- Run Logstash as a UNIX System Services (USS) process. Download the Intel x64 package, unzip it, and configure it to use a Java 11 installation on the system instead of the bundled Java runtime.
- Run Logstash in a started task by using IBM JZOS. For instructions, see the IBM documentation.
Download and install Logstash from the Elastic website. Instructions for using Logstash and other relevant information are available on the product website. For more information, see Elastic Logstash.
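Before you connect BMC AMI Datastream, you might want to confirm that the Logstash installation can receive and display events. The following minimal pipeline is only a sketch: the port number is an arbitrary example, and the json_lines codec assumes that the sender emits newline-delimited JSON.
input {
  tcp {
    port => 5514              # any free TCP port; use the port you plan to point BMC AMI Datastream at
    codec => json_lines       # parse newline-delimited JSON events
  }
}
output {
  stdout {
    codec => rubydebug        # print parsed events to the console for inspection
  }
}
Start Logstash with this pipeline (for example, bin/logstash -f <testPipeline>.conf) and send a test event to the port to confirm that it appears on the console.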
To connect BMC AMI Datastream to Logstash
Connecting BMC AMI Datastream to Logstash is similar to connecting BMC AMI Datastream to any other SIEM or server type.
Define the transport protocol by specifying TCP as the value for the TRANSport parameter. For more information about the TRANSport parameter, see SERVER-statement.
BMC AMI Datastream sends data in multiple formats, such as JSON, Splunk, and Syslog. Logstash provides input and output plug-ins for a variety of SIEM types; you must configure an input plug-in to receive the data and an output plug-in to forward it. Define the format of the messages that you want to send by setting the SIEMtype parameter. For more information about the SIEMtype parameter, see OPTIONS-statement. We recommend that you specify AMIJson as the value for the SIEMtype parameter. Setting the SIEMtype parameter to AMIJson generates JSON output with the following additional data fields:
| Data element | Description |
|---|---|
| X-BMC-AGENT | Name of the BMC AMI Datastream instance |
| X-BMC-AGENT-SEQ | Ten-digit message sequence number |
| Product | Name of the product (BMC AMI Datastream) |
| Version | Version number of the product |
| @timestamp | Date and time in yyyy-mm-ddThh:mm:ss.thmZ format |
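These fields can be used directly in the Logstash pipeline. The following filter is only an illustration and is not part of the product documentation: it copies the agent name into the event metadata so that a later output section could, for example, build an index name from it. The [@metadata][agent] field name is an assumption.
filter {
  # X-BMC-AGENT is added by BMC AMI Datastream when SIEMtype is set to AMIJson.
  # Copy it into metadata so later pipeline stages can reference it without indexing it.
  if [X-BMC-AGENT] {
    mutate {
      add_field => { "[@metadata][agent]" => "%{X-BMC-AGENT}" }
    }
  }
}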
Sample Logstash configuration
You can use the following sample Logstash pipeline configuration to receive messages from BMC AMI Datastream and to send the processed output from Logstash to Kafka and Elasticsearch:
input {
  tcp {
    port => <portNumber>                                   # port that BMC AMI Datastream sends to
    codec => json_lines                                    # parse newline-delimited JSON (AMIJson format)
  }
}
filter {
  date {
    match => [ "Time", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]       # set @timestamp from the Time field
  }
}
output {
  kafka {
    bootstrap_servers => "<kafkaBootstrapHostName>:9092"
    topic_id => "quickstart"
    codec => json {}
  }
  elasticsearch {
    hosts => "https://<elasticHostName>:<portNumber>"
    index => "<targetIndex>"
    api_key => "<generatedApiKey>"
    ssl_certificate_verification => false                  # disable certificate verification (test environments only)
  }
}
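Save the pipeline in a configuration file and start Logstash with it (for example, bin/logstash -f <pipelineFile>.conf), replacing the placeholder values (<portNumber>, <kafkaBootstrapHostName>, and so on) with values for your environment.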
You can secure the connection between BMC AMI Datastream and a remote (non-mainframe) Logstash instance by using Application Transparent Transport Layer Security (AT-TLS).
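When AT-TLS encrypts the connection on the mainframe side, the Logstash TCP input must also terminate TLS. The following input section is only a sketch: it assumes the current option names of the logstash-input-tcp plug-in (older plug-in versions use different names) and placeholder paths for the server certificate and private key.
input {
  tcp {
    port => <portNumber>
    codec => json_lines
    ssl_enabled => true                                    # accept TLS connections from AT-TLS
    ssl_certificate => "<pathToServerCertificate>"         # server certificate (placeholder path)
    ssl_key => "<pathToPrivateKey>"                        # private key for the certificate (placeholder path)
  }
}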