This section contains information about enhancements in version 2.1.00 of the BMC TrueSight IT Data Analytics product.
In addition to the authentication built into IT Data Analytics, you can now use BMC Atrium Single Sign-On to authenticate users. Using this authentication method is recommended if you want to use IT Data Analytics in conjunction with BMC TrueSight Presentation Server.
For more information, see User authentication options.
With this support, user authorization is handled differently. The following table describes the changes in the user authentication approaches.
|Version 2.0 and earlier||Version 2.1 and later|
The IT Data Analytics product has been tested on operating system platforms and browsers in the following languages. Additionally, you can collect and search data in these languages:
Thus, the product supports the following capabilities:
To see the list of capabilities available in the supported languages, see Language information.
With version 2.1.00, you can create IT Data Analytics dashlets in BMC TrueSight Presentation Server (on the TrueSight console).
This feature requires you to first configure BMC Atrium Single Sign-On in IT Data Analytics.
The following table provides links that you can use to understand the dashlet creation process.
|Learn how to create dashlets|
|Learn how to configure Atrium Single Sign-On||Deploying IT Data Analytics with BMC Atrium Single Sign-On|
Version 2.1.00 provides you the capability of creating a data pattern by using a simple wizard that automatically identifies the:
To create the data pattern, you need to run the wizard and provide sample data from your data file. The wizard automatically detects the date format based on the sample data. You can even customize this format. After looking at the date format, you can choose to directly save the data pattern or perform advanced field extraction. For more information, see Creating data patterns.
Editing and cloning of data patterns is supported in the same way as previous versions of the product. For more information, see Editing or cloning data patterns.
DATA COLLECTION EASE OF USE
You can now enable redundancy for data collection by including the Collection Stations in one common pool.
Redundancy can help in scenarios where the Collection Station goes down or fails for some reason. In such a scenario, the Collection Agents communicating with that Collection Station are redirected to the other Collection Stations operating in the same pool.
Thus, redundancy can minimize losing data mid-way when the Collection Station goes down. For more information, see Data collection redundancy.
You can now compare search results not only across different time contexts, but also against a different search query.
Also, the UI for comparing results has been simplified for a better user experience.
You can compare results in the following ways:
For more information, see Comparing results.
ANALYTICS SEARCH EASE OF USE
You can now collect all kinds of events (or logs) configured in your Windows environment. This means while creating the data collector, you can select all the log types that you want to collect and analyze. Note that to be able to collect all kinds of events, you need to use version 2.1.00 of the Collection Station (or Collection Agent). If you continue to use an Agent from an earlier version, then you can only collect the Application, Security, and System logs.
You can collect events both locally and remotely. To enable remote event collection, you need to perform some configuration steps. These steps vary as compared to the earlier release and also depend on other factors, such as whether you want to use a Windows or Linux computer as your collection host. For more information about the various factors and the configuration steps, see Configurations required before collecting Windows events.
You can collect log4j output directly into IT Data Analytics over the TCP or UDP protocol.
To do this, you need to add the SyslogAppender to the log4j.properties file of the application whose logs you want to collect, and then create a Receive over TCP/UDP data collector.
For more information, see Use case for sending log4j output directly into IT Data Analytics.
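As an illustrative sketch, a log4j 1.x properties file can route output through the SyslogAppender over UDP. The host name, port, appender name, facility, and layout pattern below are placeholders; substitute the values of your own Receive over TCP/UDP data collector.

```
# Hypothetical log4j.properties fragment (host, port, and layout are placeholders)
log4j.rootLogger=INFO, itda
log4j.appender.itda=org.apache.log4j.net.SyslogAppender
log4j.appender.itda.SyslogHost=collection-host.example.com:514
log4j.appender.itda.Facility=LOCAL0
log4j.appender.itda.layout=org.apache.log4j.PatternLayout
log4j.appender.itda.layout.ConversionPattern=%d{ISO8601} %-5p [%c] %m%n
```

Note that the log4j 1.x SyslogAppender sends over UDP, so the data collector must be configured to receive on the matching protocol and port.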
You can use the following new field types to save decimal floating point field values.
This feature can help you run statistical operations such as finding the minimum (smallest) value, maximum (largest) value, average of values, and so on. This can be done by running statistical search commands such as stats and timechart.
These field types can be assigned while planning the fields that you want to extract, at the time of creating a data pattern.
For more information, see Understanding field types.
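As a hypothetical sketch of such a statistical operation (the field name response_time is an assumed example, not a built-in field), a field saved with a decimal floating point type could be aggregated in a query of the following shape, written in the same style as the example queries later in this section:

```
* | stats min(response_time), max(response_time), avg(response_time) by HOST
```

The exact command options available depend on your product version; see the stats and timechart command references for the supported syntax.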
While specifying the search criteria, you can use search tools for selecting certain default fields and the tags present in the system. This capability helps you search more effectively and find exactly what you are looking for without knowing the exact search syntax.
Additionally, this capability simplifies your search syntax and avoids the need for specifying a long and complex search string (especially while running search commands).
For more information, see Filtering your search results.
SEARCH EASE OF USE
On the Search page, as you type your search query, you can see a list of type-ahead search suggestions.
These suggestions are automatically generated based on the most frequently run searches, which include:
For more information, see Type-ahead search suggestions.
SEARCH EASE OF USE
Compared to previous releases, the search command combinations listed in the following table produce better results. Earlier, these search command combinations produced results that were limited to a particular number of events, determined by the value of the search.events.fetch.limit property located in the searchserviceCustomConfig.properties file.
For more information, see Modifying the configuration files.
The following table provides search command combinations that produce better results:
|Search command combinations||Example search queries|
The filter command followed by one or more of the following commands:
The extract command followed by one or more of the following commands, when the extracted field is not used in the subsequent commands:
* | extract field=".*flower_store/(?<Screen>\w+).*" | filter ISNOTNULL(response) | rare limit=10 HOST by item_id
In addition to the Text View and Chart View, a new Table View is available that displays search results in a table format. In the Table View, each column represents a field, while each row represents an individual record (or event) categorized into columns based on fields.
This capability enables you to read and understand your data better. It also enables you to easily identify important portions of the data represented in the form of fields. Furthermore, you can control the number of fields (represented as columns) that are displayed in the table. You can even resize the columns, add particular portions of the data to your search criteria, and add field names to the Filters panel.
For more information, see Viewing and understanding search results.
SEARCH EASE OF USE
While creating notifications, you can provide script paths as your notification destination. Each time the condition for sending a notification is met (for example, Number of results > 100), the script is run. The script must contain the instructions for sending the notification.
For more information, see Creating notifications.
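The following is a minimal sketch of such a notification script, under the assumption that the product simply executes the script when the condition is met; the alert text, log path, and delivery mechanism shown here are all hypothetical and should be replaced with whatever your site actually uses (mail, a webhook call, a ticketing CLI, and so on).

```shell
#!/bin/sh
# Hypothetical notification script: IT Data Analytics runs this when the
# notification condition is met. No assumptions are made about arguments
# passed by the product; the alert text below is a fixed placeholder.
ALERT_LOG="${ALERT_LOG:-/tmp/itda_alerts.log}"

notify() {
    # Append a timestamped alert line to a local log file; replace this
    # with mail, curl to a webhook, or another delivery mechanism.
    echo "$(date '+%Y-%m-%d %H:%M:%S') ITDA alert: $1" >> "$ALERT_LOG"
}

notify "Number of results > 100"
```

Because the script itself decides how the notification is delivered, the same mechanism can fan out to several channels from one saved search.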
This feature allows you to select multiple saved searches and define a condition for each saved search.
This condition defines the number of results based on which the alert is triggered.
Furthermore, you can define a high-level condition (operation) that determines whether all of the conditions (AND operation) or any one of the conditions (OR operation) must be met for the alert to be triggered.
For more information, see Creating notifications.
As an administrator, you can use CLI commands to perform maintenance and operational tasks quickly and easily, without using the product Console.
The following table describes the new commands available:
|Command||Can be used to...|
Associate one or more collection profiles with a host.
Enables you to automatically create data collectors based on the templates included in the collection profiles.
Move a Collection Station in or out of the ITDAPool.
Enables you to control and administer the Collection Stations to be included or excluded from the pool while applying data collection redundancy.
Move objects created by one user to another user with the same (or higher) role.
Enables you to easily transfer ownership of various objects to the user expected to replace a user being deleted.
Configure the BMC Atrium Single Sign-On server after an upgrade; also helps you move existing IT Data Analytics users (along with associated user groups) to Atrium Single Sign-On.
If you find that some users are not moved successfully, then you can use the transferownership command to move objects owned by that user to another user in Atrium Single Sign-On.
Enables you to move to Atrium Single Sign-On as your authentication mechanism after an upgrade.
This version provides various UI enhancements that are aimed at:
These enhancements are listed as follows:
The Receive over TCP/UDP (Syslog etc) data collector has been renamed to Receive over TCP/UDP. Also, by default, the data pattern for this data collector is now set to Free Text without Timestamp.
EASE OF USE