Adding and managing data movers

Overview of Data Mover

A Data Mover is a task responsible for moving data from a source that, in general, does not comply with the TrueSight Capacity Optimization standard data formats (see Importing data from custom sources). Moved data is not collected into the TrueSight Capacity Optimization Data Warehouse, but is saved on the file system in CSV or XML format.

For example, by using a Data Mover you can extract any type of data from a database and:

  • Move it to files stored in the /repository folder.
  • Move it to external locations (data feeds), for example via FTP or using Windows shares.

Examples of data collected by a Data Mover include:

  • Results from a custom query whose number of columns might vary over time.
  • Textual information, such as system diagnostics.

You can add a new Data Mover by navigating to Administration > ETL & SYSTEM TASKS and then selecting the Add data mover option from the Add drop-down menu at the top of the ETL tasks page. You can then configure it by manually adding properties.

Overview of Query Data Mover

A Query Data Mover is a type of Data Mover task used to:

  • Open a connection to a specified database.
  • Launch one or more queries.
  • Save the results into XML files (one file for each query).

To configure the database connection

By default, a connection to TrueSight Capacity Optimization's internal database is opened, because the primary purpose of a Data Mover is to automatically export data from TrueSight Capacity Optimization to an external location.

To connect to other data sources, modify the following properties:

datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
datasource.url=jdbc:sqlserver://10.XXX.XXX.XXX:2434;databaseName=MYDBNAME;
datasource.password=****
datasource.username=USER

Additional information

If the database you are connecting to is not Oracle, the JDBC driver .jar archive file must be copied to the lib directory of the scheduler installation directory. Also, the format of the datasource.url parameter depends heavily on the JDBC driver. For more information, refer to the driver's official documentation.
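
For example, a connection to a PostgreSQL source might be configured as in the following sketch. The driver class name and URL format are the standard ones for the PostgreSQL JDBC driver (whose .jar, as noted above, must be copied to the scheduler's lib directory); the host, port, database name, and credentials are placeholders that you need to replace:

datasource.driverClassName=org.postgresql.Driver
datasource.url=jdbc:postgresql://dbhost.example.com:5432/MYDBNAME
datasource.username=USER
datasource.password=****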

To configure queries

Each query is defined by a set of three properties:

  • <queryname>.query: SQL code of the query.
  • <queryname>.description: Description of the query.
  • <queryname>.filename: Name of the file in which results will be saved.

All <queryname> values must be specified as a semicolon-separated list in the query.name.list property.

All files are created in the base directory specified by the output.directory property. File names can contain macros, for example, %{DATE(yyyy-MM-dd)}. The date macro takes a Java date format string and works in both the file output and zip output fields. For a list of all macros that can be used in the ETL configuration, see ETL configuration macros.
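
For example, assuming the task runs on 28 February 2022 at 14:05 and that the macro accepts any Java date pattern, the following illustrative filename property would produce a file named test_check_01_20220228_1405.xml:

test_check_01.filename=test_check_01_%{DATE(yyyyMMdd_HHmm)}.xml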

Configuration example:

datamover.type=query
datasource.driverClassName=oracle.jdbc.driver.OracleDriver
datasource.password=*******
datasource.url=jdbc:oracle:thin:@mydatabase.domain.it:1526:SID
datasource.username=cpit2_app
format.type=xml
columnnames.uppercase=true
mask.ignorevalidation=false
zip.enabled=false
output.directory=$CPITBASE/repository/xdata/datamover_files

query.name.list=test_check_01;test_check_02
test_check_01.description=TEST - query 1
test_check_01.filename=test_check_01_%{DATE(yyyy-MM-dd)}.xml
test_check_01.query=SELECT trunc(sysdate) as ts FROM dual
test_check_02.description=TEST - query 2
test_check_02.filename=test_check_02_%{DATE(yyyy-MM-dd)}.xml
test_check_02.query=SELECT count(*) as task_number from task

The format.type property can be set to either xml or csv.
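
For example, to make the same task write comma-separated files instead of XML, you could change the relevant properties of the example above as follows; only the output format and the file extensions change:

format.type=csv
test_check_01.filename=test_check_01_%{DATE(yyyy-MM-dd)}.csv
test_check_02.filename=test_check_02_%{DATE(yyyy-MM-dd)}.csv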

To send output files to external locations

After the execution of a Data Mover, you can choose to upload the resulting files to external locations via FTP, CIFS (Windows shares), SCP, SFTP or WEB_DAV.

Additional information

If you configure more than one protocol, the upload is executed once for each configured protocol.

The following code is an example of how you can configure the upload using different services:

post.action.number=1
post.1.action.type = UPLOAD
post.1.uploader.number.files = 2
post.1.uploader.file.1.location = /opt/cpit/repository/test.xml
post.1.uploader.file.2.location = /opt/cpit/repository/file.dbd

//FTP
post.1.uploader.ftp.username = user
post.1.uploader.ftp.password = pass
post.1.uploader.ftp.host = host
//remote.folder is optional
post.1.uploader.ftp.remote.folder = NEP_SHARE/

//CIFS
post.1.uploader.cifs.username = user
post.1.uploader.cifs.password = pass
post.1.uploader.cifs.workgroup = WORKGROUP
post.1.uploader.cifs.remote.folder = storage/
post.1.uploader.cifs.host = nepsrv22

//SCP
post.1.uploader.scp.username = user
post.1.uploader.scp.password = pass
post.1.uploader.scp.remote.folder = /root/TEST
post.1.uploader.scp.host = 10.10.10.13

//WEB DAV (webdav property prefix assumed; verify against your scheduler version)
post.1.uploader.webdav.username = user
post.1.uploader.webdav.password = pass
post.1.uploader.webdav.remote.folder = webdav/
post.1.uploader.webdav.host = 10.10.10.13

//SFTP
post.1.uploader.sftp.username = user
post.1.uploader.sftp.password = pass
post.1.uploader.sftp.remote.folder = /home/cpit
post.1.uploader.sftp.host = 10.10.10.13
