Note

 

This documentation supports the 20.19.01 version of Remedyforce.

To view the latest or an earlier version, select the version from the Product version menu.

Importing data from Dell KACE


Dell KACE is a provider of systems management appliances. You can use the appliance-based approach to manage desktops, laptops, and servers. For more information, see https://software.dell.com/kace/. For information about enabling staff members to work with the Dell KACE Console, see Integrating BMC Helix Remedyforce with Dell KACE.

You can import data from Dell KACE into BMC Remedyforce CMDB 2.0 by using the Pentaho Data Integration tool. The Pentaho package for importing data from Dell KACE is available on the BMC Communities website. For information about how configuration items (CIs) are imported and relationships are created in CMDB 2.0, see Overview of how data is imported into BMC Remedyforce.

The following topics provide information about importing data from Dell KACE into BMC Remedyforce CMDB:

Note

For information about the Dell KACE versions that BMC Remedyforce supports, see Supported browsers, mobile devices, and integrations.


Before you begin

Before you can import data from Dell KACE into BMC Remedyforce CMDB, you must perform the following tasks:

To import data from Dell KACE


  1. To launch the Pentaho Data Integration tool, perform the following actions:
    1. Navigate to the location where you downloaded and unzipped the Pentaho Data Integration tool.

    2. Navigate to the data-integration folder and double-click the Spoon.bat file.

  2. In Pentaho Spoon, select File > Open, navigate to the folder where you downloaded the Pentaho package, and open the appropriate KJB file.
    For example, to import LAN Endpoint CIs into BMC Remedyforce CMDB, open the TransferDELLKACELANEndPointInfotoCMDB.kjb file in the LANEndpoints folder.
  3. In the KJB file, right-click the Update BE table with Dell KACE <CI type name> information step, select Open referenced object, and then select Transformation.

  4. Enter the information required to access the Dell KACE database.

    1. In the KTR file, double-click the Dell KACE Input step.
    2. Next to the Connection list, click Edit.
    3. In the Database Connection window, enter the host name, database name, user name, and password.
      The host name is the name of the computer where the Dell KACE database is installed. Ensure that MySQL is selected as the Connection Type and Native (JDBC) is selected as the Access value.

    4. (Optional) To verify the connection, click Test.
    5. To save your changes and close the window, click OK.
  5. Enter the information required to access your Salesforce organization.

    1. Double-click the Upsert into class <CI type name> step.
    2. In the Salesforce Upsert window, enter your Salesforce organization user name and password.
      BMC recommends that you do not change the default API version in the Salesforce Webservice URL.
    3. (Optional) To test the connection, click Test connection and then click OK.
    4. To save your changes and close the window, click OK.
    5. Double-click the Salesforce Input[CMDB_Class] step and in the Salesforce Input window, repeat step 5b to step 5d.

    Important

    If you have enabled the setting to access your Salesforce organization from limited IP addresses, you must append the security token to your Salesforce organization password.

    For example, if the password for your Salesforce organization is mypassword and your security token is XXXXXXXXX, specify mypasswordXXXXXXXXX in the Password fields.

  6. (Optional) Update the predefined mapping between Dell KACE fields and BMC Remedyforce CMDB.

    1. Double-click the Upsert into class <CI type name> step.
    2. In the Salesforce Upsert window, click Edit Mapping.
    3. In the Enter Mapping window, update mappings for fields based on your requirements.
      For information about updating the out-of-the-box mapping, see Field mapping in CMDB 2.0.
    4. To save your settings and close the Enter Mapping window, click OK.

    5. To save your changes and close the Salesforce Upsert window, click OK.
  7. To save the KTR and KJB files, click Save.
  8. In the KJB file tab, click Run this job.
  9. Perform one of the following actions based on the Pentaho version that you are using:

    Pentaho version 6.1: In the Run Options window, click Run.
    Pentaho version 5.4: In the Execute a job window, click Launch.

    Data is imported into the appropriate CMDB class.

    Transformation status is indicated by an icon for each of the following states:

    • Complete
    • Running
    • Unsuccessful
  10. Verify whether the entries are correct in the Relationships Editor of the appropriate CMDB class.
  11. (Optional) To view logs, in the Execution results section, click the Logging tab.
    All errors are displayed in red.
    For the imported records, the Source field is set to KACE. You can use the Source field to generate reports and also when you enable and configure reconciliation. 

    Warning

    If a user works on two different computers, the Person package imports two records into the CMDB for that user.

    You can launch the Dell KACE console to view the details of the records that have been imported from Dell KACE.


To schedule jobs to import data

On the BMC Communities website, the Pentaho packages for importing data from external data sources into BMC Remedyforce page contains a Scheduling Files folder. This folder contains a sample batch file that you can use as a base to create your own batch files for scheduling data import. For more information about scheduling a job, see http://wiki.Pentaho.com/display/EAI/Kitchen+User+Documentation. A command-line alternative to the Task Scheduler wizard is also sketched after the following procedure.
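For reference, a minimal scheduling batch file might look like the following sketch. This is a sketch, not the exact sample file shipped in the Scheduling Files folder; the installation path, package location, and log file name are placeholders that you must adapt to your environment.

    @echo off
    REM Minimal sketch of a scheduling batch file (all paths are placeholders).
    REM Kitchen.bat is the Pentaho Data Integration command-line job runner.
    set PDI_HOME=C:\pentaho\data-integration
    set JOB_FILE=C:\PentahoPackages\DellKACE\ComputerSystem\TransferDELLKACEComputerSystemInfotoCMDB.kjb
    call "%PDI_HOME%\Kitchen.bat" /file:"%JOB_FILE%" /level:Basic /logfile:"C:\PentahoPackages\Logs\ComputerSystem.log"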

  1. Navigate to the folder where you have extracted the Pentaho package for which you want to schedule import jobs.
  2. Open the Scheduling Files folder and create a copy of the sample batch file, such as SchedulingComputerSystem.bat.
  3. Go to Start > Control Panel > Administrative Tools > Task Scheduler.
  4. In the Task Scheduler window, in the Actions area, click Create Basic Task.
  5. In the Create Basic Task Wizard, type the name and description of the task, and click Next.
  6. Select the appropriate option for starting the task, and click Next.
    For example, you can select Daily or When the computer starts as the option for starting the task.
  7. Configure the additional options for starting the task, and click Next.
    The additional options are displayed based on the option that you selected in step 6. For example, if you select the Daily option for starting the task, you must configure the start date and the frequency at which you want to run the batch file.
  8. Select the Start a program radio button, and click Next.
  9. Click Browse to locate the batch file that you have created to schedule the job, and click Next.
  10. Select the Open the Properties dialog for this task when I click Finish check box, and click Finish.
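
If you prefer to register the scheduled task from a command prompt instead of the Create Basic Task Wizard, the Windows schtasks utility can create an equivalent daily task. The task name, start time, and batch file path below are placeholders.

    schtasks /Create /TN "Remedyforce KACE import" /TR "C:\PentahoPackages\DellKACE\SchedulingFiles\SchedulingComputerSystem.bat" /SC DAILY /ST 02:00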

User scenarios for importing data from Dell KACE


David is a member of the Infrastructure team at Downtown Bank and is responsible for managing the laptops for the ABC project. He signs up for BMC Remedyforce and expects to use the preconfigured, ITIL-based incident and problem management processes for managing CIs. David also wants to use the CMDB to represent physical, logical, and conceptual items and to track the relationships between the different CIs.

 

Data to be imported: Operating System, Processor, Person, and LAN Endpoint CIs
Steps to follow:
  1. Run the \CombinedJobs\TransferDELLKACEinfotoCMDB.kjb file.
  2. Go to Remedyforce CMDB and verify whether the entries are correct in the Relationships Editor of the following classes:
    • Operating System
    • Processor
    • Person
    • LAN Endpoint

Data to be imported: Operating system details of the laptops
Steps to follow:
  1. Run the \ComputerSystem\TransferDELLKACEComputerSystemInfotoCMDB.kjb file.
  2. Run the \OS\TransferDELLKACEOSInfotoCMDB.kjb file.
  3. Go to Remedyforce CMDB and open the Operating System class.
  4. Verify whether the entries are correct in the Relationships section in the Instance Editor of the Operating System class.

Data to be imported: Processor details of the laptops
Steps to follow:
  1. Run the \ComputerSystem\TransferDELLKACEComputerSystemInfotoCMDB.kjb file.
  2. Run the \Processor\TransferDELLKACEProcessorInfotoCMDB.kjb file.
  3. Go to Remedyforce CMDB and open the Processor class.
  4. Verify whether the entries are correct in the Relationships Editor of the Processor class.

Data to be imported: Software server details of the laptops
Steps to follow:
  1. Run the \CombinedJobs\TransferDELLKACEComputerSystemAndSoftwareServerinfotoCMDB.kjb file.
  2. Go to Remedyforce CMDB and open the Software Server class.
  3. Verify whether the entries are correct in the Relationships Editor of the Software Server class.

A minimal batch file that chains the ComputerSystem and OS jobs from the operating system scenario is sketched after this table.
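
The following sketch shows how the operating system scenario could be run unattended by chaining the ComputerSystem and OS jobs with Kitchen.bat. All paths are assumptions that you must adjust to match where you extracted the Pentaho package.

    @echo off
    REM Sketch only: run the ComputerSystem job first so that computer system CIs
    REM exist before the operating system CIs are imported and related to them.
    set PDI_HOME=C:\pentaho\data-integration
    set PKG=C:\PentahoPackages\DellKACE
    call "%PDI_HOME%\Kitchen.bat" /file:"%PKG%\ComputerSystem\TransferDELLKACEComputerSystemInfotoCMDB.kjb" /level:Basic
    if errorlevel 1 goto end
    call "%PDI_HOME%\Kitchen.bat" /file:"%PKG%\OS\TransferDELLKACEOSInfotoCMDB.kjb" /level:Basic
    :end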

KJB and KTR files for importing data from Dell KACE


The Pentaho packages contain job (KJB) and transformation (KTR) files that are created in the Pentaho Data Integration tool. The KJB files (jobs) and KTR files (transformations) store metadata in XML format. A KJB file contains a series of transformations that are run in a sequence. A KTR file contains a single transformation. The KJB (job) files retrieve data from external sources into BMC Remedyforce CMDB.

For information about the KJB and KTR files in the Pentaho packages for Dell KACE, see KJB files in the Pentaho package and KTR files in the Pentaho package.

KJB files in the Pentaho package

The Pentaho package includes different KJB (job) files for importing different types of CIs, such as computer systems and LAN endpoints, from Dell KACE. These job files are bundled with related files and provided to you in folders. The folder names correspond to CI types in BMC Remedyforce CMDB.

The following table provides information about the job files that you must run to import specific CI types to BMC Remedyforce CMDB:

Folder name (CI type): Computer System
Job file: TransferDELLKACEComputerSystemInfotoCMDB.kjb
Description: Transfers information about the BMC_ComputerSystem CI type.

Folder name (CI type): LANEndPoints
Job file: TransferDELLKACELANEndPointInfotoCMDB.kjb
Description: Transfers information about the BMC_LANEndpoint CI type.

Folder name (CI type): OS
Job file: TransferDELLKACEOSInfotoCMDB.kjb
Description: Transfers information about the BMC_OperatingSystem CI type.

Folder name (CI type): Person
Job file: TransferDELLKACEPersonInfotoCMDB.kjb
Description: Transfers information about the BMC_Person CI type.

Folder name (CI type): Processor
Job file: TransferDELLKACEProcessorInfotoCMDB.kjb
Description: Transfers information about the BMC_Processor CI type.

Folder name (CI type): Software Server
Job file: TransferDELLKACESoftwareServerInfotoCMDB.kjb
Description: Transfers information about the BMC_SoftwareServer CI type.
Note: Depending on the organization size, running the software server job transfers large amounts of data to the Salesforce organization, which might lead to exceeding the organization storage limit.

Folder name (CI type): Combined Jobs
Job files: TransferDELLKACEinfotoCMDB.kjb and TransferDELLKACEComputerSystemAndSoftwareServerinfotoCMDB.kjb
Description:
  • TransferDELLKACEinfotoCMDB.kjb transfers information related to the BMC_ComputerSystem, BMC_OperatingSystem, BMC_Processor, BMC_Person, and BMC_LANEndpoint CI types.
  • TransferDELLKACEComputerSystemAndSoftwareServerinfotoCMDB.kjb transfers information related to the BMC_ComputerSystem and BMC_SoftwareServer CI types.

KTR files in the Pentaho package

All job (KJB) files for importing data from specific CI types contain a series of transformations that are run in a sequence. Each transformation maps to a KTR file that is available, along with the KJB file, in the folder for each CI type.

The following table provides information about the KTR files and the corresponding transformations that the KJB file for each CI type contains:

KTR file: None
Transformation or step in the KJB file: Check If Delta Timestamp File Exists
Description: Checks whether a time stamp file exists. The Pentaho package uses a time stamp file to determine which records were added or modified since the last time the job was run. This step is used for incremental import. If a time stamp file does not exist, the Create the initial time stamp file step is run; otherwise, the Update BE table with Dell KACE <CI type name> Information step is run.

KTR file: CreateInitialTimeStampedFile.ktr
Transformation or step in the KJB file: Create the initial time stamp file
Description: Creates the time stamp file to record the time of import. This step is run only if you are importing for the first time or have deleted the existing time stamp file.

KTR file: TransferDELLKACE<CI Type>InfotoCMDB.ktr
Transformation or step in the KJB file: Update BE table with Dell KACE <CI type name> Information
Description: Transfers data from Dell KACE to the Base Element object of the BMC Remedyforce CMDB.

KTR file: Store<CI Type>InfoTimestamp.ktr
Transformation or step in the KJB file: Store the current time stamp
Description: If the import of data is successful, the time stamp file is updated with the current time of import.

For example, the ComputerSystem job file (TransferDELLKACEComputerSystemInfotoCMDB.kjb) contains the CreateInitialTimeStampedFile.ktr, TransferDELLKACEComputerSystemInfotoCMDB.ktr, and StoreComputerSystemInfoTimestamp.ktr files.

The following table provides information about the steps that are included in the TransferDELLKACE<CIType>infotoCMDB.ktr file (Update BE Table With DELL KACE <CI type name> information transformation). The Pentaho package runs these steps to transfer data from Dell KACE into BMC Remedyforce CMDB. You can view these steps only when you open the KTR file in the Pentaho Data Integration tool.

Step: Delta timestamp
Description: Reads the saved time stamp.

Step: Dell KACE input
Description: An SQL query that fetches information from Dell KACE based on the stored time stamp.

Step: Transform Dell KACE variables
Description: Facilitates JavaScript transformations, such as appending or trimming fields.

Step: Update BE table with Dell KACE <CI type name> information
Description: Accepts the destination of the data and the credentials for the Salesforce organization where you want to save the imported data.

Step: Salesforce Input[CMDB_Class]
Description: Accepts the access details of the Salesforce organization to fetch details of the BMC Remedyforce CMDB classes.

Step: Success rows
Description: Stores the rows that are imported successfully.

Step: Failure rows
Description: Stores the rows that are not imported, along with the error code, error description, and error fields.


Troubleshooting

The following troubleshooting tips help you resolve common issues that you might face when importing data. Each entry lists the error or issue, the Pentaho packages to which it applies, and the description or procedure for resolving it.

Error or issue: Viewing logs of the import jobs
Applies to: All

Log files are created in the folder where you have saved the KJB files. Success and failure rows files are also created in the same folder.

If a failure occurs, the error code and its description are provided in the failure row file. You can also use the failure row file to import data to Salesforce.

Error or issue: Not all records are imported
Applies to: All
Delete the delta time stamp file of the job that you ran, and run the job again, as shown in the sketch below.
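For example, a batch file along the following lines deletes the time stamp file and reruns the job from the command line. The time stamp file name and all paths are hypothetical; use the file that your package actually creates.

    @echo off
    REM Hypothetical example: remove the delta time stamp file of the ComputerSystem job
    REM so that the next run performs a full import, and then rerun the job.
    set PKG=C:\PentahoPackages\DellKACE
    if exist "%PKG%\ComputerSystem\ComputerSystemTimestamp.txt" del "%PKG%\ComputerSystem\ComputerSystemTimestamp.txt"
    call "C:\pentaho\data-integration\Kitchen.bat" /file:"%PKG%\ComputerSystem\TransferDELLKACEComputerSystemInfotoCMDB.kjb" /level:Basic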
Error or issue: Import fails
Applies to: All
If you have upgraded to BMC Remedyforce Winter 17 (20.17.01), either use the latest Pentaho packages available on the BMC Communities website or map a value to the Source field of the Base Element object. Also, ensure that you have Edit permission on the Source field of the Base Element object.
An "out of memory" error occursAll

While importing a large number of records, if you get the OutofMemoryError or Java heap size error message, increase the heap size in the Spoon.bat file.

  1. Navigate to the location where you downloaded and unzipped the Pentaho Data Integration tool.
  2. Navigate to the data-integration folder.
  3. Right-click the Spoon.bat file, and select Edit.
  4. Locate the following line and replace 512 with a higher value, as shown in the example after these steps:
    If "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xmx512m" "-XX:MaxPermSize=256m"
  5. Click Save.
  6. Relaunch the Spoon.bat file and rerun the job file.
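
For example, to raise the heap to 2 GB, the modified line might read as follows (2048 is only an illustrative value; size it according to the memory available on your computer):

    If "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m" "-XX:MaxPermSize=256m"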
A "too many script statements" error occursAllWhen you run an import, if you receive "Too many script statements" as an onscreen Apex error message or in an email, reduce the batch size by 10 in the Batch Size field in the Settings section on the Salesforce Upsert window.
Error or issue: Records are not being imported to your CMDB
Applies to: All
Raise a case with Salesforce to create a custom index on the Assembly ID field of the Base Element object.
Error or issue: Importing users by using the failure rows file for LDAP
Applies to: LDAP server

A failure rows file is a text file that is saved in the folder where you have saved your KJB files. Perform the necessary steps to resolve the errors reported for the failed rows in the failure rows file, and then import the data by using the following steps.

The LDAP Pentaho packages provide a transformation file in the FailureRowsInput folder. You can use this transformation file to import users from the failure rows file into the Salesforce organization.

To import data from the failure rows file

  1. Double-click and open the FailureRowsInput folder.
  2. Double-click and open the LDAP_FAILURE_ROWS_<YYYYMMDDHHMMSS> file, where YYYYMMDDHHMMSS is the time stamp of the file.
    The first line in the file is the header row; details of the failed imports are displayed from the second line onward.
  3. For each failed import record:
    1. Read the information under the headings Error Description, Error Fields, and Error Codes.
    2. Resolve the error.
  4. Open the FailureLDAP transformation file with the Spoon batch file of the Pentaho Data Integration tool.
  5. Double-click the Salesforce Upsert step, and enter your Salesforce organization username and password in the Connection section.
  6. (Optional) To verify the connection, click Test connection.
  7. Click OK.
  8. To save the FailureLDAP transformation file, click Save.
  9. Click Run this transformation.
  10. In the Execute a transformation window, click Launch.
Error or issue: Failed in writeToSalesForce: java.lang.IllegalArgumentException: The char '0x0' after 'Print' is not a valid XML character
Applies to: All
Some Unicode characters that cannot be parsed by the XML parser are present in one of the mapped fields. Either delete these characters from the data or delete the mapping of such fields.
Error or issue: Unable to query Salesforce
Applies to: All
Check your Salesforce organization credentials in the Salesforce Upsert and Salesforce Input(CMDB_Classes) steps.
Error or issue: The job does not appear in Atrium Integrator
Applies to: BMC Atrium CMDB
Ensure that you have saved your KTR and KJB files in a folder in Atrium Integrator.
Error or issue: Error setting value #2 [ARDelta_1 Integer] on prepared statement (Integer)
Applies to: BMC Atrium CMDB
This error is generated if multiple records are created in the BMC Remedy AR System NGIE:Delta form for the running transformation. Delete the additional ARDelta records that were created for the failing transformation file. To find the duplicate records for a transformation file, open the NGIE:Delta form in BMC Remedy AR System, enter the transformation file name in the TransName field, and click Search.
Error or issue: Did not find Remedy Application Service password for server <server name> in UDM:RAppPassword Form on server <server name>
Applies to: BMC Atrium CMDB
Check your Atrium Server Connection credentials in the ARInput step.
Error or issue: 'Oracle Database Server 10g Release 2' is not valid for the type xsd:<data type>
Applies to: BMC Atrium CMDB
Check the data type of the mapped fields in the Salesforce Upsert step.
Error or issue: Duplicate LAN Endpoint entries are created in BMC Remedyforce CMDB
Applies to: BMC Client Management

In BMC Remedyforce Summer 15 and earlier versions, when you imported LAN Endpoint data from BMC Client Management, duplicate LAN endpoint entries might have been created in BMC Remedyforce CMDB. This duplicate data was created because of the following factors:

  • In BMC Client Management, a LAN endpoint could periodically obtain multiple IP addresses using DHCP.
  • In BMC Remedyforce Summer 15 and earlier, the Network Interface IP address was used as the unique source identifier for LAN endpoints imported from BMC Client Management.

In BMC Remedyforce Summer 15 Patch 1 and later versions, instead of the Network Interface IP address, the MAC address is used as the unique source identifier for the LAN endpoints imported from BMC Client Management, and new duplicate LAN endpoint entries are not created in BMC Remedyforce CMDB.

To resolve this issue, you must install the latest Pentaho package for BMC Client Management released with BMC Remedyforce Summer 15 Patch 1 or later. Also, you must manually delete any existing duplicate LAN endpoint entries in Remedyforce CMDB. For more information, see Deleting configuration items.

Error or issue: The HTTP Status 404 error is displayed when trying to connect to Salesforce
Applies to: All

Ensure that your Pentaho transformations connect to https://login.salesforce.com/services/Soap/u/<API version> instead of https://www.salesforce.com/services/Soap/u/<API version>.

Starting from January 1, 2016, Salesforce retired www.salesforce.com as an API endpoint. For more information, see the announcement on the BMC Communities website. To view a video demonstration of how to update your Pentaho transformations, see Salesforce API Endpoint Retirement.

BMC Remedyforce has updated the API endpoint in the Pentaho packages that are currently available on the BMC Communities website.

Error or issue: Error connecting to your Salesforce organization when using version 5.4 of the Pentaho Data Integration Tool
Applies to: All

Check whether the Require TLS 1.1 or higher for HTTPS connections Salesforce critical update is enabled in your organization.

Pentaho Data Integration Tool 5.4 does not support TLS 1.1 and cannot connect to your Salesforce organization if this Salesforce critical update is enabled in the organization.

To resolve this issue, perform one of the following actions:

  • Use version 6.1 of the Pentaho Data Integration Tool to import data from various data sources to BMC Remedyforce.
  • Deactivate the Require TLS 1.1 or higher for HTTPS connections Salesforce critical update in your Salesforce organization.
    For information about when the critical update will be automatically activated, see Salesforce Knowledge Article Number 000232871.

Error or issue: Error running the Pentaho packages. In some cases, you might also not be able to open the Enter Mapping window by clicking Edit Mapping in the Salesforce Upsert window.
Applies to: All

In the Pentaho packages provided by BMC Remedyforce, ensure that the API version in the Salesforce Webservice URL field in steps such as Salesforce Upsert is supported.

  • In the LDAP Pentaho packages, the default API version in the Salesforce Webservice URL field is 27 (https://login.salesforce.com/services/Soap/u/27).
  • In all other Pentaho packages, the default API version in the Salesforce Webservice URL field is 24 (https://login.salesforce.com/services/Soap/u/24).

Error or issue: In the Salesforce Upsert window, when you click Edit Mapping, the following error message is displayed: Certain referenced fields were not found!
Applies to: All

Perform the following steps to resolve this issue:

  1. In the Salesforce Upsert window, click Edit Mapping.
    The Certain referenced fields were not found! error message is displayed.
  2. To open the Enter Mapping window, click OK.
    The tool removes all referenced fields from the existing mappings. The Pentaho package provided by BMC Remedyforce includes only one referenced field, BMCServiceDesk__CMDB_Class__c. The CMDB_Class field is no longer available in the Mappings column.
  3. In the Source fields column, select CMDB_Class and click Add.
    When you move CMDB_Class from the Source field column to the Mappings column, the c in the field name is replaced with r. For example, BMCServiceDesk__CMDB_Class__c is replaced with BMCServiceDesk__CMDB_Class__r.
  4. (Optional) Update mappings for other fields based on your requirements.
    For information about updating the out-of-the-box mapping, see Field mapping in CMDB 2.0.
  5. To save your settings and close the Enter Mapping window, click OK.
  6. In the Salesforce Upsert window, in the Module field column in the Fields area, click BMCServiceDesk__CMDB_Class__r, and replace r with c.

  7. To save your changes and close the Salesforce Upsert window, click OK.

Related topics

Applying models while importing CIs and assets

Scheduling jobs to import data

Troubleshooting common issues when importing data

Starting the Dell KACE console

Known and corrected issues for Pentaho packages
