Importing data from Dell KACE
Note
The Pentaho package to import data from Dell KACE is deprecated and will no longer be available after the BMC Helix Remedyforce Winter 20 release. However, the DELL KACE CMDB 2.0 package is supported. For more information about the package, see Remedyforce Pentaho Files for Dell KACE Integration.
Dell KACE is a provider of systems management appliances. You can use the appliance-based approach to manage desktops, laptops, and servers. For more information, see https://software.dell.com/kace/. For information about enabling staff members to work with the Dell KACE Console, see Integrating BMC Helix Remedyforce with Dell KACE.
You can import data from Dell KACE into BMC Remedyforce CMDB 2.0 by using the Pentaho Data Integration tool. The Pentaho package for importing data from Dell KACE is available on the BMC Communities website. For information about how configuration items (CIs) are imported and relationships are created in CMDB 2.0, see Overview of how data is imported into BMC Remedyforce.
The following sections provide information about importing data from Dell KACE into BMC Remedyforce CMDB.
Note
For information about the Dell KACE versions that BMC Remedyforce supports, see Supported browsers, mobile devices, and integrations.
Before you begin
Before you can import data from Dell KACE into BMC Remedyforce CMDB, you must perform the following tasks:
- Install Java Development Kit (JDK).
- Download the Pentaho Data Integration tool.
If the Require TLS 1.1 or higher for HTTPS connections Salesforce critical update is enabled in your organization, you must use version 6.1 or later of the Pentaho Data Integration tool.
- Register at the BMC Communities website.
- Add the MySQL connector file to the \data-integration\lib folder (see the command sketch after this list).
- (Optional) Configure BMC Remedyforce to retain the original Instance Name value of CIs imported from an external source.
- If you have enabled the setting to access your Salesforce organization from limited IP addresses, ensure that you have the security token to log on to your Salesforce organization.
For more information about the security token, see Salesforce Help.
- Ensure that the Enable database access option under Settings > Control Panel tab > Security Settings > General Security Settings section in the Dell KACE Management Centre (Dell KACE console) is selected. (A connection check sketch follows this list.)
Note
The database name is present in Settings > Control Panel tab > General Settings > Update Reporting User Password section in the Dell KACE Management Centre (Dell KACE console).
- Download the Pentaho package from the BMC Communities website (https://community.bmc.com/s/news/aA33n000000TPB7CAO/integrating-bmc-helix-remedyforce-with-dell-kace).
For information about the KJB (job) and KTR (transformation) files that are included in the Pentaho package, see KJB and KTR files for importing data from Dell KACE.
Best practice
We recommend that you do not change the folder structure because doing so can interfere with the running of the batch files provided in the CombinedJobs folder.
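Two of these prerequisites lend themselves to quick command-line checks. The following sketch shows how you might copy the MySQL connector file into the \data-integration\lib folder and then verify that the Dell KACE database is reachable. All file names, paths, and host names are examples only; substitute the connector version that you downloaded, your Pentaho installation path, your appliance host, and the reporting user and database name shown in the Dell KACE console locations listed above.

```
REM Example only - connector file name, version, and paths are assumptions; adjust them for your environment.
copy mysql-connector-java-5.1.46.jar C:\pentaho\data-integration\lib\

REM Optional check that the Dell KACE reporting database is reachable (requires the MySQL command-line client;
REM the default MySQL port 3306 is assumed). You are prompted for the reporting user password.
mysql -h kace-appliance.example.com -P 3306 -u <reporting user> -p <database name>
```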
To import data from Dell KACE
- To launch the Pentaho Data Integration tool, perform the following actions:
Navigate to the location where you downloaded and unzipped the Pentaho Data Integration tool.
Navigate to the data-integration folder and double-click the Spoon.bat file.
- In Pentaho Spoon, select File > Open, navigate to the folder where you downloaded the Pentaho package, and open the appropriate KJB file.
- To import data into CMDB classes, open the file TransferDELLKACEinfotoCMDB.kjb or TransferDELLKACEComputerSystemAndSoftwareServerinfotoCMDB.kjb in the CombinedJobs folder.
- To set the values of the required parameters, perform the following actions:
  - Right-click an empty area of the job canvas and select Properties.
  - In the Combined Job Properties window, click the Parameters tab.
  - Enter the values for the required parameters.
Note
After you enter the credentials for the combined job (top-most parent), the credentials are inherited by all the subsequent jobs or transformations.
Important
If you have enabled the setting to access your Salesforce organization from limited IP addresses, you must append the security token to your Salesforce organization password.
For example, if the password for your Salesforce organization is mypassword and your security token is XXXXXXXXX, specify mypasswordXXXXXXXXX in the Password fields.
(Optional) Update the predefined mapping between Dell KACE fields and BMC Remedyforce CMDB fields.
- To save the KTR and KJB files, click Save.
- In the KJB file tab, click Run this job.
Perform one of the following actions based on the Pentaho version that you are using:
Pentaho version | Action |
---|---|
6.1 | In the Run Options window, click Run. |
5.4 | In the Execute a job window, click Launch. |
Data is imported into the appropriate CMDB class.
The status of each transformation is indicated by an icon that shows whether the transformation is complete, running, or unsuccessful.
- Verify whether the entries are correct in the Relationships Editor of the appropriate CMDB class.
(Optional) To view logs, in the Execution results section, click the Logging tab.
All errors are displayed in red.
For the imported records, the Source field is set to KACE. You can use the Source field to generate reports and also when you enable and configure reconciliation.
Warning
If the same user is associated with two different computers, the Person package imports two records into CMDB for that user.
You can launch the Dell KACE console to view the details of the records that have been imported from Dell KACE.
To schedule jobs to import data
On the BMC Communities website, the Pentaho packages for importing data from external data sources into BMC Remedyforce contain a SchedulingFiles folder. This folder contains a sample batch file that you can use as a base for creating your own batch files to schedule data import. For more information about scheduling a job, see http://wiki.Pentaho.com/display/EAI/Kitchen+User+Documentation.
Note
To understand the job scheduling process, refer to the following procedure. This procedure is an example to be used as a reference for all other packages.
To schedule the Remedyforce LDAP Pentaho package to import users, perform the following steps:
(For all other required packages, perform these steps by using the respective batch files.)
- Navigate to the Remedyforce Package folder.
- Rename your extracted package folder to SchedulingFiles.
- Open the folder where you have installed the Pentaho Data Integration tool and minimize this folder.
- Open the SchedulingFiles folder.
The folder contains two .bat files.
- To create a copy of the batch file, copy the SchedulingTransferLDAPInfo.bat file and paste it in the same folder.
- Right-click the SchedulingTransferLDAPInfo.bat file and click Edit.
The SchedulingTransferLDAPInfo.bat file opens in Notepad.
- Minimize Notepad and open the Pentaho data-integration folder as mentioned in Step 3.
- Copy the data-integration folder address and return to Notepad as mentioned in Step 6.
An example of the updated SchedulingTransferLDAPInfo.bat file is shown after this procedure.
- Before the first line of the file, type cd and paste the copied data-integration folder address. (Make sure to add a space between cd and the folder address.)
- Paste the data-integration folder address in the Kettle Home folder address and remove the double quotes ("").
- Paste the data-integration folder address in the kitchen.bat file address by replacing only the part before \kitchen.bat.
- Navigate to the Remedyforce Package folder and copy the address for TransferLDAPInfo.kjb.
- Paste the TransferLDAPInfo.kjb address between \kitchen.bat /file: and /level:Detailed. Remove the double quotes ("") and any additional spaces in the address.
- Save the file.
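The following sketch shows what the edited SchedulingTransferLDAPInfo.bat file might look like. It assumes that the Pentaho Data Integration tool is installed in C:\pentaho\data-integration and that the package is extracted to C:\SchedulingFiles; the variable names and layout in the shipped batch file may differ, so use this only as a reference while you edit your copy.

```
REM Example only - all paths are assumptions; replace them with the addresses that you copied in the steps above.
cd C:\pentaho\data-integration

REM Kettle Home folder address, pasted without double quotes (the exact variable name in the shipped file may differ).
set KETTLE_HOME=C:\pentaho\data-integration

REM Run the job with kitchen.bat; /file: points to the KJB file to run.
call C:\pentaho\data-integration\kitchen.bat /file:C:\SchedulingFiles\TransferLDAPInfo.kjb /level:Detailed
```

After the batch file runs correctly when you double-click it, you can schedule it with any scheduler that is available on the computer. For example, the following Windows Task Scheduler command (one possible approach, not part of the package) registers the batch file to run daily at 2:00 AM:

```
schtasks /create /tn "Remedyforce LDAP import" /tr "C:\SchedulingFiles\SchedulingTransferLDAPInfo.bat" /sc daily /st 02:00
```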
User scenarios for importing data from Dell KACE
David is a member of the Infrastructure team for Downtown Bank, responsible for managing the laptops for the ABC project. He signs up for BMC Remedyforce and expects to use the pre-configured, ITIL-based incident and problem management processes for managing CIs. Also, David wants to use the CMDB to represent the physical, logical, and conceptual items and to track the relationships between the different CIs.
Data to be imported | Steps to follow |
---|---|
Operating system details of the laptops | |
Processor details of the laptops | |
Software server details of the laptops | |
KJB and KTR files for importing data from Dell KACE
The Pentaho packages contain job (KJB) and transformation (KTR) files that are created in the Pentaho Data Integration tool. The KJB files (jobs) and KTR files (transformations) store metadata in XML format. A KJB file contains a series of transformations that are run in a sequence. A KTR file contains a single transformation. The KJB (job) files retrieve data from external sources into BMC Remedyforce CMDB.
For information about the KJB and KTR files in the Pentaho packages for Dell KACE, see KJB files in the Pentaho package and KTR files in the Pentaho package.
KJB files in the Pentaho package
The Pentaho package includes different KJB (job) files for importing different types of CIs, such as computer systems and LAN endpoints, from Dell KACE. These job files are bundled with related files and provided to you in folders. The folder names correspond to CI types in BMC Remedyforce CMDB.
The following table provides information about the job files that you must run to import specific CI types to BMC Remedyforce CMDB:
Folder name (CI type) | Job file | Description |
---|---|---|
Computer System | TransferDELLKACEComputerSystemInfotoCMDB.kjb | Transfers information about the computer systems. |
LANEndPoints | TransferDELLKACELANEnd | Transfers information about the LAN endpoints. |
OS | TransferDELLKACEOS | Transfers information about the operating systems. |
Person | TransferDELLKACEPerson | Transfers information about the persons (users). |
Processor | TransferDELLKACEProcessor | Transfers information about the processors. |
Software Server | TransferDELLKACESoftwareServer | Transfers information about the software servers. Note: Depending on the organization size, running the software server job transfers large amounts of data to the Salesforce organization, which might lead to exceeding the organization storage limit. |
Combined Jobs | TransferDELLKACEinfotoCMDB.kjb, TransferDELLKACEComputerSystemAndSoftwareServerinfotoCMDB.kjb | Transfers information about multiple CI types in a single run. |
KTR files in the Pentaho package
All job (KJB) files for importing data from specific CI types contain a series of transformations that are run in a sequence. Each transformation maps to a KTR file that is available, along with the KJB file, in the folder for each CI type.
The following table provides information about the KTR files and the corresponding transformations that the KJB file for each CI type contains:
KTR file | Transformation or step in the KJB file | Description |
---|---|---|
None | Check If Delta Timestamp File Exists | Checks whether a time stamp file exists. The Pentaho package uses a time stamp file to determine which records were added or modified since the last time the job was run. This step is used for incremental import. If a time stamp file does not exist, the Create the initial time stamp file step is run; otherwise, the Update BE table with Dell KACE <CI type name> Information step is run. |
CreateInitialTimeStampedFile.ktr | Create the initial time stamp file | Creates the time stamp file to record the time of import. This step is run only if you are importing for the first time or deleted the existing time stamp file. |
TransferDELLKACE<CI Type>InfotoCMDB.ktr | Update BE table with Dell KACE <CI type name> Information | Transfers data from Dell KACE to the Base Element (BE) table of the corresponding CMDB class. |
Store<CI Type>InfoTimestamp.ktr | Store the current time stamp | If the importing of data is successful, the time stamp file is updated with the current time of import. |
For example, the ComputerSystem job file (TransferDELLKACEComputerSystemInfotoCMDB.kjb) contains the CreateInitialTimeStampedFile.ktr, TransferDELLKACEComputerSystemInfotoCMDB.ktr, and StoreComputerSystemInfoTimestamp.ktr files.
The following table provides information about the steps that are included in the TransferDELLKACE<CIType>infotoCMDB.ktr file (Update BE Table With DELL KACE <CI type name> information transformation). The Pentaho package runs these steps to transfer data from Dell KACE into BMC Remedyforce CMDB. You can view these steps only when you open the KTR file in the Pentaho Data Integration tool.
Step | Description |
---|---|
Delta timestamp | Reads the saved time stamp. |
Dell KACE input | Runs an SQL query that fetches information from Dell KACE based on the stored time stamp. |
Transform Dell KACE variables | Facilitates JavaScript transformations, such as appending or trimming fields. |
Update BE table with Dell KACE <CI type name> information | Accepts the destination of the data and the credentials for the Salesforce organization where you want to save the imported data. |
Salesforce Input[CMDB_Class] | Accepts the access details of the Salesforce organization to fetch the BMC Remedyforce CMDB class details. |
Success rows | Stores the rows that are imported successfully. |
Failure rows | Stores the rows that are not imported, along with the error code, error description, and error fields. |
Troubleshooting
The following table describes the troubleshooting tips that you can use to resolve common issues that you might face when importing data.
Error or issue | Applies to the Pentaho package for | Description or procedure |
---|---|---|
Viewing logs of the import jobs | All | Log files are created in the folder where you have saved the KJB files. Success and failure rows files are also created in the same folder. If a failure occurs, the error code and its description are provided in the failure row file. You can also use the failure row file to import data to Salesforce. |
Not all records are imported | All | Delete the delta time stamp file of the job that you ran, and run the job again. |
Import fails | All | If you have upgraded to BMC Remedyforce Winter 17 (20.17.01), either use the latest Pentaho packages updated on the BMC Communities website or map a value to the Source field of the Base Element object. Also, ensure that you have Edit permission on the Source field of the Base Element object. |
An "out of memory" error occurs | All | While importing a large number of records, if you get the
|
A "too many script statements" error occurs | All | When you run an import, if you receive "Too many script statements" as an onscreen Apex error message or in an email, reduce the batch size by 10 in the Batch Size field in the Settings section on the Salesforce Upsert window. |
Records are not being imported to your CMDB. | All | Raise a case with Salesforce to create a custom index on the Assembly ID field of the Base Element object. |
Importing users by using the failure rows file for LDAP | LDAP server | A failure rows file is a text file that is saved in the folder where you have saved your KJB files. Perform the necessary steps to resolve the errors reported in the failure rows file, and then import the data from the failure rows file by using the transformation file that the following Pentaho package provides in the FailureRowsInput folder: LDAP integration with BMC Remedyforce (assign permission sets and Remedyforce managed package license): https://community.bmc.com/s/news/aA33n000000TPBRCA4/importing-users-from-an-ldap-server |
Failed in writeToSalesForce: java.lang.IllegalArgumentException: The char '0x0' after 'Print' is not a valid XML character | All | Some unicode characters that cannot be parsed by the XML parser are present in any of the mapped fields. Either delete these characters in data or delete the mapping of such fields. |
Unable to query Salesforce | All | Check your Salesforce organization credentials in the Salesforce Upsert and Salesforce Input(CMDB_Classes) steps. |
The job does not appear in Atrium Integrator. | BMC Atrium CMDB | Ensure that you have saved your KTR and KJB files in a folder in Atrium Integrator. |
Error setting value #2 [ARDelta_1 Integer] on prepared statement (Integer) | BMC Atrium CMDB | This error is generated if multiple records are created in the BMC Remedy AR System NGIE:Delta form for the running transformation. Delete the additional ARDelta records that are created for the failing transformation file. To find the duplicate records for a transformation file, open the NGIE:Delta form in BMC Remedy AR System, enter the transformation file name in the TransName field, and click Search. |
Did not find Remedy Application Service password for server<server name> in UDM:RAppPassword Form on server <server name> | BMC Atrium CMDB | Check your Atrium Server Connection credentials in the ARInput step. |
'Oracle Database Server 10g Release 2' is not valid for the type xsd: <data type > | BMC Atrium CMDB | Check the data type of the mapped fields in the Salesforce Upsert step. |
Duplicate LAN Endpoint entries are created in BMC Remedyforce CMDB. | BMC Client Management | In versions earlier than BMC Remedyforce Summer 15 Patch 1, when you imported LAN Endpoint data from BMC Client Management, duplicate LAN endpoint entries might have been created in BMC Remedyforce CMDB. This duplicate data was created because of the following factors:
In BMC Remedyforce Summer 15 Patch 1 and later versions, instead of the Network Interface IP address, the MAC address is used as the unique source identifier for the LAN endpoints imported from BMC Client Management, and new duplicate LAN endpoint entries are not created in BMC Remedyforce CMDB. To resolve this issue, you must install the latest Pentaho package for BMC Client Management released with BMC Remedyforce Summer 15 Patch 1 or later. Also, you must manually delete any existing duplicate LAN endpoint entries in the Remedyforce CMDB. For more information, see Deleting configuration items. |
The HTTP Status 404 error is displayed when trying to connect to Salesforce. | All | Ensure that your Pentaho transformations connect to https://login.salesforce.com/services/Soap/u/<API version> instead of https://www.salesforce.com/services/Soap/u/<API version>. Starting from January 1, 2016, Salesforce retired www.salesforce.com as an API endpoint. For more information, see the announcement on BMC Communities website. To view a video demonstration of how to update your Pentaho transformations, see Salesforce API Endpoint Retirement. BMC Remedyforce has updated the API endpoint in the Pentaho packages that are currently available on the BMC Communities website. |
Error connecting to your Salesforce organization when using version 5.4 of the Pentaho Data Integration Tool. | All | Check whether the Require TLS 1.1 or higher for HTTPS connections Salesforce critical update is enabled in your organization. Pentaho Data Integration Tool 5.4 does not support TLS 1.1 and cannot connect to your Salesforce organization if this Salesforce critical update is enabled in the organization. To resolve this issue, perform one of the following actions: |
Error running the Pentaho packages. In some cases, you might also not be able to open the Enter Mapping window by clicking Edit Mapping in the Salesforce Upsert window. | All | In the Pentaho packages provided by BMC Remedyforce, ensure that the API version in the Salesforce Webservice URL field in steps, such as Salesforce Upsert, is supported. |
In the Salesforce Upsert window, when you click Edit Mapping, the following error message is displayed: | All | Perform the following steps to resolve this issue: |
Related topics
Applying models while importing CIs and assets
Scheduling jobs to import data
Troubleshooting common issues when importing data
Starting the Dell KACE console
Known and corrected issues for Pentaho packages