Troubleshooting common issues with importing data

The following troubleshooting tips describe how to resolve common issues that you might face when importing data. Each entry identifies the error or issue, the Pentaho packages to which it applies, and a description or procedure for resolving it.

Error or issue: Viewing logs of the import jobs
Applies to: All Pentaho packages
Log files are created in the folder where you saved the KJB files. Success and failure rows files are also created in the same folder. If a failure occurs, the failure rows file provides the error code and its description. You can also use the failure rows file to import the failed data to Salesforce.
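If you run the import job from the command line instead of from Spoon, the Kitchen tool in the same data-integration folder can also write the job log to a file that you specify. The following command is only a sketch; the paths and the import_job.kjb name are placeholders, and option syntax can vary across PDI versions:

    kitchen.bat /file:C:\data-integration\import_job.kjb /logfile:C:\data-integration\import_job.log /level:Basic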
Error or issue: Not all records are imported
Applies to: All Pentaho packages
Delete the delta time stamp file of the job that you ran, and then run the job again.

Error or issue: An "out of memory" error occurs
Applies to: All Pentaho packages

While importing a large number of records, if you receive an OutOfMemoryError or a Java heap space error, increase the Java heap size in the Spoon.bat file.

  1. Go to the pdi-ce-4.x.0-stable\data-integration or pdi-ce-5.0.1.A-stable\data-integration folder.
  2. Right-click the Spoon.bat file, and select Edit.
  3. Locate the following line and replace 512 with a higher value, as shown in the example after these steps:
    If "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xmx512m" "-XX:MaxPermSize=256m"
  4. Click Save.
  5. Relaunch the Spoon.bat file and rerun the job file.
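For example, to allocate a 2 GB heap, the edited line might look like the following (2048 is only an example value; choose a value that fits the memory available on your computer):

    If "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m" "-XX:MaxPermSize=256m"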
A "too many script statements" error occursAllWhen you run an import, if you receive "Too many script statements" as an onscreen Apex error message or in an email, reduce the batch size by 10 in the Batch Size field in the Settings section on the Salesforce Upsert window.
Error or issue: Importing data by using the failure rows file
Applies to: Pentaho packages for the following data sources:
  • BMC BladeLogic Client Automation
  • Dell KACE
  • Microsoft System Center Configuration Manager

A failure rows file is a text file that is saved in the folder where you saved your KJB files. Resolve the errors that are reported for the failed rows in the failure rows file, and then import the data by using the following steps.

The Pentaho packages for these data sources provide a sample job file in the FailureRowsInput folder. You can use this sample job file as a base to create transformation files for importing data from the failure rows file to your Salesforce organization.

The following steps assume that you are using the sample KTR file provided by BMC.

To import data from the failure rows file

  1. Double-click Failure file input for <CI type name>.
  2. In the Selected files field, enter the location and file name of the failure rows file, and then click OK.
  3. Copy the Upsert into class <CI type name> step from the appropriate KTR file and paste it into the transformation file that you have created to import data from the failure rows file.
    For example, if you are importing data for the operating system CI type, copy the Upsert into class operating system step from the KTR file for operating system.
  4. Click Run.
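If you prefer the command line to the Spoon interface, the Pan tool in the same data-integration folder can run a saved transformation file. The following command is only a sketch; the path and the failure_rows_import.ktr name are placeholders for your own transformation file:

    pan.bat /file:C:\data-integration\failure_rows_import.ktr /level:Basic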
Error or issue: Importing users by using the failure rows file for LDAP
Applies to: Pentaho package for LDAP server

A failure rows file is a text file that is saved in the folder where you saved your KJB files. Resolve the errors that are reported for the failed rows in the failure rows file, and then import the users by using the following steps.

The Pentaho package for LDAP provides a transformation file in the FailureRowsInput folder. You can use this transformation file to import users from the failure rows file to your Salesforce organization.

To import data from the failure rows file

  1. Double-click and open the FailureRowsInput folder.
  2. Double-click and open the LDAP_FAILURE_ROWS_<YYYYMMDDHHMMSS> file.
    Where YYYYMMDDHHMMSS is the time stamp of the file.
    The first line in the file is the header row; details of the failed imports are displayed from the second line onward.
  3. For each failed import record:
    1. Read the information under the headings Error Description, Error Fields, and Error Codes.
    2. Resolve the error.
  4. Open the FailureLDAP transformation file by using the Spoon batch file (Spoon.bat) of the Pentaho Data Integration tool.
  5. Double-click the Salesforce Upsert step, and enter your Salesforce organization username and password in the Connection section.
  6. (Optional) To verify the connection, click Test connection.
  7. Click OK.
  8. To save the FailureLDAP transformation file, click the Save icon.
  9. Click the Run icon.
  10. In the Execute a transformation window, click Launch.
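You can also run the FailureLDAP transformation from the command line with the Pan tool, as in the earlier sketch, by pointing the /file option at the FailureLDAP transformation file.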
Error or issue: Failed in writeToSalesForce: java.lang.IllegalArgumentException: The char '0x0' after 'Print' is not a valid XML character
Applies to: All Pentaho packages
Unicode characters that the XML parser cannot parse are present in one or more of the mapped fields. Either delete these characters from the data or delete the mapping of those fields.

Error or issue: Unable to query Salesforce
Applies to: All Pentaho packages
Check your Salesforce organization credentials in the Salesforce Upsert and Salesforce Input (CMDB_Classes) steps.

Error or issue: The job does not appear in Atrium Integrator
Applies to: Pentaho package for BMC Atrium CMDB
Ensure that you have saved your KTR and KJB files in a folder in Atrium Integrator.

Error or issue: Error setting value #2 [ARDelta_1 Integer] on prepared statement (Integer)
Applies to: Pentaho package for BMC Atrium CMDB
This error is generated if multiple records are created in the BMC Remedy AR System NGIE:Delta form for the running transformation. Delete the additional ARDelta records that were created for the failing transformation file. To find the duplicate records for a transformation file, open the NGIE:Delta form in BMC Remedy AR System, enter the transformation file name in the TransName field, and click Search.

Error or issue: Did not find Remedy Application Service password for server <server name> in UDM:RAppPassword Form on server <server name>
Applies to: Pentaho package for BMC Atrium CMDB
Check your Atrium Server Connection credentials in the ARInput step.

Error or issue: 'Oracle Database Server 10g Release 2' is not valid for the type xsd:<data type>
Applies to: Pentaho package for BMC Atrium CMDB
Check the data type of the mapped fields in the Salesforce Upsert step.