
Importing users with the Salesforce Platform license from an LDAP server


You can import users into your Salesforce organization by using the Pentaho Data Integration tool. The Pentaho package for importing users with the Salesforce Platform license from an LDAP server is available on the BMC Communities website. For information about how users are imported into your Salesforce organization, see Overview-of-how-users-are-imported-from-LDAP-servers.

The following topics provide information about importing users with the Salesforce Platform license from an LDAP server into your Salesforce organization:

Note

For information about the LDAP versions that BMC Remedyforce supports, see Supported-browsers-mobile-devices-and-integrations.

Before you begin

Before you can import users from an LDAP server, you must complete the prerequisite tasks, such as downloading and unzipping the Pentaho Data Integration tool and downloading the Pentaho packages from the BMC Communities website.

To import users with the Salesforce Platform license from an LDAP server

Important

If you have enabled the setting to access your Salesforce organization from limited IP addresses, you must append the security token to your Salesforce organization password.

For example, if the password for your Salesforce organization is mypassword and your security token is XXXXXXXXX, specify mypasswordXXXXXXXXX in the Password fields.

  1. To launch the Pentaho Data Integration tool, perform the following actions:
    1. Navigate to the location where you downloaded and unzipped the Pentaho Data Integration tool.
    2. Navigate to the data-integration folder and double-click the Spoon.bat file.
  2. In Pentaho Spoon, select File > Open, navigate to the folder where you downloaded the Pentaho packages, and open the TransferLDAPInfo.kjb file.
  3. On the TransferLDAPInfo tab, in the Transfer Lookup Excel Export step (transformation), provide your Salesforce organization access details.

    1. Right-click the Transfer LookUp Excel Export step, and select Open referenced object > Transformation.
    2. On the LookUp Excel Export tab, double-click one of the following steps:
      • Salesforce Input[For Profile]
      • Salesforce Input[From Account]
      • Salesforce Input[From UserRole]
      • Salesforce Input[For User Account Link]
    3. In the Salesforce Input window, enter your Salesforce organization user name and password.
    4. (Optional) To verify the connection, click Test connection and then click OK.
    5. To save your changes and close the Salesforce Input window, click OK.

    6. Repeat steps 3b through 3e for each of the other steps listed in step 3b.

    Note

    BMC recommends that you do not change the default API version in the Salesforce Webservice URL.

  4. Click the TransferLDAPInfo tab.
  5. In the Update client user import table with LDAP user information step (transformation), provide your LDAP server and Salesforce organization access details and, if required, customize the Pentaho package.


    a. Open the Update client user import table with LDAP user information transformation:
      Right-click the Update client user import table with LDAP user information step, and select Open referenced object > Transformation.
    b. Provide the LDAP server access details. On the TransferLDAPInfo tab, perform the following actions:
      1. Double-click the LDAP input step.
      2. In the LDAP Input window, on the General tab, enter the host, user name, and password of the LDAP server from which you are importing users.
      3. (Optional) To verify the connection, click Test connection and then click OK.
      4. (Optional) To fetch more fields from the LDAP server, on the Fields tab, click Get fields.
      5. To save your changes and close the LDAP Input window, click OK.
    c. (Optional) Customize the assignment of account and profile information to imported records based on specific LDAP attributes:
      1. Double-click the Dynamic account and profile assignment step.
      2. Perform any of the following actions:
        • Modify or assign the default values for account, profile, role, and custom values for specific conditions in the script.
        • Modify the default values for the Locale, Language, TimeZone, and EmailEncoding fields.
      3. Click OK.
    d. Provide the Salesforce organization access details:
      1. Double-click one of the following steps:
        • Salesforce Upsert [For User]
        • Salesforce Insert [For User Account Link]
        • Salesforce Update
      2. In the window that opens, enter your Salesforce organization user name and password.
      3. (Optional) To verify the connection, click Test connection and then click OK.
      4. To save your changes and close the window, click OK.
      5. Repeat substeps 1 through 4 for the other steps listed in substep 1.

    Note: BMC recommends that you do not change the default API version in the Salesforce Webservice URL.

    e. (Optional) Update the predefined mapping between the LDAP fields and the Salesforce User object:
      1. Double-click the Salesforce Upsert [For User] step.
      2. In the Salesforce Upsert window, click Edit Mapping.
      3. In the Enter Mapping window, update the mappings for other fields based on your requirements.
        For more information about mapping, see Overview-of-how-users-are-imported-from-LDAP-servers.
      4. To save your settings and close the Enter Mapping window, click OK.
      5. To save your changes and close the Salesforce Upsert window, click OK.
  6. To save the KTR and KJB files, click Save.
  7. In the TransferLDAPInfo.kjb file, click Run this job.
  8. Perform one of the following actions based on the Pentaho version that you are using:

    • Pentaho 6.1: In the Run Options window, click Run.
    • Pentaho 5.4: In the Execute a job window, click Launch.

    A status icon next to each transformation indicates whether it is complete, still running, or unsuccessful.
  9. (Optional) To view logs, in the Execution results section, click the Logging tab.
    All errors are displayed in red.

To schedule jobs to import data

On the BMC Communities website, the pages for the Pentaho packages that import data from external data sources into Remedyforce contain a SchedulingFiles folder. This folder contains a sample batch file that you can use as a base for creating your own batch files for scheduling data import. For more information about scheduling a job, see http://wiki.Pentaho.com/display/EAI/Kitchen+User+Documentation.

Note

To understand the job scheduling process, refer to the following procedure. It is an example that you can use as a reference for all other packages.

To schedule the Remedyforce LDAP Pentaho package to import users, perform the following steps:

(For all other required packages, perform these steps by using the respective batch files.)

  1. Navigate to the Remedyforce Package folder.
  2. Rename your extracted package folder to SchedulingFiles.
  3. Open the folder where you have installed the Pentaho Data Integration tool and minimize this folder.
  4. Open the SchedulingFiles folder. 
    The folder contains two .bat files.
  5. To create a copy of the batch file, copy the SchedulingTransferLDAPInfo.bat file and paste it into the same folder.
  6. Right-click the SchedulingTransferLDAPInfo.bat file and click Edit.
    The SchedulingTransferLDAPInfo.bat file opens in Notepad.
  7. Minimize Notepad and open the Pentaho data-integration folder as mentioned in step 3.
  8. Copy the data-integration folder address and switch back to Notepad as mentioned in step 6.

    The following figure shows an example of the updated SchedulingTransferLDAPInfo.bat file.

    [Figure: an example of the updated SchedulingTransferLDAPInfo.bat file]

  9. Before the first line of the file, type cd and paste the copied data-integration folder address. (Make sure to add a space between cd and the folder address.)
  10. Paste the data-integration folder address over the Kettle home folder address, and remove the double quotation marks ("").
  11. Paste the data-integration folder address into the kitchen.bat file address, replacing only the part before \kitchen.bat.
  12. Navigate to the Remedyforce Package folder and copy the address of TransferLDAPInfo.kjb.
  13. Paste the TransferLDAPInfo.kjb address between \kitchen.bat /file: and /level:Detailed. Remove the double quotation marks ("") and any additional spaces in the address.
  14. Save the file.
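After steps 9 through 13, the edited SchedulingTransferLDAPInfo.bat file should look roughly like the following sketch. The folder paths shown are hypothetical placeholders; substitute your own data-integration and package locations, and note that the actual sample file in the SchedulingFiles folder may contain additional lines:

```bat
rem Change to the Pentaho data-integration folder (step 9)
cd C:\Pentaho\data-integration
rem Kettle home folder address (step 10) - path is an example
set KETTLE_HOME=C:\Pentaho\data-integration
rem Run the job with Kitchen (steps 11-13) - package path is an example
call C:\Pentaho\data-integration\kitchen.bat /file:C:\RemedyforcePackage\TransferLDAPInfo.kjb /level:Detailed
```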
     
To prepare for scheduling
  1. Open the Command Prompt as an administrator.
  2. Type cd and then paste the SchedulingFiles folder address, with a space between the two.
  3. Enter the .bat file name (SchedulingTransferLDAPInfo.bat).
    The job starts running.
To schedule the job
  1. Go to Start > Control Panel > Administrative Tools > Task Scheduler.
  2. In the Actions section, click Create Task.
  3. On the General tab, enter the name and description.
  4. Select the Run with highest privileges check box, and click OK.
  5. On the Triggers tab, click New, select the appropriate option for starting the task, and click OK.
    For example, you can select Daily or When the computer starts as the option for starting the task.
  6. On the Actions tab, click New.
  7. Click Browse, select SchedulingTransferLDAPInfo.bat, and click OK.
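As an alternative to the Task Scheduler UI, the same scheduled task can be created from an elevated Command Prompt by using the schtasks utility. The task name and batch file path below are examples; replace them with your own:

```bat
rem Create a daily task that runs the scheduling batch file at 2:00 AM
rem with highest privileges (paths and task name are examples)
schtasks /Create /TN "RemedyforceLDAPImport" /TR "C:\RemedyforcePackage\SchedulingFiles\SchedulingTransferLDAPInfo.bat" /SC DAILY /ST 02:00 /RL HIGHEST
```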

KJB and KTR files for importing users from an LDAP server

In Pentaho, metadata is stored in XML format in the file system as KTR files (transformations) or KJB files (jobs). The Pentaho package includes the TransferLDAPInfo.kjb file (job file) for importing users from an LDAP server. The job (KJB) file contains a series of transformations that run in a sequence. Each transformation maps to a KTR file that is available, along with the KJB file, in the Pentaho package.

The following list describes the KTR files and the corresponding transformations that the KJB file contains:

• LookUpExcelExport.ktr (Transfer Lookup Excel Export): Exports profiles, accounts, user roles, and user-account links from the Salesforce organization into separate CSV files.

• None (Check if delta timestamp file exists): Checks whether a time stamp file exists. The Pentaho package uses a time stamp file to determine which records were added or modified since the last time the job was run. This step is used for incremental import: if a time stamp file does not exist, the Create the initial time stamp file step is run; otherwise, the Update client user import table with LDAP user information step is run.

• CreateInitialTimeStampedFileForLDAP.ktr (Create the initial time stamp file): Creates the time stamp file to record the time of import. This step is run only if you are importing for the first time or you have deleted the existing time stamp file.

• TransferLDAPInfo.ktr (Update client user import table with LDAP user information): Transfers data from the LDAP server to the Salesforce organization.

• StoreLDAPTimestamp.ktr (Store the current timestamp): Saves the time of import if the data import is successful.

The following list describes the steps that are included in the TransferLDAPInfo.ktr file (Update client user import table with LDAP user information transformation). The Pentaho package runs these steps to import users from the LDAP server into your Salesforce organization. You can view these steps only when you open the KTR file in the Pentaho Data Integration tool.

The KTR file also contains mapping of the fields in the LDAP server to the fields in the User table. For more information about mapping, see Overview-of-how-users-are-imported-from-LDAP-servers.

• Delta timestamp: Reads the saved time stamp.

• Create time based LDAP filter string: Creates a time-based LDAP filter string that is used to fetch the records added since you last ran the job successfully.

• LDAP input: Uses the LDAP filter string to fetch the defined attributes of the records from the LDAP server.

• Dynamic account and profile assignment: Enables assigning account and profile information to the imported records based on any of the LDAP attributes.

• Sort: Sorts the users that are imported from the LDAP server.

• Find Unique Rows: Checks that the imported users are unique so that duplicate records are not created in your Salesforce organization.

• Excel Output: Creates a CSV file that contains the unique imported users.

• Stream lookup [For Profile]: Retrieves profiles that were exported from the Salesforce organization to a CSV file in the Transfer LookUp Excel Export transformation (LookUpExcelExport.ktr file).

• Stream lookup [For UserRole]: Retrieves user roles that were exported from the Salesforce organization to a CSV file in the Transfer LookUp Excel Export transformation (LookUpExcelExport.ktr file).

• Stream lookup [For Account]: Retrieves accounts that were exported from the Salesforce organization to a CSV file in the Transfer LookUp Excel Export transformation (LookUpExcelExport.ktr file).

• Salesforce Upsert [For User]: Transfers data from the LDAP server to the Salesforce organization.

• Filter rows [Account Id is present]: Retrieves only those imported user records that have an associated account.

• Stream lookup: Retrieves the user-account links that were exported from the Salesforce organization to a CSV file in the Transfer LookUp Excel Export transformation (LookUpExcelExport.ktr file).

• Filter rows [For User Account Link]: Retrieves only those imported user records that have an existing user-account link in the Salesforce organization.

• Salesforce Insert [For User Account Link]: Creates a user-account link for imported users who do not have an existing user-account link in the Salesforce organization.

• Junction Insert Success Rows: Stores the rows that are imported successfully.

• Junction Insert Failure Rows: Stores the rows that are not imported, along with the error code, error description, and error fields.

• Salesforce Update: Updates the user-account link for imported users who have an existing user-account link in the Salesforce organization.

• Junction Update Success Rows: Stores the rows that are imported successfully.

• Junction Update Failure Rows: Stores the rows that are not imported, along with the error code, error description, and error fields.

Troubleshooting

The following table provides troubleshooting tips that you can use to resolve common issues that you might encounter when importing data.

 Error or issue

Applies to the Pentaho package for

Description or procedure

While importing LAN endpoint records from BMC Client Management, the import process throws an exception.

BMC Client Management

If the network interface information is available for LAN endpoint class records, they are imported from BMC Client Management. LAN endpoint class records without the network interface information are filtered out during the import process and are not inserted into Remedyforce. 

  • If you install Remedyforce 20.22.02 or upgrade to it, this feature is available by default. Administrators need not configure any settings. 
  • If you use Pentaho packages for importing data, you must download the latest version of the following packages from BMC Communities:
    • BMC FootPrints Asset Core CMDB 2.0 for BMC Client Management 12.0 (available for download on BMC Communities)
    • BMC FootPrints Asset Core WebServices CMDB 2.0 (available for download on BMC Communities)

For out-of-the-box Pentaho packages, you do not have to configure any settings.
For customized Pentaho packages, you must add the following filter:



      • Job: TransferAssetCoreLANEndPointinfotoCMDB.kjb
      • Transformation: TransferAssetCoreLANEndPointinfotoCMDB.ktr
      • Step: Filter rows
        [Figure: the filter to add in the Pentaho packages]

For details about the filter, download and refer to the Pentaho packages.

Viewing logs of the import jobs

All

Log files are created in the folder where you have saved the KJB files. Success and failure rows files are also created in the same folder.

If a failure occurs, the error code and its description are provided in the failure row file. You can also use the failure row file to import data to Salesforce.

Not all records are imported

All

Delete the delta time stamp file of the job that you ran, and run the job again.

Import fails

All

If you have upgraded to Remedyforce Winter 17 (20.17.01), either use the latest Pentaho packages available on BMC Communities or map a value to the Source field of the Base Element object. Also, ensure that you have Edit permission on the Source field of the Base Element object.

An "out of memory" error occurs

All

While importing a large number of records, if you get an OutOfMemoryError or Java heap space error message, increase the heap size in the Spoon.bat file:

  1. Navigate to the location where you downloaded and unzipped the Pentaho Data Integration tool.
  2. Navigate to the data-integration folder.
  3. Right-click the Spoon.bat file, and select Edit.
  4. Locate the following line and replace 512 with a higher value:
    if "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xmx512m" "-XX:MaxPermSize=256m"
  5. Click Save.
  6. Relaunch the Spoon.bat file and rerun the job file.
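For example, assuming your machine has enough free memory, raising the maximum heap to 2 GB would make the line read as follows (the 2048 value is only an illustration; choose a value that fits your system):

```bat
rem Edited Spoon.bat line with the heap raised from 512 MB to 2 GB
if "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m" "-XX:MaxPermSize=256m"
```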

A "too many script statements" error occurs

All

When you run an import, if you receive "Too many script statements" as an onscreen Apex error message or in an email, reduce the batch size by 10 in the Batch Size field in the Settings section on the Salesforce Upsert window.

Records are not being imported to your CMDB.

All

Raise a case with Salesforce to create a custom index on the Assembly ID field of the Base Element object.

Importing users by using the failure rows file for LDAP

LDAP server

A failure rows file is a text file that is saved in the folder where you have saved your KJB files. Resolve the errors listed for the failure rows in the file, and then import the data by using the following steps.

The following Pentaho package provides a transformation file in the FailureRowsInput folder. You can use this transformation file for importing users from the failure rows file to the Salesforce organization:

LDAP integration with Remedyforce (assign permission sets and Remedyforce managed package license): https://community.bmc.com/s/news/aA33n000000TPBRCA4/importing-users-from-an-ldap-server

To import data from the failure rows file

  1. Open the FailureRowsInput folder.
  2. Open the LDAP_FAILURE_ROWS_<YYYYMMDDHHMMSS> file, where YYYYMMDDHHMMSS is the time stamp of the file.
    The first line in the file is the header row; the details of the failed imports are displayed from the second line onward.
  3. For each failed import record:
    1. Read the information under the headings Error Description, Error Fields, and Error Codes.
    2. Resolve the error.
  4. Open the FailureLDAP transformation file with the Spoon batch file of the Pentaho Data Integration tool.
  5. Double-click the Salesforce Upsert step, and enter your Salesforce organization username and password in the Connection section.
  6. (Optional) To verify the connection, click Test connection.
  7. Click OK.
  8. To save the FailureLDAP transformation file, click Save.
  9. Click Run.
  10. In the Execute a transformation window, click Launch.

Failed in writeToSalesForce: java.lang.IllegalArgumentException: The char '0x0' after 'Print' is not a valid XML character

All

Some Unicode characters that the XML parser cannot parse are present in one of the mapped fields. Either delete these characters from the data or delete the mapping of the affected fields.

Unable to query Salesforce

All

Check your Salesforce organization credentials in the Salesforce Upsert and Salesforce Input(CMDB_Classes) steps.

The job does not appear in Atrium Integrator.

BMC Atrium CMDB

Ensure that you have saved your KTR and KJB files in a folder in Atrium Integrator.

Error setting value #2 [ARDelta_1 Integer] on prepared statement (Integer)

BMC Atrium CMDB

This error is generated when multiple records are created in the BMC Remedy AR System NGIE:Delta form for the running transformation. To resolve it, delete the additional ARDelta records created for the failing transformation file. To find the duplicate records for a transformation file, open the NGIE:Delta form in BMC Remedy AR System, enter the transformation file name in the TransName field, and click Search.

Did not find Remedy Application Service password for server<server name> in UDM:RAppPassword Form on server <server name>

BMC Atrium CMDB

Check your Atrium Server Connection credentials in the ARInput step.

'Oracle Database Server 10g Release 2' is not valid for the type xsd:<data type>

BMC Atrium CMDB

Check the data type of the mapped fields in the Salesforce Upsert step.

Duplicate LAN Endpoint entries are created in Remedyforce CMDB.

BMC Client Management

In Remedyforce Summer 15 Patch 1 and earlier versions, when you imported LAN Endpoint data from BMC Client Management, duplicate LAN endpoint entries might have been created in Remedyforce CMDB. This duplicate data was created because of the following factors:

  • In BMC Client Management, a LAN endpoint could periodically obtain multiple IP addresses using DHCP.
  • In Remedyforce Summer 15 and earlier, the Network Interface IP address was used as the unique source identifier for LAN endpoints imported from BMC Client Management.

In Remedyforce Summer 15 Patch 1 and later versions, instead of the Network Interface IP address, the MAC address is used as the unique source identifier for the LAN endpoints imported from BMC Client Management, and new duplicate LAN endpoint entries are not created in BMC Remedyforce CMDB.

To resolve this issue, you must install the latest Pentaho package for BMC Client Management released with Remedyforce Summer 15 Patch 1 or later. You must also manually delete any existing duplicate LAN endpoint entries from the Remedyforce CMDB. For more information, see Deleting-configuration-items.

The HTTP Status 404 error is displayed when trying to connect to Salesforce.

All

Ensure that your Pentaho transformations connect to https://login.salesforce.com/services/Soap/u/<API version> instead of https://www.salesforce.com/services/Soap/u/<API version>.

Starting from January 1, 2016, Salesforce retired www.salesforce.com as an API endpoint. For more information, see the announcement on the BMC Communities website. To view a video demonstration of how to update your Pentaho transformations, see Salesforce API Endpoint Retirement.

Remedyforce has updated the API endpoint in the Pentaho packages that are currently available on the BMC Communities website.

Error connecting to your Salesforce organization when using version 5.4 of the Pentaho Data Integration Tool.

All

Check whether the Require TLS 1.1 or higher for HTTPS connections Salesforce critical update is enabled in your organization.

Pentaho Data Integration Tool 5.4 does not support TLS 1.1 and cannot connect to your Salesforce organization if this Salesforce critical update is enabled in the organization.

To resolve this issue, perform one of the following actions:

  • Use version 6.1 of the Pentaho Data Integration Tool to import data from various data sources to Remedyforce.
  • Deactivate the Require TLS 1.1 or higher for HTTPS connections Salesforce critical update in your Salesforce organization.
    For information about when the critical update will be automatically activated, see Salesforce Knowledge Article Number 000232871.

Error running the Pentaho packages.

In some cases, you might also not be able to open the Enter Mapping window by clicking Edit Mapping in the Salesforce Upsert window.

All

In the Pentaho packages provided by Remedyforce, ensure that the API version in the Salesforce Webservice URL field in steps, such as Salesforce Upsert, is supported.

  • In all Pentaho packages, including the LDAP packages, the default API version in the Salesforce Webservice URL field is 51 (https://login.salesforce.com/services/Soap/u/51).

In the Salesforce Upsert window, when you click Edit Mapping, the following error message is displayed:

Certain referenced fields were not found!

All

Perform the following steps to resolve this issue:

  1. In the Salesforce Upsert window, click Edit Mapping.
    The Certain referenced fields were not found! error message is displayed.
  2. To open the Enter Mapping window, click OK.
    The tool removes all referenced fields from the existing mappings. The Pentaho package provided by Remedyforce includes only one referenced field, BMCServiceDesk__CMDB_Class__c. The CMDB_Class field is no longer available in the Mappings column.
  3. In the Source fields column, select CMDB_Class and click Add.
    When you move CMDB_Class from the Source field column to the Mappings column, the c in the field name is replaced with r. For example, BMCServiceDesk__CMDB_Class__c is replaced with BMCServiceDesk__CMDB_Class__r.
  4. (Optional) Update mappings for other fields based on your requirements.
    For information about updating the out-of-the-box mapping, see Field mapping in CMDB 2.0.
  5. To save your settings and close the Enter Mapping window, click OK.
  6. In the Salesforce Upsert window, in the Module field column in the Fields area, click BMCServiceDesk__CMDB_Class__r, and replace r with c.
  7. To save your changes and close the Salesforce Upsert window, click OK.

Related topics

Overview-of-how-users-are-imported-from-LDAP-servers

Scheduling-jobs-to-import-data

Troubleshooting-common-issues-when-importing-data

Known-and-corrected-issues-for-Pentaho-packages

 


Remedyforce 20.25.02