This documentation supports versions 9.1 through 9.1 Service Pack 3 of BMC Atrium Core and their patches. The documentation for version 9.1.04 and its patches is available here.


Dynamic dataset and qualification values in event-driven reconciliation jobs

When you execute a reconciliation job from either a BMC Atrium API program or a Run Process workflow action, you can dynamically specify datasets, class qualifications, or both for that run of the job.

Substituting datasets

Substituting datasets works for any reconciliation activity type and for any dataset specified in the activity. You specify pairs of dataset IDs: one is the defined dataset saved in the activities of the job, and the other is the working dataset to use in its place during this run. You can specify as many dataset pairs as you want for a job run.

For example, you have a job that includes an Identify activity identifying Dataset 1 against Dataset 2 and Dataset 3, and a Merge activity that merges Dataset 1 and Dataset 2 into Dataset 3. On certain occasions, you want to use the Identification rules and Precedence sets defined in these activities to identify and merge source datasets 4 and 5 into the same target, or you want to merge the original sources into a different target. The following figure illustrates these scenarios.
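The substitution described above can be modeled as a simple remapping applied to every activity in the job for one run. The following Python sketch is illustrative only; the job structure and function names are hypothetical and are not part of the BMC Atrium CMDB API.

```python
# Hypothetical model of a reconciliation job: each activity lists the
# defined datasets it operates on. (Not the real BMC data structures.)
JOB = [
    {"type": "Identify", "datasets": ["Dataset 1", "Dataset 2", "Dataset 3"]},
    {"type": "Merge",    "datasets": ["Dataset 1", "Dataset 2", "Dataset 3"]},
]

def substitute_datasets(job, pairs):
    """Return a copy of the job in which each defined dataset is replaced
    by its working dataset for this run. `pairs` maps defined dataset IDs
    to working dataset IDs; datasets not listed in `pairs` are unchanged."""
    return [
        {**activity,
         "datasets": [pairs.get(d, d) for d in activity["datasets"]]}
        for activity in job
    ]

# Reuse the job's Identification rules and Precedence sets, but identify
# and merge source datasets 4 and 5 into the same target:
run = substitute_datasets(JOB, {"Dataset 1": "Dataset 4",
                                "Dataset 2": "Dataset 5"})
```

Note that the defined job itself is untouched; the substitution applies only to the single run, which matches the behavior described above.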


Tip: Consider using this feature when working with overlay datasets. For example, you can use it to test the reconciliation of several different test states, merging from a different overlay source dataset into a different overlay target dataset for each job run.

Dynamic dataset substitution


Note: If you use dynamic dataset substitution on a job containing a Merge activity, the dataset ID stored in the AttributeDataSourceList attribute is that of the defined dataset, not the working dataset. For more information about AttributeDataSourceList, see Merging datasets into a reconciled target dataset.


Warning: Any dataset pair you supply when executing a job must be valid for every activity in the job. If you supply a pair whose defined dataset is not used in one or more activities, the entire job run fails.
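The validity rule above can be sketched as a check that runs before any substitution is applied. This is an illustrative model only; the job structure and the `validate_pairs` function are hypothetical, not part of the BMC Atrium CMDB API.

```python
# Hypothetical job model: each activity lists its defined datasets.
JOB = [
    {"type": "Identify", "datasets": ["Dataset 1", "Dataset 2", "Dataset 3"]},
    {"type": "Merge",    "datasets": ["Dataset 1", "Dataset 2", "Dataset 3"]},
]

def validate_pairs(job, pairs):
    """Reject the whole run if any defined dataset named in `pairs` is
    missing from one or more activities, mirroring the rule above that
    every supplied pair must be valid for every activity in the job."""
    for defined in pairs:
        for activity in job:
            if defined not in activity["datasets"]:
                raise ValueError(
                    f"defined dataset {defined!r} is not used in the "
                    f"{activity['type']} activity; the entire job run fails")

# Dataset 1 appears in every activity, so this pair is accepted:
validate_pairs(JOB, {"Dataset 1": "Dataset 4"})
```

A pair naming a dataset absent from even one activity (for example, `{"Dataset 9": "Dataset 4"}` against this job) would raise, which is why splitting a many-dataset job into smaller jobs, as suggested below, makes substitution easier.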

If you have jobs that contain several different datasets, consider breaking them up into multiple jobs to avoid the requirement that a defined dataset must exist in every activity. When you need to use dynamic dataset substitution, you can then call the jobs separately and pass appropriate dataset pairs. When you do not need this flexibility, schedule an umbrella job that calls each piece with an Execute activity.

For instructions for using this feature with workflow, see Executing workflow against instances after a reconciliation Compare activity. For instructions for using this feature with an API program, see the Reconciliation Engine functions in Types of BMC Atrium CMDB C API functions.

Substituting qualifications

When you substitute a qualification, it replaces all Qualification Sets used in the job, which lets you run the job against a different subset of data each time. You specify each substitute qualification for a particular class, and you can specify as many as you want for a job run.


Tip: Consider using this feature when you've created or modified a small number of instances in a provider dataset. After creating or modifying the data, you can run your usual reconciliation job that identifies and merges the dataset, but substitute qualifications that restrict the job to only the data you just worked with.

For example, you have a job that identifies and merges all active CIs in two datasets, then copies some of that data to a third dataset. You've just discovered several new computer systems and printers, or perhaps just computer systems, and want to reconcile them the same way. The following figure illustrates qualification substitution for both scenarios.
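Per-class qualification substitution can be modeled as a filter over the instances a run processes. The sketch below is illustrative only: the instance records, the `select_for_run` helper, and the assumption that classes without a substitute qualification are processed in full are all hypothetical, not taken from the BMC Atrium CMDB API.

```python
# Hypothetical provider-dataset contents. LastModified is a stand-in
# for whatever attribute your substitute qualification tests.
INSTANCES = [
    {"class": "BMC_ComputerSystem", "name": "cs1", "LastModified": 20240101},
    {"class": "BMC_ComputerSystem", "name": "cs2", "LastModified": 20240301},
    {"class": "BMC_Printer",        "name": "pr1", "LastModified": 20240301},
]

def select_for_run(instances, class_quals):
    """Keep an instance if its class has no substitute qualification, or
    if it satisfies the qualification supplied for its class. Each entry
    in `class_quals` maps a class name to a predicate (the model's
    stand-in for a qualification string)."""
    selected = []
    for inst in instances:
        qual = class_quals.get(inst["class"])
        if qual is None or qual(inst):
            selected.append(inst)
    return selected

# Restrict only BMC_ComputerSystem to recently modified instances;
# printers are processed without restriction in this model.
recent = lambda inst: inst["LastModified"] >= 20240301
run_set = select_for_run(INSTANCES, {"BMC_ComputerSystem": recent})
```

In the "computer systems only" scenario above, you would simply supply a qualification for BMC_ComputerSystem alone; in the "computer systems and printers" scenario, one per class.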

Dynamic qualification substitution
