Processing of large volume column data

To process data from Db2 columns that contain large volumes of data, Log Master performs the following tasks:
  • Allocates temporary VSAM files to store the large volume column data during processing

  • Allocates a VSAM file for each large volume column (or each partition of a large volume column) that occurs in the set of selected log records

  • Allocates additional VSAM files only when the initial data set and all of its possible extents are full but more data remains to be written

  • Includes column data in the generated output as summarized in Large volume column data in BMC AMI Log Master output files

  • Deletes the VSAM files at the end of processing, unless one of the following conditions applies:

    • The log scan specifies an output logical log file

    • An error occurs during allocation of a large volume column VSAM file, and the online interface field, batch syntax keyword, or installation option that defines duplicate data set handling is set to YES

To process data from large volume columns, your job or job step must have enough available memory to perform all normal processing and to hold one row’s worth of column data for the largest XML or LOB column in your selected log records. As it processes the large volume VSAM files, Log Master performs additional disk I/O that can slow the product’s performance.

Before including data from large volume columns in your output, be aware of how much data your job or job step will encounter. Ensure that the job has enough disk space available to store all large volume data in your selected log records. To avoid allocating too much disk space at run time, adjust the maximum limit on the number of VSAM files for each large volume column, or partition of a large volume column, by changing the appropriate online interface field, batch syntax keyword, or installation option.
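The memory and disk requirements described above can be estimated before you run the job. The following sketch is a hypothetical back-of-envelope calculation, not part of Log Master itself; the row counts and average LOB/XML sizes are assumptions you would supply from your own knowledge of the workload, and VSAM control-interval and extent overhead is ignored:

```python
def estimate_requirements(rows_per_column, avg_lob_bytes, largest_lob_bytes):
    """Rough sizing for a log scan that includes large volume columns.

    rows_per_column   : dict, column name -> number of selected rows
    avg_lob_bytes     : dict, column name -> average LOB/XML value size
    largest_lob_bytes : size of the single largest LOB/XML value expected

    Returns (disk_bytes, extra_memory_bytes).
    """
    # Disk: all large volume data in the selected log records is written
    # to temporary VSAM files, so the need is roughly the sum over columns.
    disk = sum(rows_per_column[c] * avg_lob_bytes[c] for c in rows_per_column)

    # Memory: the job step must hold one row's worth of data for the
    # largest XML or LOB column, on top of normal processing memory.
    memory = largest_lob_bytes
    return disk, memory


# Example with made-up numbers for two hypothetical columns:
disk, mem = estimate_requirements(
    rows_per_column={"DOC_XML": 50_000, "PHOTO_BLOB": 10_000},
    avg_lob_bytes={"DOC_XML": 8_192, "PHOTO_BLOB": 512_000},
    largest_lob_bytes=4 * 1024 * 1024,
)
print(f"Approximate disk needed: {disk / 1024**3:.1f} GiB")
print(f"Extra memory for largest row: {mem / 1024**2:.1f} MiB")
```

An estimate like this also helps you choose a sensible value for the maximum limit on the number of VSAM files per large volume column, rather than discovering the limit at run time.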

