Troubleshooting issues with large tables in AR System


You might face issues while running queries on tables that contain a large amount of data.

To resolve issues with large tables in AR System, follow the guidelines described in this topic, see Resolutions for common issues, or create a BMC Support case.


Symptom

You notice slowness in AR System when you perform specific actions related to the tables containing a large amount of data.
For example, you might notice slowness when you query the following forms:

  • BMC.CORE:BMC_BaseElement
  • HPD:Help Desk
  • AR System Email Messages


Scope

  • One or more users across multiple servers in a server group experience the symptom.
  • Backend processes might be affected instead of user activities.

Diagnosing and reporting an issue

Perform the following steps:

Task 1: Identify the inefficient SQL statement

  1. Use the troubleshooting guide for AR Log Analyzer or database tools to identify tables that impact performance.
  2. Use the Server Statistics - Longest Running API and SQL feature to identify the SQL statements that run for a long time.
  3. In AR Log Analyzer, use the SQL Aggregates Group by Table report to identify tables that consume a lot of time.

    [Image: sql-call-aggregates.png]
    The default sort on Average Execution Time (AVG Time in the report) places the tables that generally take the longest to query at the top.

    In this example, T4065 has nine SELECT statements with a maximum execution time of 41.488 seconds and an average of 14.231 seconds. T4064 has 19 SELECT statements with a maximum of almost 20 seconds. In this log, these are the busiest tables.

    Click the links in blue text to see the exact SQL statement that took the longest to complete.
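If you prefer to start from the database side, a DBA can query the database's statement cache for long-running SQL. The following is a minimal Oracle-flavored sketch, assuming SELECT access on the v$sql view; the 10-row limit and the ordering by elapsed time are illustrative choices, and the FETCH FIRST syntax requires Oracle 12c or later:

```sql
-- Top SQL statements by total elapsed time (Oracle; elapsed_time is in microseconds).
SELECT sql_id,
       executions,
       ROUND(elapsed_time / 1e6, 3)                         AS total_elapsed_sec,
       ROUND(elapsed_time / NULLIF(executions, 0) / 1e6, 3) AS avg_elapsed_sec,
       SUBSTR(sql_text, 1, 80)                              AS sql_text_start
FROM   v$sql
ORDER  BY elapsed_time DESC
FETCH FIRST 10 ROWS ONLY;
```

Statements against AR System tables (names starting with T followed by the form ID) that appear near the top of this list are good candidates for the analysis in the remaining tasks.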

Task 2: Check for Row-Level-Security

If you notice Row-Level-Security access control occurring, see the troubleshooting guide for Row-Level-Security queries.

You can identify whether Row-Level-Security access control is occurring by the following patterns in the WHERE clause:

Combined LIKES

WHERE (((T4387.C1000000337 = 'user') OR (T4387.C1000000338 = 'user')) AND ((T4387.C2 = 'user') OR
(T4387.C60600 LIKE '%;''user'';%') OR (T4387.C60600 LIKE '%;0;%') OR (T4387.C60600 LIKE '%;20032;%') OR
(T4387.C60600 LIKE '%;20225;%') OR (T4387.C60600 LIKE '%;13006;%') OR (T4387.C60600 LIKE '%;13007;%') OR
(T4387.C60600 LIKE '%;13010;%') OR (T4387.C60600 LIKE '%;1000000463;%') OR (T4387.C60600 LIKE '%;20055;%') OR
(T4387.C60600 LIKE '%;20313;%') OR (T4387.C60600 LIKE '%;1000000000;%') OR (T4387.C60600 LIKE '%;20315;%') OR
(T4387.C60600 LIKE '%;20316;%') OR (T4387.C60600 LIKE '%;20061;%') OR ...

RLS_Split/regex

WHERE ((T1436.C2 = 'user') OR (T1436.C4 = 'user') OR EXISTS(SELECT 1 FROM(SELECT Item FROM RLS_split(T1436.C60901, ';')
UNION ALL SELECT Item FROM RLS_split(T1436.C112, ';') UNION ALL SELECT Item FROM RLS_split(T1436.C60989, ';')
UNION ALL SELECT Item FROM RLS_split(T1436.C60900, ';'))
privs JOIN RLS_split('1000000023;1000000024;1000000021;1000000027;804;0;1058;20078;20055;20077;20032;', ';')
u ON privs.Item = u.Item)))...
Or
...WHERE ((T3610.C2 = 'user') OR (T3610.C4 = 'user') OR (EXISTS (SELECT TO_CHAR(item) FROM
(SELECT regexp_substr(str, '[^;]+', 1, rownum) item FROM
(SELECT '''user'';0;-20225;-20213;-20000;440;803;804;1058;13006;13007;13010;20000;20003;20031;20032' str FROM dual)
CONNECT BY level <= length (regexp_replace (str, '[^;]+'))  + 1)
u_privs WHERE item IS NOT NULL INTERSECT ( SELECT TO_CHAR(regexp_substr (T3610.C112, '[^;]+', 1, rownum))
item FROM dual CONNECT BY level <= length (regexp_replace (T3610.C112, '[^;]+'))  + 1) ) OR...

Task 3: Check for a custom workflow

  • Analyze the longest SQL statement and check whether it comes from a custom workflow. You can modify a custom workflow to use more efficient SQL statements.
  • See the troubleshooting guide for SQL tuning issues to tune your SQL.

Task 4: Identify whether the tables in question are large

Use database tools to calculate the number of records and the physical size of the tables. Alternatively, use a third-party tool, such as APL DB Size, to get this information.

The results show that in this case, the TestForPerformance form that corresponds to T4064 contains 12.5 million records. Note that T4065 is a join form based on T4064; therefore, the same underlying table is the root cause of the performance issues.

[Image: apl-db-size.png]

To find the Form ID, right-click the form name and then select Show Form Details. Prefix the Form ID with a T to get the table name, for example, T4064 in this case.

[Image: test-for-performance.png]

In this example, T4064 is the underlying table for the TestForPerformance form.
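If you don't have a sizing tool at hand, a DBA can get the same numbers directly from the database. The following is an Oracle-flavored sketch; T4064 is the example table from this task, and the LIKE pattern is an illustrative way to pick up the table together with related segments:

```sql
-- Row count for the AR System table behind the form (T4064 in this example).
SELECT COUNT(*) AS record_count FROM T4064;

-- Physical size of matching segments, in MB (Oracle).
-- user_segments covers the current schema; a DBA might need dba_segments instead.
SELECT segment_name,
       segment_type,
       ROUND(bytes / 1024 / 1024) AS size_mb
FROM   user_segments
WHERE  segment_name LIKE 'T4064%'
ORDER  BY bytes DESC;
```

On MS SQL Server, the sp_spaceused stored procedure provides comparable row-count and size figures for a single table.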

Task 5: Determine whether the table size is the cause of poor performance

In many cases, the size of a table is not an issue and you might need to use another performance troubleshooting guide to resolve the issue. However, if you identify very large tables, you might want to prepare a plan to reduce the size of the tables in your production database.

Identifying whether size is the problem might require a database administrator (DBA) to examine the database server statistics and check whether loading the table's data pages from disk, or the table's large memory consumption, is causing the performance issue.

Alternatively, look at the long-running SQL statements and see whether they help you understand the nature of the problem. A SELECT statement that uses indexes efficiently can perform well even on a table with a huge amount of data; the problem is often that the result set is large. Therefore, focus on the SQL statements that return a substantial amount of data.
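One way to see whether a long-running statement is scanning the large table or using an index is to inspect its execution plan. The following is an Oracle sketch; the query, the T4064 table, and the C7 (Status) field are illustrative:

```sql
-- Capture and display the execution plan for a suspect statement (Oracle).
-- A "TABLE ACCESS FULL" line on a multi-million-row table is a common sign
-- that table size, rather than the statement itself, is the bottleneck.
EXPLAIN PLAN FOR
SELECT * FROM T4064 WHERE C7 = 0;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

On MS SQL Server, SET SHOWPLAN_TEXT ON, or the graphical execution plan in Management Studio, serves the same purpose.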

Task 6: Consider options for reducing the size of the table

There are several ways to deal with large tables, depending on the nature of the table. Consider the retention policies at your organization before following any of these options:

Important: Before making database changes, consult your DBA and make sure that a backup has been taken.

We recommend the following options:

  • Use the archiving feature in AR System to delete or move data from the form.
     Important: Some out-of-the-box (OOTB) forms already have archiving configured. Use AR System Archive Manager Console to enable and configure archiving on those forms.
  • Use the archive feature to delete old records.
  • Use the archive feature to copy old records to an Archive form and remove those records from the main form.
    This method moves a large amount of data into a form that is not the primary form for querying and updating data, which can improve performance while keeping the data available.
     Important: Archiving can take days or weeks to complete depending on the amount of data. You can use the Num-Archive-Threads parameter to improve performance when archiving a large amount of data.
  • Use a third-party tool to delete records.

Alternatively, you can use the following options:

  • Use an AR System client to query old records and delete them. This is usually not possible or practical on very large tables because you can access only a small percentage of records at a time.
  • Use a database client to delete records from the table based on time criteria, such as Closed Date or Create Date.
     Important: Make sure you have a valid database backup before making changes at the database. Update database statistics after the deletion is complete.
  • Consider the nature of the form associated with the table and make sure that by deleting records, you are not creating a broken parent-child relationship. If this is the case, consider archiving options. Also, remove the related history and attachment records from their respective tables (H, B, BC) whenever applicable to avoid leaving orphaned records.
     Important: If the table is extremely large, for example, having more than ten million records, deleting records using a time criterion might take an excessive amount of time and is not practical. The DBA might want to use a strategy where you copy the records that you want to keep to a temporary table, truncate the table, and then move the copied records back to the main table. Though more complex, this method might be much faster if you need to keep only a small percentage of records. Make sure to update database statistics.
  • In some cases, truncate the table when the data is only temporary or is no longer needed.
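As an illustration of the database-side options above, the following Oracle-flavored sketch deletes records by a time criterion and, for extremely large tables, uses the copy-truncate-restore strategy. The T4064 table and the cutoff date are illustrative; C3 is the Create Date core field in AR System, stored as a Unix epoch timestamp. Always take a backup and involve your DBA before running anything like this:

```sql
-- Option A: delete records older than a cutoff.
-- 1609459200 is an illustrative cutoff (2021-01-01 00:00:00 UTC).
DELETE FROM T4064
WHERE  C3 < 1609459200;

-- Option B: for extremely large tables, keep a small slice and truncate.
CREATE TABLE T4064_KEEP AS
  SELECT * FROM T4064 WHERE C3 >= 1609459200;

TRUNCATE TABLE T4064;

INSERT INTO T4064 SELECT * FROM T4064_KEEP;
DROP TABLE T4064_KEEP;

-- Refresh optimizer statistics after bulk changes (SQL*Plus shorthand).
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'T4064');
```

Remember, per the guidance above, to also remove the related history and attachment records (H, B, BC tables) where applicable so that no orphaned records remain.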

Task 7: Find a solution

See Resolutions for common issues.

Task 8: Create a BMC Support case

When you create a case with BMC Support, collect and send the AR Log Analyzer output, and provide the following information as part of your case:

  • One or more users who have experienced the problem
  • Approximate timestamp when the slowness of SQL was noticed
  • The SQL statement in question
  • Any recent changes to the application, database, or other components in your environment

Attach any material or reports provided by the DBA, such as Automatic Workload Repository (AWR), Automatic Database Diagnostic Monitor (ADDM), or Active Session History (ASH) reports, MS-SQL Profiler output, execution plans, and other observations.


Resolutions for common issues

After you determine a specific symptom or error message, use the following list to identify the solution:

  • Symptom: In the database, attachment fields use a separate table to store data even though you use Oracle-Clob-Storage-In-Row.
    Action: None. Oracle-Clob-Storage-In-Row doesn't affect attachment fields because they don't use CLOBs.
  • Symptom: In AR System, archiving is very slow.
    Action: Check for missing indexes.
  • Symptom: In the SQL logs, queries against large tables are slow when you use Row-Level-Security.
    Action: Use the new Subquery Row Level Security (RLS) option.
  • Symptom: Long text fields cause the database to grow very large.
    Action: Store CLOBs in-row to reduce the size occupied in the database.
 
