Best practices to convert nonstandard customizations to standard customizations


A nonstandard customization is a change made in the system without using a BMC-provided or recommended mechanism or API. Having nonstandard customizations in the system can cause functional or performance issues during upgrades or regular system use.

Use this topic to review the best practices to convert nonstandard customizations to standard customizations. This topic also includes best practices to improve the performance of workflows, escalations, filters, and applications.

For example, the following nonstandard customizations and integrations are not supported on BMC Helix environments:

  • Providing direct access to update database forms or tables
  • Processing files and scripts through workflows
  • Viewing forms or tables directly in the database
  • Running direct SQL updates through workflows

Also, the following customizations affect the performance of applications and workflows:

  • Escalations that update large volumes of data in a single transaction
  • Queries that do not have valid qualifications or do not use appropriate indexes


Best Practice

We recommend that you convert nonstandard customizations to standard customizations before you perform any upgrade to BMC Helix SaaS to avoid upgrade failure. Review these best practices for converting nonstandard customizations to standard customizations.

If you are migrating your on-premises systems to BMC Helix SaaS, we recommend that you convert the nonstandard customizations to standard customizations when you are performing the steps in Stage 3: Development activities. For more information, see the Migration process for Remedy on-premises to BMC Helix Innovation Suite Cloud.

Best practices for active links and menus

Active links are client-side workflows that communicate with the server to execute various business logic. However, several factors can impact performance and security, requiring careful review.

It is important to review the number of queries and updates (Push Fields and Set Fields actions) that active links perform, because multiple calls from the client to the server can cause performance issues. Each round trip goes over the internet, so any network latency has a cumulative effect that makes the client seem slow. A more efficient approach for SaaS is to move all queries to the server and use the Service construct in AR System to handle these queries and return results efficiently.

Additionally, direct SQL calls from active links and menus are not supported in SaaS because of security concerns; these calls can be intercepted and potentially perform actions that breach application security. If your application requires direct SQL functionality that other mechanisms cannot support, you must redesign the SQL in active links to use Service actions running on the server.

You must:

  1. Obtain the necessary approvals from BMC.
  2. Change the direct SQL statement to filter Set Fields or Push Fields actions.


Scenario

You have used direct SQL commands in a client-side workflow and face potential performance and security issues.

Recommendation:

After you get the necessary approvals from BMC, use the following steps to rewrite the workflows for performance and for handling the cases where SQL is approved to be used:

  1. Create a display-only form that includes all necessary operation fields and a keyword field to indicate the service name, allowing the form to accommodate multiple services.
  2. Create a filter guide for each service that requires multiple filters. Name the guide after the service, and include in it all relevant filters so that the service action triggers the correct guide based on its name.
  3. Create filters that execute the queries and any direct SQL actions within the guide.
  4. Configure a filter on the display-only form to execute the service operation, which will make a call to the filter guide.
  5. On the client side, create an active link that makes a service call to this display-only form using the service action.
  6. In the active link, pass in the required data for server-side operations and provide a return mapping for the data that should be set on the client side.

By executing this logic on the server, you significantly reduce browser-server communication, alleviating network latency issues. If you need to make SQL calls, make sure that your direct SQL statements are ANSI compatible, enabling them to run on all databases.
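As an illustration of ANSI compatibility, the following sketch (using Python's sqlite3 and a hypothetical table) rewrites a vendor-specific timestamp update into its portable ANSI form:

```python
import sqlite3

# Hypothetical direct SQL rewritten to be ANSI-compatible:
#   Oracle-only:      UPDATE T99 SET C6 = SYSDATE   WHERE C7 = 1
#   SQL Server-only:  UPDATE T99 SET C6 = GETDATE() WHERE C7 = 1
#   ANSI-compatible:  UPDATE T99 SET C6 = CURRENT_TIMESTAMP WHERE C7 = 1
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE T99 (C1 INTEGER PRIMARY KEY, C6 TEXT, C7 INTEGER)")
conn.execute("INSERT INTO T99 (C1, C6, C7) VALUES (1, NULL, 1)")

# The ANSI form runs unchanged here and on PostgreSQL, Oracle, and SQL Server.
conn.execute("UPDATE T99 SET C6 = CURRENT_TIMESTAMP WHERE C7 = 1")
modified = conn.execute("SELECT C6 FROM T99 WHERE C1 = 1").fetchone()[0]
print(modified is not None)  # True: the timestamp was set portably
```

The ANSI statement runs on all supported databases, whereas SYSDATE and GETDATE() tie the workflow to a single vendor.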

customization_scenario_1.png


  • Avoid using active links when making multiple calls to the server to retrieve data.
    You must perform queries by using filters on the server side. Make sure that the active link makes a service call to a form containing these filters and subsequently returns the results.

  • To improve active link performance, simplify the qualification for active links and combine active links that use the same qualification.
    This method is more efficient than designing two active links that are identical except for their Execute On selection.

    For example, you might want your users to click a button or press Return to open a selection menu list. Design both of these Execute On actions in the same active link. To improve filter performance, simplify the qualification and combine filters that use the same qualification.

Best practices for workflows

Best practices for direct SQL commands in a filter workflow

Do not use direct SQL commands in filter workflows. Instead, consider rewriting the direct SQL so that it does not run during an update.

Incorporating direct SQL commands into workflows poses a risk because these commands might try to modify records that a transaction is already updating.  To avoid this risk, use a workflow pattern that prevents the workflow from running during an update.

Scenario

You have written a direct SQL command to bypass the workflow.

Recommendation:

The following steps are an example of how to rewrite the direct SQL as a workflow to perform the same operation:

customization_scenario_2.png

  1. Use a filter Push Fields action to check whether the workflow needs to be bypassed when performing the update.
  2. Use the existing keyword field (z1D Action), or add a new display-only field on the target form, to hold a keyword that the filter pushes when updating the record.
  3. On the target form, create a filter that triggers at execution order 0 and checks whether the display-only field is set to the appropriate keyword.
  4. If the keyword is set, skip to the highest execution order plus 1.
    This pattern keeps the update inside the AR System server and the workflow transaction, so you can replace the direct SQL command with workflow.
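The keyword-bypass pattern in the steps above looks roughly like this (a minimal Python sketch, not AR System code; the field and keyword names are hypothetical):

```python
# Sketch of the bypass pattern: an update carries a z1D Action-style
# keyword, and a guard that runs first (execution order 0) skips the
# remaining workflow when the keyword is set.
SKIP_KEYWORD = "SKIP_WORKFLOW"  # hypothetical keyword value

def run_filters(record, update):
    """Apply an update, running workflow filters unless bypassed."""
    record.update(update)
    # Guard filter at execution order 0: if the keyword is present,
    # jump past all remaining filters (highest execution order + 1).
    if record.get("z1D_Action") == SKIP_KEYWORD:
        record.pop("z1D_Action")   # clear the display-only field
        return record              # remaining filters are skipped
    # ... remaining filters would run here (execution orders 1..N) ...
    record["Last Modified By"] = "workflow"
    return record

# A normal update runs the workflow; a keyword-tagged update bypasses it.
normal = run_filters({"Status": "New"}, {"Status": "Assigned"})
bypassed = run_filters({"Status": "New"},
                       {"Status": "Assigned", "z1D_Action": SKIP_KEYWORD})
print("Last Modified By" in normal)    # True
print("Last Modified By" in bypassed)  # False
```

The point of the pattern is that the bypass happens inside the server's filter processing, not by going around it with direct SQL.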


If direct SQL is necessary for a particular action, you must get it approved as an exception and write it carefully by following these guidelines:

  • When you run the SQL statement on a PostgreSQL database, include the keyword PARENT TRANSACTION before your SQL statement.
    This keyword makes sure that your statement executes within the current transaction.

Review what data is updated and how often. Avoid direct SQL statements in filter workflows, because when both the direct SQL statement and the workflow update the same record repeatedly, records can become locked. Be careful about how the workflow is written and how the updates are made.
The following scenario is an example of a common problem:

Scenario

You have designed a workflow that uses direct SQL to update the last modified date of a parent change request when someone adds, deletes, or modifies an associated record. However, when someone performs a bulk action on the records associated with that change request, the system repeatedly tries to update the change request.

It's not necessary to update the last modified date with every change; instead, update it just once within a 1-minute timeframe, which is enough for another process to identify the changes.

Recommendation:

Adjust the direct SQL statements to trigger updates only if the record hasn't been refreshed within the last minute. This approach reduces unnecessary updates and database locks.
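The qualified update can be sketched as follows (a minimal sqlite3 illustration; the table and column names are hypothetical):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE change_request (id INTEGER PRIMARY KEY, last_modified REAL)")
now = time.time()
conn.execute("INSERT INTO change_request VALUES (1, ?)", (now - 120,))  # stale
conn.execute("INSERT INTO change_request VALUES (2, ?)", (now - 5,))    # fresh

# Qualified UPDATE: touch the parent only if it was not refreshed in the
# last 60 seconds, so bulk child updates do not lock the same row repeatedly.
throttled_update = ("UPDATE change_request SET last_modified = ? "
                    "WHERE id = ? AND last_modified < ?")
updated = conn.execute(throttled_update, (now, 1, now - 60)).rowcount
skipped = conn.execute(throttled_update, (now, 2, now - 60)).rowcount
print(updated, skipped)  # 1 0 : stale parent touched, fresh parent left alone
```

Because the qualification excludes recently refreshed rows, a burst of associated-record updates results in at most one parent update per minute.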

Best practices for filter workflows running external scripts

BMC Helix SaaS does not support filter workflows that run scripts or processes on the server or save data on the server. If converting direct SQL to workflow is not possible, you can convert the functionality into a plugin.

First, try to rewrite the direct SQL to function as a workflow. If you cannot meet the requirement with existing workflow constructs, write a custom plugin or coded bundle to perform the actions in the context of the server.

Use the following information to create a custom plugin container for your plugins:

Scenario

Custom Java plugins that are deployed in the existing pluginsvr process in an on-premises environment are not supported in the BMC Helix SaaS environment.

Recommendation:

Deploy a custom container with custom Java plugins in BMC Helix SaaS with the help of the BMC SaaS Operations team.

For more information, see Deploying custom plug-ins.

Depending on the complexity, this task might take 3 weeks or more.


BMC Helix SaaS does not allow nonstandard customizations that call external scripts such as bat, shell, or Perl.

Use the following information to include this kind of customization:

Scenario

You have nonstandard customizations that call external scripts such as bat, shell, or Perl.

Recommendation:

Create an AI job, a web service, or an API call to include this kind of customization.

Depending on the complexity, this operation might take 7 days or more.

Best practices for workflows making external calls

When a transaction creates data and performs external, internal, or resource-intensive calls, run those calls asynchronously to avoid delays in submitting or updating the ticket.

Sometimes, you might overlook that certain processing tasks might be resource-intensive or involve external calls, leading to workflow processing delays.

Examples of such tasks include setting fields that perform REST or WebService calls or making push fields or service calls that require significant processing.

Scenario

In one scenario, you initiate a custom notification process that adds 10 seconds to each ticket update due to in-line push fields triggering updates. Another scenario involves you using filters to set fields through REST APIs, which can add 5-10 seconds to each transaction. 

To mitigate these delays, move the processing operations outside the transaction, provided it doesn't require a complete transaction rollback. 
For BMC Helix ITSM, use a custom form for asynchronous processing instead of the SYS:Action form, because that form is already in high demand.

We also recommend that you create a filter on the transaction form that pushes the record's data to the custom asynchronous processing form. Periodically, an escalation activates to initiate the required operation, with the workflow then executing the necessary actions. In the notification example, this approach is effective because it captures the notification data in the async transaction form and processes the notification later. 
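The staging pattern above can be sketched in plain Python (the names are hypothetical; the queue stands in for the custom async-processing form, and the draining function stands in for the periodic escalation):

```python
from collections import deque

notification_queue = deque()   # stands in for the custom async form

def on_ticket_update(ticket_id, assignee):
    """Filter on the transaction form: capture data, return immediately."""
    notification_queue.append({"ticket": ticket_id, "to": assignee})
    # No REST or notification call happens here, so the user's
    # submit/modify transaction is not delayed.

def escalation_run(send):
    """Periodic escalation: process everything staged so far."""
    processed = 0
    while notification_queue:
        send(notification_queue.popleft())
        processed += 1
    return processed

sent = []
on_ticket_update("INC000001", "alice")
on_ticket_update("INC000002", "bob")
count = escalation_run(sent.append)
print(count)  # 2: both notifications handled outside the transaction
```

The design choice is the same as in the filter-plus-escalation approach: the transaction only records *what* to do, and the slow work happens later, off the user's critical path.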

In another scenario, let's assume you need consistent submission times for changes due to gateway limitations. The challenge is that the heavy use of change templates, which can generate potentially hundreds of tasks, makes the time to create changes variable. By default, task creation happens synchronously. This variability only poses a problem for changes made through integrations.

Add a field for integrations on the change interface form to mark task creation as asynchronous. This setup separates task processing, enabling the transfer of template and change information to a different form. The system generates tasks through an escalation on this form, maintaining task creation outside the main change transaction. The workflow halts change status updates until the completion of tasks, and an error handler reverts the change if task creation encounters failure.

When using set fields to call a REST API, the complexity depends on whether the REST call provides data essential for workflow processing or just informational data that can be updated later. In the earlier examples, it was just some informational data, so there was no issue with getting the data and updating the ticket later.


Best practices for escalations

Follow these guidelines to help you design efficient escalations:

  • Use the minimum number of escalations required for your workflow.

  • Run escalations with qualifications that use indexed fields when possible.
    For more information about indexed fields, see Table field indexing considerations.

  • Streamline your escalations by including all available criteria in the qualification.
    Unqualified escalations run against every record for the form, and might process some records unnecessarily.

  • Avoid running escalations during peak user load times.

  • Stagger long-running escalations in different pools to avoid overloading the system.

  • Avoid running long-running escalations in the default pool, which runs many small out-of-the-box (OOTB) escalations that might otherwise be delayed.

  • Avoid running conflicting escalations (operating on the same data set) simultaneously in different pools.

  • Allow escalations the time they need to complete before the next escalation activates.
    An example is an escalation that searches the database for 30,000 requests but is set to execute every minute. Escalations are processed in sequence, and an escalation will not run until the escalation scheduled immediately before it has completed.

  • Use the escalation log to identify the times escalations run, how long they take to complete, and the types of actions your escalations perform.
    Remember that an escalation can modify a request. You can help maintain system performance by minimizing the impact of blocking operations. A blocking operation is an action performed during filter processing that waits for a DBMS or an external process to return the requested information. Blocking operations are caused by Set Fields filter actions, Push Fields filter actions, and $PROCESS$ actions that retrieve information from a DBMS or an external process.

  • Allow escalations to run against each change form individually, ensuring more manageable database transactions.

    The following guidelines will help you ensure more manageable database transactions:

    • Move the escalation layer down a level to mitigate large transactions and deep filter execution stacks.
      Updating a single record during escalations can lead to numerous updates across different levels. This process often results in large transactions and the execution of deep filter stacks, which can be inefficient and problematic.

    • Carefully consider data volume when implementing asynchronous processes; ignoring it can lead to issues.
      For example, a process created for updating all planning status changes might trigger integrations through workflows that sometimes update other change requests. Typically, the system updates a single record, such as a SYS: Action record, activating a push field that updates all necessary changes.

      However, this approach has drawbacks: because all updates occur within one transaction, a single error rolls back the entire transaction. It can also overload the server with too many filters and lead to long database transactions, affecting performance.
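The sequencing rule noted above (an escalation will not start until the previously scheduled one completes) can be illustrated with a small timing simulation in Python, using hypothetical numbers:

```python
def completed_runs(interval_s, runtime_s, horizon_s):
    """How many runs actually complete within `horizon_s` seconds when an
    escalation is scheduled every `interval_s` but takes `runtime_s`."""
    runs, clock = 0, 0
    while clock < horizon_s:
        start = max(clock, runs * interval_s)  # wait for schedule or prior run
        clock = start + runtime_s              # escalations run in sequence
        if clock <= horizon_s:
            runs += 1
        else:
            break
    return runs

# Scheduled every 60 s but needing 180 s per run: in one hour only
# 20 of the 60 scheduled runs complete; the rest are pushed back.
print(completed_runs(60, 180, 3600))   # 20
print(completed_runs(300, 180, 3600))  # 12: a 5-minute interval keeps up
```

This is why an escalation's interval should always exceed its expected runtime: otherwise the schedule silently drifts.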

Best practices for using custom database views

In BMC Helix SaaS, custom database views are not supported out of the box. To use custom database views, you must rewrite the custom database views by using a standard workflow. If you still want to use custom database views, you must get the necessary approvals from BMC for using a plugin to perform this action.

Custom SQL database views might operate differently depending on the database they originate from. Additionally, you might have used specialized stored procedures, which can lead to issues. Avoid using custom database views unless there is no other way to retrieve the necessary data.

Stored procedures and triggers are not allowed, and you must convert these procedures and triggers into workflows to occur within the platform context.

Reasons to use custom database (DB) views can be:

  • When you want to join ITSM tables with external tables.
  • When you need to create a complex join in a performant way.

Issues that can arise with custom database views are:

  • The version or database vendor used for the on-premises system might differ from the SaaS system, causing the view to operate differently or less efficiently.
  • Application-defined permissions to the data might be bypassed.
  • Data might not be indexed correctly, leading to performance issues.

Assess each custom view to determine whether there is a way to implement the same functionality within the platform itself.

Some best practices include:

  • Incorporating external tables as forms instead of leaving them as tables that are not managed in the platform.

    As part of SAP BO Analytics applications, BMC used to include ANA tables for converting enum tables and views during installation. In the database, selection values are stored as integers, and these views help you convert ENUM values into readable selection values when running select queries directly on the database for reporting applications or integrations.

    Any reports pointing to these tables will fail because these tables do not exist anymore.
    The tables are:

    • ANA_ALL_ENUM_VALUES
    • ANA_HPD_ENUM_VW
    • ANA_RKM_ENUM_VW
    • ANA_SLM_ENUM_VW
    • ANA_SRM_ENUM_VW
    • ANA_PBM_ENUM_VW
    • ANA_CHG_ENUM_VW

    Recommendation:

    Use a database procedure that can be triggered to update ENUM values when adding a new form or creating or updating selection values in an existing form.

    If you are using any of these table views, you must create a corresponding application form and a view form in BMC Helix ITSM via Developer Studio, similar to the table explained earlier.

    Keep the form name the same as the existing table name.

    Forms to be created:

    Regular Form:

    • ANA_ALL_ENUM_VALUES

    Join Form:

    • ANA_HPD_ENUM_VW
    • ANA_RKM_ENUM_VW
    • ANA_SLM_ENUM_VW
    • ANA_SRM_ENUM_VW
    • ANA_PBM_ENUM_VW

    Depending on the complexity, this operation might take 7 days or more.


  • Converting stored procedures, triggers, and database views into workflows and forms that run within the platform context.

    Database views connected to an external database outside of the AR System and view forms created in BMC Helix ITSM based on these database views cannot be migrated as custom database objects in the BMC Helix SaaS system.

    Recommendation:

    Convert any custom database views created in an on-premises environment to AR System Regular forms.

    Create an AR form for a corresponding custom DB view, migrate the data by using an AI job, and adjust the workflow.

    Depending on the complexity, this task might take 5 days or more.
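For context, an enum-translation view of the kind listed above simply joins stored integer values to readable labels. The following sqlite3 sketch (with hypothetical tables and values) shows what such a view, or the join form that replaces it, does:

```python
import sqlite3

# Selection values are stored as integers; the view maps them to labels,
# which is what the ANA_*_ENUM_VW views provided for reporting queries.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE HPD_Help_Desk (Incident_Number TEXT, Status INTEGER);
CREATE TABLE Status_Enum (Value INTEGER, Label TEXT);
INSERT INTO HPD_Help_Desk VALUES ('INC000001', 0), ('INC000002', 4);
INSERT INTO Status_Enum VALUES (0, 'New'), (4, 'Resolved');
-- The view joins the enum lookup to produce readable selection values.
CREATE VIEW HPD_Readable AS
  SELECT h.Incident_Number, s.Label AS Status
  FROM HPD_Help_Desk h JOIN Status_Enum s ON h.Status = s.Value;
""")
rows = conn.execute(
    "SELECT Incident_Number, Status FROM HPD_Readable ORDER BY 1").fetchall()
print(rows)  # [('INC000001', 'New'), ('INC000002', 'Resolved')]
```

Rebuilding this as a join form in Developer Studio keeps the same translation logic inside the platform, where permissions and indexing are enforced.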


Best practices for custom join forms and custom fields

  • Take note of the following practices when you create join forms:

    • Do not create multiple layers of join forms.
      If you create multiple layers of join forms, you might see a decline in the speed of database and system performance.

    • Make sure that you have created joins on the indexed fields.
      Joins on non-indexed fields will slow system performance.

  • Maintain a minimum number of diary fields.
    Performance decreases when character field size exceeds 255 bytes (4000 bytes for the Oracle database). The impact on performance for a form increases with the number of diary fields. You can design most AR System applications effectively by using one or two diary fields.

  • If you maintain multiple form views with trim or control fields, do not duplicate screen objects unnecessarily and, when possible, share screen objects between views.
    The more screen objects you create (data fields, control fields, and trim), the larger your forms will be and the longer it takes to load, display, or switch to another view.

  • Avoid using many toolbar buttons with different bitmaps in multiple views; this also increases the form size.

  • If you need to include an image, use a JPEG instead of a bitmap.
    The file size is generally smaller for JPEG files, and the form will take less time to load.

  • Build custom joins and custom fields in a way that optimizes performance.
    Sometimes, you might create custom joins and fields for use in workflow, reports, integrations, etc.

    Some common issues when creating custom joins and fields include:
    • Custom joins are built without considering depth and indexing.
    • Lack of indexing on custom fields, resulting in slow queries.

  • Identify potential bottlenecks by using the platform's server statistics, specifically the Longest SQLs and Longest APIs sections, and strategize on improving their performance.
    The platform captures any SQL or API calls taking longer than 5 seconds in these sections, allowing you to pinpoint queries that run the longest. For Postgres and Oracle, obtain the query plans to help identify where you can add indexes to speed up queries.

  • Avoid using workflows that interact with the file system.

  • Do not run process commands to run scripts.

  • While interacting with FTP sites for data transfer is generally permitted, do not use the file system to write files within the application's business logic.
    Additionally, filters cannot run scripts on the file system.

  • Review the custom joins and forms to ensure that they are not unnecessarily FTS indexed.
    You might accidentally copy and paste from other forms that have FTS properties set, leading to fields being unnecessarily indexed. You must review and confirm whether a field needs to be FTS indexed, as unnecessary indexing adds extra load to the server.
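A sketch of the query-plan check mentioned above, using Python's sqlite3 as a stand-in (on PostgreSQL you would run EXPLAIN ANALYZE, and on Oracle EXPLAIN PLAN); the table and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE HPD (id INTEGER PRIMARY KEY, assignee TEXT)")

query = "EXPLAIN QUERY PLAN SELECT * FROM HPD WHERE assignee = 'alice'"
# Column 3 of each plan row is the human-readable detail string.
before = conn.execute(query).fetchall()[0][3]
print("SCAN" in before)          # True: full table scan without an index

conn.execute("CREATE INDEX idx_assignee ON HPD (assignee)")
after = conn.execute(query).fetchall()[0][3]
print("idx_assignee" in after)   # True: the planner now uses the index
```

Reading the plan before and after adding an index is the quickest way to confirm that a long-running query from the server statistics will actually benefit.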

Best practices for Set Fields and Push Fields actions

Avoid blocking operations when possible because they can affect all users, and blocking operations typically are not scalable. However, you might need to use blocking actions for some processes.

The following section describes ways to minimize performance issues when using Set Fields and Push Fields actions.

  • Use filters instead of active links to perform Set Fields and Push Fields actions, especially if the active link Execute On condition is Submit or Modify.
    The advantage is that the server (filter) should perform the Set Fields action faster than the client (active link). 
    For example, an active link that performs a Set Fields action on submit pulls information from the server only to push that information back. Your system performance will improve if you use a filter to perform the Set Fields action on the server.

  • Use only efficient searches in these actions, especially if the workflow executes the search frequently.
    Efficient searches define where the system looks for the data (usually using an index). You can improve performance by designing actions to retrieve only the necessary columns. This practice is especially true when the excluded columns are diary fields or attachment fields. The biggest performance improvement, however, still depends on how well the search is defined.

  • Do not perform Set Fields actions in a filter if the user must see the data retrieved by the Set Fields action prior to the Submit or Modify operation or if the data retrieved by the Set Fields action depends on the client-based workflow.

  • Limit the use of Set Fields and Push Fields actions that include database searches or other external blocking actions.

  • To improve $PROCESS$ action performance, have one $PROCESS$ action execute one resource-demanding command and return the results to a temporary field.
    The remaining actions can retrieve and parse data from the temporary field. For example, if you set five fields, write this data to a temporary field with the first $PROCESS$ operation and have the remaining actions retrieve the data from the local field.

  • A better solution is redesigning the process to use the Filter API. 
    For more information, see Developing an API program. This technique uses one long-running process, making it more efficient and significantly faster.
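The single-$PROCESS$ pattern described above can be sketched in Python: one resource-demanding command runs once, its output lands in a temporary field, and the remaining values are parsed locally (the command and the field layout are hypothetical):

```python
import subprocess
import sys

# One resource-demanding command (a stand-in here) emits every value the
# workflow needs in a single delimited line, instead of five separate
# $PROCESS$ invocations spawning five processes.
result = subprocess.run(
    [sys.executable, "-c", "print('host1|10.0.0.1|up|42|ok')"],
    capture_output=True, text=True, check=True)

temp_field = result.stdout.strip()   # the temporary field: one round trip
host, ip, state, load, health = temp_field.split("|")  # parsed locally
print(host, state)  # host1 up
```

Each subsequent Set Fields action then reads its value from the temporary field instead of launching another external process.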

 
