Language availability


Track-It! 2020 Release 03 online technical documentation is also available in the following languages:

  • French
  • German
  • Portuguese (Brazil)
  • Spanish
  • Spanish (XL)

The displayed language depends on your browser's language setting. However, you can change the language from the Language menu.

Performance benchmarks and tuning

This topic contains information about performance testing, the test environments, and performance and database tuning.

Summary of performance tests  

Performance testing for the Track-It! application was executed by using the JMeter tool. The objective of the performance test was to demonstrate to customers the response time, CPU and memory consumption, and scalability characteristics of the Track-It! application. No blocking issues were observed during the runs.

The benchmark execution focused on the following key objectives:

  • Measure response time under different user loads
  • Measure memory and CPU utilization
  • Determine bottlenecks for the performance improvement
  • Review sizing guidelines

Performance testing environments

Deployment for a 24-hour run

The following diagram shows the lab deployment for the small and medium environments:

The following table provides information about the hardware used for the performance test:

| Machine usage | Operating system | Application server or database server | CPU | Memory | Environment size |
| --- | --- | --- | --- | --- | --- |
| JMeter Client 3.0 | Microsoft Windows 7 | NA | 2 CPU | 8 GB | Small |
| Application server and database server | Windows Server 2012 R2 | Track-It! (.NET v4.0) | 2 CPU | 8 GB | Small |
| JMeter Client 3.0 | Microsoft Windows 7 | NA | 2 CPU | 8 GB | Medium |
| Application server and database server | Windows Server 2012 R2 | Track-It! (.NET v4.0) | 4 CPU | 16 GB | Medium |

Deployment for a 7-day endurance test

The following diagram shows the deployment of the split small lab environment for the 7-day (24x7) endurance runs:

The following table provides information about the hardware used for the performance test on a small environment:

| Machine usage | Operating system | Application server or database server | CPU | Memory |
| --- | --- | --- | --- | --- |
| JMeter Client 3.0 | Microsoft Windows 7 | NA | 2 CPU | 8 GB |
| Application server | Windows Server 2012 R2 | Track-It! (.NET v4.0) | 2 CPU | 8 GB |
| Database server | Windows Server 2012 R2 | SQL Server 2012 | 2 CPU | 8 GB |

Performance test used

The product performance test was executed with JMeter [.jmx] scripts covering basic operations from the Technician portal, such as creating tickets/assignments, updating them, adding notes to them, opening them, and searching for and linking solutions to them. From the Self Service portal, the scripts created tickets, viewed tickets, and searched for solutions.

  • Create ticket/assignment action from the Technician portal: each technician created tickets/assignments at a frequency of approximately 7 to 8 tickets/assignments in an hour.
  • Update or add note to ticket/assignment action from the Technician portal: each technician updated approximately 7 to 8 tickets/assignments in an hour.
  • Open or view ticket/assignment action from Technician portal: each technician opened and viewed approximately 7 to 8 tickets/assignments in an hour.
  • Search and link solution to ticket/assignments action from Technician portal: each technician searched and linked solutions to approximately 7 to 8 tickets/assignments in an hour.
  • Create ticket action from Self Service: each requestor created tickets at a frequency of approximately 8 to 9 tickets in an hour.
  • Open or view ticket action from Self Service: each requestor opened and viewed approximately 8 to 9 tickets in an hour.
  • Search solutions from Self Service: each requestor performed approximately 8 to 9 solution searches in an hour.

All of the above operations were executed in parallel by individual end users, and the initial ramp-up time was maintained. The sketch below shows how these per-user frequencies translate into pacing between operations.
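To make the pacing arithmetic concrete, here is a minimal Python sketch; it is an illustration, not part of the original JMeter scripts, and the frequency constants are midpoints of the ranges stated above.

```python
# Convert the stated per-user frequencies into the delay between successive
# operations for one simulated user, and into aggregate scenario throughput.
# Illustrative only: midpoints of the "7 to 8" and "8 to 9" per-hour ranges.

TECH_OPS_PER_HOUR = 7.5   # technicians: ~7 to 8 operations per hour
REQ_OPS_PER_HOUR = 8.5    # requestors: ~8 to 9 operations per hour

def pacing_seconds(ops_per_hour: float) -> float:
    """Seconds between successive operations for one simulated user."""
    return 3600.0 / ops_per_hour

def scenario_ops_per_hour(technicians: int, requestors: int) -> float:
    """Approximate combined operations per hour for one scenario tier."""
    return technicians * TECH_OPS_PER_HOUR + requestors * REQ_OPS_PER_HOUR

if __name__ == "__main__":
    print(f"Technician pacing: ~{pacing_seconds(TECH_OPS_PER_HOUR):.0f} s between operations")
    print(f"Requestor pacing:  ~{pacing_seconds(REQ_OPS_PER_HOUR):.0f} s between operations")
    # The 20-technician / 200-requestor tier from the 24-hour run:
    print(f"20 tech + 200 req: ~{scenario_ops_per_hour(20, 200):.0f} operations per hour")
```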

Example

Assume that the performance run was executed by three technicians and five requestors who were performing the following actions:

  • One technician is creating a ticket.
  • One technician is updating or adding a ticket note.
  • One technician is opening or viewing a ticket.
  • In Self Service, two requestors are creating tickets.
  • Three requestors are opening or viewing tickets.

Notes

  • While the above run was in progress, manual response times were recorded by using the HttpWatch tool for actions such as creating, updating, or viewing tickets/assignments and searching and linking solutions to tickets/assignments on the Technician portal, and creating or viewing tickets and searching solutions in Self Service (a comparable timing sketch follows these notes).
  • All of the runs were executed with a network latency of 1 to 3 milliseconds.
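For reference, the following minimal Python sketch times a single request in a way comparable to a manual HttpWatch measurement. It is an illustration only: the `requests` library stands in for HttpWatch, and the URL is a hypothetical placeholder.

```python
# Time one HTTP GET roughly the way a manual HttpWatch check times a page
# action. Illustrative only; the endpoint URL below is a placeholder.

import requests

def timed_get(url: str) -> float:
    """Return the response time for one GET request, in seconds."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    # `elapsed` spans from sending the request until the response headers
    # arrive, which approximates a single page-action timing.
    return response.elapsed.total_seconds()

if __name__ == "__main__":
    # Hypothetical Self Service endpoint; replace with your server's URL.
    print(f"{timed_get('http://trackit.example.com/SelfService'):.2f} s")
```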

User scenarios for 24-hour run

Before starting the 24-hour small environment runs, the following volume data was generated on the server by using SQL scripts:

  • Tickets – 5000
  • Tickets with linked assignments – 5000

Before starting the 24-hour medium environment runs, the following volume data was generated on the server by using SQL scripts (a seeding sketch follows this list):

  • Tickets created – 10000
  • Tickets with linked assignments created – 10000
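The SQL scripts themselves are not published here. As an illustration of equivalent seeding through ODBC, here is a minimal Python sketch using pyodbc; the connection string, the Tickets table, and its columns are hypothetical assumptions, not the product schema.

```python
# Bulk-insert seed tickets through ODBC before a benchmark run.
# Illustration only: the connection string, table name, and columns are
# hypothetical and must be adapted to the actual database schema.

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=dbserver;DATABASE=TrackIt;Trusted_Connection=yes;"
)

def seed_tickets(count: int) -> None:
    rows = [(f"Benchmark seed ticket {i}", "Open") for i in range(count)]
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True  # faster bulk parameter binding
        cursor.executemany(
            "INSERT INTO Tickets (Summary, Status) VALUES (?, ?)", rows
        )
        conn.commit()

if __name__ == "__main__":
    seed_tickets(10000)  # medium-environment ticket volume from the setup above
```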

The following user scenarios were executed as part of the 24-hour performance test (a consistency-check sketch follows the table):

| Users | Actions |
| --- | --- |
| 3 technicians, 50 requestors | 1 technician creating tickets; 1 technician viewing tickets; 1 technician adding notes to tickets; 25 requestors creating tickets; 25 requestors viewing tickets |
| 5 technicians, 100 requestors | 2 technicians creating tickets; 2 technicians viewing tickets; 1 technician adding notes to tickets; 50 requestors creating tickets; 50 requestors viewing tickets |
| 10 technicians, 150 requestors | 4 technicians creating tickets; 4 technicians viewing tickets; 2 technicians adding notes to tickets; 75 requestors creating tickets; 75 requestors viewing tickets |
| 20 technicians, 200 requestors | 8 technicians creating tickets; 8 technicians viewing tickets; 4 technicians adding notes to tickets; 100 requestors creating tickets; 100 requestors viewing tickets |
| 30 technicians, 200 requestors | 12 technicians creating tickets; 12 technicians viewing tickets; 6 technicians adding notes to tickets; 100 requestors creating tickets; 100 requestors viewing tickets |
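As a cross-check on the table above, the following minimal Python sketch (an illustration, not part of the test harness) encodes each tier and confirms that the per-action breakdown sums to the stated user totals.

```python
# Encode the 24-hour scenario tiers and verify that the action counts
# add up to the technician and requestor totals in each tier.

SCENARIOS = [
    # technicians: (creating, viewing, adding notes); requestors: (creating, viewing)
    {"tech": (1, 1, 1), "req": (25, 25)},
    {"tech": (2, 2, 1), "req": (50, 50)},
    {"tech": (4, 4, 2), "req": (75, 75)},
    {"tech": (8, 8, 4), "req": (100, 100)},
    {"tech": (12, 12, 6), "req": (100, 100)},
]

for tier in SCENARIOS:
    technicians, requestors = sum(tier["tech"]), sum(tier["req"])
    print(f"{technicians} technicians + {requestors} requestors "
          f"= {technicians + requestors} concurrent users")
```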

User scenarios for 7-day endurance test

The following user scenarios were executed as part of the 7-day endurance test:

| Users | Actions |
| --- | --- |
| 3 technicians, 30 requestors | 1 technician creating tickets; 1 technician opening tickets; 1 technician updating or adding notes to tickets; 15 requestors creating tickets; 15 requestors opening tickets |
| 5 technicians, 100 requestors | 2 technicians creating tickets; 2 technicians opening tickets; 1 technician updating or adding notes to tickets; 50 requestors creating tickets; 50 requestors opening tickets |
| 10 technicians, 150 requestors | 4 technicians creating tickets; 4 technicians opening tickets; 2 technicians updating or adding notes to tickets; 75 requestors creating tickets; 75 requestors opening tickets |

Performance reports

Small environment

| Portal | Action | Response time in seconds (approximately) |
| --- | --- | --- |
| Self Service | Log in | 2.5 to 3.5 |
| Self Service | Create ticket | 1.0 to 1.5 |
| Self Service | Open ticket | 1.0 to 1.5 |
| Technician portal | Create ticket [Detail section enabled] | 3.0 to 4.0 |
| Technician portal | Create ticket [Detail section disabled] | 0.5 to 1.5 |
| Technician portal | Open ticket [Detail section enabled] | 4.0 to 5.0 |
| Technician portal | Open ticket [Detail section disabled] | 1.5 to 2.0 |
| Technician portal | Update ticket [Detail section enabled] | 2.5 to 3.5 |
| Technician portal | Update ticket [Detail section disabled] | 0.5 to 1.5 |
| Technician portal | Add ticket note | 4.0 to 6.0 |

Medium environment

| Portal | Action | Response time in seconds (approximately) |
| --- | --- | --- |
| Self Service | Log in | 1.5 to 3.0 |
| Self Service | Create ticket | 0.5 to 1.5 |
| Self Service | Open ticket | 0.5 to 1.5 |
| Technician portal | Create ticket [Detail section enabled] | 3.0 to 4.0 |
| Technician portal | Create ticket [Detail section disabled] | 0.5 to 1.0 |
| Technician portal | Open ticket [Detail section enabled] | 4.0 to 5.0 |
| Technician portal | Open ticket [Detail section disabled] | 1.5 to 2.0 |
| Technician portal | Update ticket [Detail section enabled] | 2.5 to 3.5 |
| Technician portal | Update ticket [Detail section disabled] | 0.5 to 1.0 |
| Technician portal | Add ticket note | 4.0 to 6.0 |
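If you rerun these tests, response-time ranges like those in the tables above can be derived from the JMeter results file. The following is a minimal Python sketch, assuming JMeter wrote CSV .jtl output with its default header row; the file name is a placeholder.

```python
# Summarize per-action response times from a JMeter CSV results (.jtl) file.
# Assumes the default CSV header row; the "elapsed" column is in milliseconds.

import csv
import statistics
from collections import defaultdict

def summarize(jtl_path: str) -> None:
    samples = defaultdict(list)
    with open(jtl_path, newline="") as f:
        for row in csv.DictReader(f):
            samples[row["label"]].append(int(row["elapsed"]) / 1000.0)
    for label, times in sorted(samples.items()):
        print(f"{label}: median {statistics.median(times):.2f} s, "
              f"min {min(times):.2f} s, max {max(times):.2f} s")

if __name__ == "__main__":
    summarize("results.jtl")  # placeholder path to the JMeter results file
```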

Performance tuning

The following checks help you tune for better performance:

  • For better performance of the Create, Update, and Open actions on Ticket and Assignment forms, the Details section is disabled by default.
  • For better performance when loading the Help Desk, keep the default user view with single-level queries, such as My Open tickets; avoid views with multilevel queries, such as All works, and disable the Preview section.
  • For better performance when loading the Dashboard, keep only the minimum required panels, use views with single-level queries such as My Open tickets, and avoid views with multilevel queries such as All works.
  • Do not change the default log level of the Track-It! application from the LogConfig utility.

Database tuning

  • Ensure that no additional loads, such as SQL Profiler, are running on the SQL Server.
  • Do not alter the default values of Open Database Connectivity (ODBC) or SQL connection settings.
  • Run only the required services on the SQL Server.
  • If the Track-It! application is installed in a split environment, try to keep both servers in the same domain (recommended latency of approximately 1 to 3 milliseconds; see the sketch after this list).
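One quick way to verify the recommended latency is a TCP round-trip check from the application server to the database server. The following is a minimal Python sketch; the host name and the default SQL Server port 1433 are assumptions to replace with your own values.

```python
# Estimate application-to-database network latency by timing a TCP connect.
# Illustration only: "dbserver" and port 1433 are assumed placeholder values.

import socket
import time

def tcp_round_trip_ms(host: str, port: int = 1433) -> float:
    """Time one TCP connection handshake as a rough latency estimate."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    print(f"Round trip: {tcp_round_trip_ms('dbserver'):.1f} ms")
```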


