Performance benchmarks and tuning
The performance benchmarks were executed with the following goals:
- Measure response time under different user loads.
- Measure memory and CPU utilization (a monitoring sketch follows this list).
- Identify bottlenecks for performance improvement.
- Review sizing guidelines.
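The CPU and memory figures can be captured with any standard monitoring tool. As a rough illustration only, the following Python sketch samples utilization at a fixed interval; psutil, the sampling interval, and the plain-text output are assumptions, not part of the documented test harness.

```python
# Minimal sketch: sample CPU and memory utilization during a benchmark run.
# psutil, the interval, and the output format are assumptions.
import time
import psutil

def sample_utilization(duration_seconds: int = 60, interval_seconds: int = 5):
    """Print CPU and memory utilization at a fixed interval."""
    end = time.time() + duration_seconds
    while time.time() < end:
        cpu = psutil.cpu_percent(interval=1)   # averaged over 1 second
        mem = psutil.virtual_memory().percent  # % of physical memory in use
        print(f"cpu={cpu:.1f}% mem={mem:.1f}%")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    sample_utilization()
```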
Performance testing environments
Deployment for a 24-hour run
[Diagram: deployment of the lab environment for the 24-hour run]
The following table provides information about the hardware used for the performance test:
Machine usage | Operating system | Application software | CPU | Memory |
---|---|---|---|---|
JMeter 5.5 client | Windows 10 Enterprise | NA | 2 CPU | 8 GB |
Application server and database server | Windows Server 2022 with SQL Server 2019 | Track-It! (.NET v4.0) | 8 CPU | 16 GB |
Deployment for a 7-day endurance test
[Diagram: deployment of the lab environment for the 7-day (24x7) endurance run]
The following table provides information about the hardware used for the performance test:
Machine usage | Operating system | Application software | CPU | Memory |
---|---|---|---|---|
JMeter 5.5 client | Windows 10 Enterprise | NA | 2 CPU | 8 GB |
Application server and database server | Windows Server 2019 with SQL Server 2019 | Track-It! (.NET v4.0) | 8 CPU | 16 GB |
Performance test used
The product performance test was executed with JMeter [.jmx] scripts.
The following basic operations were performed from the technician portal:
- Create tickets or assignments: Each technician created approximately 7 to 8 tickets or assignments per hour.
- Update tickets or assignments: Each technician updated approximately 7 to 8 tickets or assignments per hour.
- Add a note to tickets or assignments: Each technician added notes to approximately 7 to 8 tickets or assignments per hour.
- Open tickets or assignments: Each technician opened and viewed approximately 7 to 8 tickets or assignments per hour.
- Search and link solutions to tickets or assignments: Each technician searched for and linked solutions to approximately 7 to 8 tickets or assignments per hour.
- Create purchase orders: Each technician created approximately 7 to 8 purchase orders per hour.
- Add a new master item to an existing purchase order: Each technician updated approximately 7 to 8 purchase orders per hour.
- Receive purchase items and create assets: Each technician created approximately 7 to 8 assets per hour.
- Update assets: Each technician updated approximately 7 to 8 assets per hour.
The following basic operations were performed from the Self Service portal:
- Create ticket: Each requestor created approximately 8 to 9 tickets per hour.
- Open or view ticket: Each requestor opened and viewed approximately 8 to 9 tickets per hour.
- Search solutions: Each requestor searched for approximately 8 to 9 solutions per hour.
All of the above operations were executed in parallel by individual end users, and an initial ramp-up time was maintained.
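As a rough illustration of that pacing and ramp-up, the following Python sketch converts the per-user operation rate into a delay between iterations and staggers user start times across a ramp-up window. The 10-minute ramp-up value is an assumption; the source does not state the actual ramp-up time.

```python
# Minimal sketch of the pacing and ramp-up arithmetic implied above.
OPERATIONS_PER_HOUR = 7.5  # midpoint of the quoted 7 to 8 operations per hour
USERS = 50                 # technicians in the 24-hour run
RAMP_UP_SECONDS = 600      # assumed 10-minute ramp-up window

pacing_seconds = 3600 / OPERATIONS_PER_HOUR            # delay between iterations
start_offsets = [i * RAMP_UP_SECONDS / USERS for i in range(USERS)]

print(f"pacing per user: one operation every {pacing_seconds:.0f} s")
print(f"last user starts at t={start_offsets[-1]:.0f} s into the run")
```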
Example
Assume that the performance run was executed by five technicians and five requestors who were performing the following actions (a rough load estimate follows the list):
- One technician is creating a ticket.
- One technician is updating or adding a ticket note.
- One technician is opening or viewing a ticket.
- One technician is creating or updating a purchase order.
- One technician is updating an asset.
- In Self Service, two requestors are creating tickets.
- Three requestors are opening or viewing tickets.
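As a back-of-the-envelope check on this example, the following sketch estimates the combined hourly load; the midpoint rates are assumptions derived from the ranges quoted earlier.

```python
# Rough load estimate for the example above; midpoint rates are assumptions.
technicians, tech_rate = 5, 7.5  # ~7 to 8 operations per technician per hour
requestors, req_rate = 5, 8.5    # ~8 to 9 operations per requestor per hour

hourly_load = technicians * tech_rate + requestors * req_rate
print(f"approximate combined load: {hourly_load:.0f} operations per hour")
# -> approximately 80 operations per hour across the ten users
```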
User scenarios for 24-hour run
Before starting the 24-hour run, the following volume data was generated on the server by using SQL scripts (an illustrative generation sketch follows this list):
- Tickets: 50,000
- Tickets with linked assignments: 50,000
- Ticket notes: 100,000
- Ticket attachments: 50,000
- Purchase orders: 20,000
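The source states that this volume data was generated with SQL scripts. As an illustrative stand-in only, the following Python sketch shows how such bulk inserts could be driven through pyodbc; the connection string and the table and column names (Tickets, Summary, Status) are hypothetical and do not reflect the actual Track-It! schema.

```python
# Illustrative stand-in for the SQL data-generation scripts.
# The connection string, table, and columns below are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=TrackIt;Trusted_Connection=yes;"
)

def generate_tickets(count: int = 50_000):
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True  # batch the inserts for speed
        rows = [(f"Benchmark ticket {i}", "Open") for i in range(count)]
        cursor.executemany(
            "INSERT INTO Tickets (Summary, Status) VALUES (?, ?)", rows
        )
        conn.commit()

if __name__ == "__main__":
    generate_tickets()
```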
The following user load was executed as part of the 24-hour performance test: 50 technicians and 200 requestors performed, in parallel, the technician portal and Self Service portal operations described above.
User scenarios for 7-day endurance test
The following user load was executed as part of the 7-day endurance test: 50 technicians and 100 requestors performed, in parallel, the technician portal and Self Service portal operations described above.
Performance reports
24-hour run
Portal | Action | Response time in seconds (approximate) |
---|---|---|
Self Service | Log in | 2.5 to 3.5 |
Self Service | Create ticket | 1.0 to 1.5 |
Self Service | Open ticket | 1.0 to 1.5 |
Technician portal | Create ticket [Details section enabled] | 3.0 to 4.0 |
Technician portal | Open ticket [Details section enabled] | 4.0 to 5.0 |
Technician portal | Update ticket [Details section enabled] | 2.5 to 3.5 |
Technician portal | Add ticket note | 4.0 to 6.0 |
Technician portal | Create purchase order | 2.5 to 3.5 |
Technician portal | Update purchase order | 2.5 to 3.5 |
Technician portal | Add purchase item | 2.5 to 3.5 |
Technician portal | Receive purchase item | 2.5 to 3.5 |
Technician portal | Open asset | 2.5 to 3.5 |
Technician portal | Update asset | 2.5 to 3.5 |
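Figures like those in the table can be derived from JMeter's CSV results file (.jtl). The following sketch aggregates the average response time per sampler label; the file path is an assumption, and the column names (label, elapsed) follow JMeter's default CSV output.

```python
# Minimal sketch: summarize average response time per action from a .jtl file.
import csv
from collections import defaultdict
from statistics import mean

def summarize(jtl_path: str = "results.jtl"):
    elapsed_by_label = defaultdict(list)
    with open(jtl_path, newline="") as f:
        for row in csv.DictReader(f):
            # 'elapsed' is the response time in milliseconds
            elapsed_by_label[row["label"]].append(int(row["elapsed"]))
    for label, samples in sorted(elapsed_by_label.items()):
        print(f"{label}: {mean(samples) / 1000:.1f} s average "
              f"over {len(samples)} samples")

if __name__ == "__main__":
    summarize()
```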
Performance tuning
Use the following checks for better performance:
- For better performance, on the Ticket and Assignment Create, Update, and Open forms, the Details section is disabled by default.
- For faster Help Desk loading, keep the default user view set to a view with single-level queries, such as My Open tickets, avoid views with multilevel queries, such as All works, and disable the Preview section.
- For faster Dashboard loading, keep only the minimum required panels, use views with single-level queries, such as My Open tickets, and avoid views with multilevel queries, such as All works.
- Do not change the default log level of the Track-It! application from the LogConfig utility.
Database tuning
- Ensure that no additional loads, such as SQL Profiler, are running on the SQL Server.
- Do not alter the default values of the Open Database Connectivity (ODBC) or SQL connection settings.
- Run only the required services on the SQL Server.
- If the Track-It! application is installed in a split environment, keep both servers in the same domain, with a recommended latency of approximately 1 to 3 milliseconds (a latency-check sketch follows this list).
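One way to sanity-check that latency recommendation is to time TCP connections from the application server to the database server. The following Python sketch is a minimal illustration; the host name, the default SQL Server port, and the sample count are assumptions.

```python
# Minimal sketch: measure TCP connect latency to the SQL Server.
# Host, port, and sample count are assumptions.
import socket
import time

def tcp_latency(host: str = "db-server", port: int = 1433, samples: int = 10):
    times_ms = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # connect, then close immediately
        times_ms.append((time.perf_counter() - start) * 1000)
    print(f"average TCP connect latency: {sum(times_ms) / len(times_ms):.2f} ms")

if __name__ == "__main__":
    tcp_latency()
```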