MyIT Service Broker test methodology
Test case
In the test case, end-user response times were captured for key actions under a simulated workload of 500 concurrent users at 100 ms network latency, using a single BMC Service Broker server.
The following table lists key actions for which end-user response time was captured:
| Application | Actions |
|---|---|
| MyIT with Service Broker (SB) | |
Single-user response times were captured for key actions performed by different types of Service Broker Admin users, under no load and at 100 ms latency, using HttpWatch.
| Application | Type of User | Page | Action |
|---|---|---|---|
| MyIT Service Broker Admin Console | Admin | Login | |
| | | Services | |
| | | Categories | |
| | | Entitlements | |
| | | User Roles | |
| | | Banners | |
| | | Templates | |
Workload
The nominal workload environment was defined by the distribution of concurrent users and transaction rates among the test scenarios. This workload was used as the baseline for consistent benchmarking of the performance and scalability of the MyIT Service Broker (SB) application.
The following table describes the workload distribution for MyIT Service Broker test scenarios:
The following table describes the projected executions per hour for MyIT Service Broker (SB) at the given number of concurrent users:
The following are the numbers of entries created per hour during a 500-concurrent-user load:
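Projected executions per hour for a given concurrency are commonly estimated with Little's Law, where throughput equals the number of active users divided by the sum of average response time and think time. The sketch below is illustrative only; the function name and the response-time and think-time values are assumptions, not figures from the measured workload.

```python
def projected_executions_per_hour(concurrent_users, avg_response_s, think_time_s):
    """Estimate hourly transaction count via Little's Law:
    throughput (tx/s) = users / (response time + think time)."""
    per_second = concurrent_users / (avg_response_s + think_time_s)
    return round(per_second * 3600)

# Illustrative values only (not measured results): 500 concurrent users,
# 2 s average response time, 30 s think time between actions.
print(projected_executions_per_hour(500, 2.0, 30.0))  # 56250
```

With these assumed values, 500 users each complete a transaction roughly every 32 seconds, giving about 56,250 executions per hour; the actual projection would use the measured response and think times for each scenario.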
Data volume
The following table summarizes the base application data volume in the BMC Remedy AR System and Service Broker database prior to starting the tests:
The following table summarizes the Service Broker database data volume prior to starting the tests:
The following table summarizes the Social data volume (MongoDB) prior to starting the tests: