May 12, 2016

LoadRunner Part # 7 - Performance Testing Roadmap

Performance Testing Roadmap: Detailed Steps

Performance Testing Roadmap can be broadly divided into 5 steps:
  • Planning for Load Test
  • Create VUGen Scripts
  • Scenario Creation
  • Scenario Execution
  • Results Analysis (followed by system tweaking)
Now that you have LoadRunner installed, let's walk through the steps involved in the process one by one.




Steps involved in Performance Testing process

Planning for the Load Test

Planning for Performance Testing is different from planning an SIT (System Integration Testing) or UAT (User Acceptance Testing). Planning can be further divided into small stages as described below:

Assemble Your Team

When getting started with Performance Testing, it is best to document who will participate in the activity from each team involved in the process.
Project Manager:
Nominate the project manager who will own this activity and serve as point person for escalation.
Function Expert/ Business Analyst:
Provides usage analysis of the SUL and expertise on the business functionality of the website/SUL
Performance Testing Expert:
Creates the automated performance tests and executes load scenarios
System Architect:
Provides blueprint of the SUL
Web Developer and SME:
  • Maintains the website & provides monitoring support
  • Develops the website and fixes bugs
System Administrator:
  • Maintains the involved servers throughout the testing project

Outline applications and Business Processes involved:

Successful load testing requires that you plan to exercise specific business processes. A business process consists of clearly defined steps that correspond to the desired business transactions, so as to accomplish your load testing objectives.
A requirements matrix can be prepared to capture the user load on the system. Consider, for example, an attendance system in a company: for each business process, the matrix records the number of users connected to the application (SUL) at each hour of the day.
From such a matrix we can extract the maximum number of users running each business process at any hour of the day, calculated in the right-most column.
Similarly, we can determine the total number of users connected to the application (SUL) at any hour of the day, calculated in the last row.
These two figures combined give us the total user load with which we need to test the system for performance.
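The matrix calculation described above can be sketched as follows. The hourly figures below are made-up placeholders (the article's original table is not reproduced here), but the row-maximum and column-sum logic is the same:

```python
# Hypothetical hourly usage matrix for an attendance system (the figures are
# assumptions, not from the article): users running each business process per hour.
usage = {
    "Login":           {"09:00": 150, "10:00": 40, "17:00": 60, "18:00": 120},
    "Mark Attendance": {"09:00": 140, "10:00": 30, "17:00": 20, "18:00": 10},
    "View Report":     {"09:00": 10,  "10:00": 25, "17:00": 50, "18:00": 15},
}

# Right-most column of the matrix: peak users per business process.
peak_per_process = {bp: max(by_hour.values()) for bp, by_hour in usage.items()}

# Last row of the matrix: total users connected at each hour.
hours = sorted({h for by_hour in usage.values() for h in by_hour})
total_per_hour = {h: sum(usage[bp].get(h, 0) for bp in usage) for h in hours}

# The busiest hour defines the overall target load for the test.
target_load = max(total_per_hour.values())
print(peak_per_process)
print(total_per_hour, "-> target load:", target_load)
```

With these sample figures, 09:00 is the busiest hour (300 concurrent users), so 300 would be the target load for the scenario.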

Define Test Data Management Procedures

Statistics and observations drawn from Performance Testing are greatly influenced by numerous factors, as noted earlier. It is critically important to prepare Test Data for Performance Testing. Sometimes a particular business process consumes one data set and produces a different data set. Consider the example below:
  • A user 'A' creates a financial contract and submits it for review.
  • Another user 'B' approves 200 contracts a day created by user 'A'.
  • Another user 'C' pays about 150 contracts a day approved by user 'B'.
In this situation, user B needs to have 200 contracts in the 'created' state in the system. Besides that, user C needs 150 contracts in the 'approved' state in order to simulate his daily load.
This implicitly means that you must create at least 200 + 150 = 350 contracts.
After that, approve 150 contracts to serve as test data for user C; the remaining 200 contracts will serve as test data for user B.
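The arithmetic above can be captured in a small helper, which generalizes to any chain of business processes where each state feeds the next (the function name and structure are illustrative, not part of LoadRunner):

```python
# Test-data sizing for a chain of dependent business processes: each entry is
# (state_name, records_consumed_per_day). Records left in an earlier state
# after promoting some to the next state serve as test data for that state.
def size_test_data(daily_volumes):
    """Return total records to create, plus how many to promote to each state."""
    total = sum(count for _, count in daily_volumes)
    return total, dict(daily_volumes)

# The contract example from the text: B consumes 200 'created', C consumes 150 'approved'.
total, per_state = size_test_data([("created", 200), ("approved", 150)])
print(total, per_state)  # 350 contracts in total; approve 150, leave 200 created
```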
Outline Monitors
Identify each and every factor that could possibly affect the performance of the system. For example, reduced hardware capacity will have a potential impact on the performance of the SUL (System Under Load).
List all such factors and set up monitors so you can measure them. Here are a few examples:
  • Processor (for Web Server, Application Server, Database Server and Injectors)
  • RAM (for Web Server, Application Server, Database Server and Injectors)
  • Web/App Server (for example IIS, JBoss, Jaguar Server, Tomcat, etc.)
  • DB Server (PGA and SGA size in the case of Oracle; stored procedures, etc., for MS SQL Server)
  • Network bandwidth utilization
  • Internal and external NICs in case of clustering
  • Load Balancer (and whether it is distributing load evenly across all nodes of the clusters)
  • Data flux (calculate how much data moves between client and server, then check whether the capacity of the NIC is sufficient to simulate X number of users)
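The last point, the data-flux check, is a back-of-the-envelope calculation. A minimal sketch, with all figures assumed for illustration:

```python
# Data-flux check: can one injector NIC carry the traffic of X virtual users?
# All figures here are assumptions for illustration, not from the article.
def nic_sufficient(users, kb_per_user_per_sec, nic_mbps, utilization=0.7):
    """Compare required throughput with usable NIC capacity.

    `utilization` caps the NIC at a realistic fraction of its rated speed,
    since sustained throughput rarely reaches the nominal line rate.
    """
    required_mbps = users * kb_per_user_per_sec * 8 / 1000  # KB/s -> Mbit/s
    return required_mbps <= nic_mbps * utilization

# Example: 500 VUsers, each moving ~50 KB/s, on a 1 Gbit/s injector NIC.
print(nic_sufficient(500, 50, 1000))  # 200 Mbit/s needed vs 700 Mbit/s usable
```

If the check fails, either add injector machines or upgrade the NIC before trusting the test results, since a saturated injector distorts response-time measurements.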

Create VUGen Scripts

The next step after planning is to create VUser scripts.

Scenario Creation

The next step is to create your load scenario.

Scenario Execution

Scenario execution is where you emulate user load on the server by instructing multiple VUsers to perform tasks simultaneously.
You can set the level of load by increasing and decreasing the number of VUsers that perform tasks at the same time.
This execution may put the server under stress and cause it to behave abnormally, which is the very purpose of Performance Testing. The results drawn are then used for detailed analysis and root-cause identification.
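The core idea of scenario execution, many VUsers performing the same tasks simultaneously, can be illustrated in plain Python (this is a conceptual sketch, not LoadRunner; the Controller does this with real VUsers against the SUL):

```python
# Conceptual sketch of scenario execution: run the same business transaction
# with N concurrent "VUsers" and record each transaction's response time.
import time
from concurrent.futures import ThreadPoolExecutor

def business_transaction(vuser_id):
    start = time.perf_counter()
    time.sleep(0.01)            # stand-in for a real request to the SUL
    return time.perf_counter() - start

def run_scenario(vusers):
    # Raising or lowering `vusers` changes the level of simultaneous load.
    with ThreadPoolExecutor(max_workers=vusers) as pool:
        return list(pool.map(business_transaction, range(vusers)))

timings = run_scenario(20)
print(len(timings), "transactions, max response:", round(max(timings), 3))
```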

Results Analysis (followed by system tweaking)

During scenario execution, LoadRunner records the performance of the application under different loads. The statistics drawn from test execution are saved and a detailed analysis is performed. The 'HP Analysis' tool generates various graphs that help identify the root causes of degraded system performance, as well as of system failure.
Some of the graphs obtained include:
  • Time to First Buffer
  • Transaction Response Time
  • Average Transaction Response Time
  • Hits Per Second
  • Windows Resources
  • Errors Statistics
  • Transaction Summary
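To make two of these graphs concrete, here is how their underlying statistics can be computed from raw result samples (the sample data below is made up; the Analysis tool derives these from the actual test results for you):

```python
# Sketch of the statistics behind two Analysis graphs, computed from raw
# samples of (timestamp_in_seconds, transaction_response_time_in_seconds).
samples = [
    (0, 0.8), (0, 1.2), (1, 1.0), (1, 2.4), (2, 1.6), (2, 0.9),
]

# Average Transaction Response Time: mean over all recorded transactions.
avg_response = sum(rt for _, rt in samples) / len(samples)

# Hits Per Second: number of samples falling in each one-second bucket.
hits_per_second = {}
for ts, _ in samples:
    hits_per_second[ts] = hits_per_second.get(ts, 0) + 1

print(round(avg_response, 3), hits_per_second)
```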

FAQ

Which Applications should we Performance Test ?

Performance Testing is done for client-server based systems. This means an application that is not client-server based, and is not used by multiple concurrent users, generally does not require Performance Testing.
For example, Microsoft Calculator is neither client-server based nor used by multiple users; hence it is not a candidate for Performance Testing.

What is the difference between Performance Testing & Performance Engineering

It is important to understand the difference between Performance Testing and Performance Engineering. An overview is shared below:
Performance Testing is a discipline concerned with testing and reporting the current performance of a software application under various parameters.
Performance engineering is the process by which software is tested and tuned with the intent of realizing the required performance. This process aims to optimize the most important application performance trait, i.e., user experience.
Historically, testing and tuning have been distinctly separate and often competing realms. In the last few years, however, several pockets of testers and developers have collaborated independently to create tuning teams. Because these teams have met with significant success, the concept of coupling performance testing with performance tuning has caught on, and now we call it performance engineering.
