Monday, December 1, 2008

Chapter 2: Performance Testing – A Quick Overview

Software Performance Testing Handbook

A Comprehensive Guide for Beginners




Performance Management


Reactive Approach
Performance testing is often considered a reactive way of managing performance. In most cases, system performance is never thought about during the early phases of the Software Development Life Cycle (SDLC); performance testing is treated as the last activity, taken up only after the system testing phase.

Also, if the performance bottlenecks are rooted in the system architecture or design, fixing them at that late stage becomes nearly impossible because of the high Cost of Poor Quality (COPQ); in certain cases the system is scrapped altogether because of huge deviations from its performance constraints.

Waiting for performance problems to appear and then dealing with them at the end is rarely a good approach. Performance testing is considered reactive precisely because little importance is given to performance during the early life cycle phases; it amounts to a ‘fix-it-later’ approach, which is not very effective.

Proactive Approach
The proactive approach anticipates performance problems well in advance and adopts techniques to mitigate them. Performance is considered during every SDLC phase, right from requirements analysis, and appropriate performance engineering activities are identified for the system.

The disadvantages of the ‘fix-it-later’ approach are well understood, and engineering practices are adopted to analyze the system design from a performance angle. Because the system is evaluated for performance right from the design phase, the chances of last-minute surprises are much lower in the proactive approach.

Application Performance Management

Application Performance Management (APM) is about managing the performance of an application throughout its lifecycle to ensure availability and good performance for the end users of the system. It forms a closed loop of performance management by covering the development, testing, and production stages of the application. Many production monitoring and diagnostic tools are available in the market to identify performance problems in the production system and to provide quick diagnostics for resolving them. Production monitoring also provides valuable input for pre-production performance testing, and some tools offer the flexibility to reuse the same test assets (scripts, etc.) for both pre-production and post-production performance testing.

Myths about Performance Testing

Organizations generally like to make headlines with their success stories, not with web sites that failed under unexpected user load. Industry studies reveal that poorly performing IT applications cost industrialized nations almost $45 billion annually. Most organizations understand the importance of performance testing and the impact of bypassing it. ‘How much time would it take to performance test an application?’ is a million-dollar question for organizations where Software Performance Engineering (SPE) is not in place from the start of the project. Most organizations plan for performance testing toward the end of the project, even though addressing performance bottlenecks before the go-live date is critical to meeting end-user demand on the production systems.

The following are some of the popular myths about performance testing.

MYTH 1: Performance testing is the last activity, taken up only if time is available.
FACT: System performance should be considered from the requirements identification phase onward, and performance engineering activities need to be practiced during each phase of the SDLC.

MYTH 2: Conducting performance testing will by itself increase system performance, irrespective of whether the recommendations are implemented.
FACT: Conducting a performance test alone will not improve system performance. It helps to determine whether the system meets its performance goals and to identify the system’s performance bottlenecks.

MYTH 3: Performance testing is just code profiling or memory profiling to tune the code.
FACT: Performance testing is about evaluating the system for conformance to its performance goals and thereby identifying the performance bottlenecks in the software and hardware of the system.

MYTH 4: Performance testing needs to be done on all the functional flows of the application to identify performance issues.
FACT: Performance testing needs to be done for specific scenarios selected through Pareto analysis. Frequently used scenarios, scenarios of concern to stakeholders, highly critical scenarios, and scenarios considered error prone are the right candidates for performance testing.
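
To make the Pareto selection concrete, here is a minimal Python sketch, assuming hypothetical scenario names and transaction volumes taken from production logs; everything in it is illustrative. It picks the most-used scenarios until they cover roughly 80% of the total traffic.

    # Hypothetical transaction volumes per scenario (e.g. from production logs).
    scenario_volumes = {
        "login": 50000,
        "search_product": 32000,
        "view_product": 21000,
        "checkout": 9000,
        "update_profile": 1500,
        "generate_report": 400,
    }

    total = sum(scenario_volumes.values())
    cumulative = 0.0
    selected = []

    # Pick the most-used scenarios until they cover ~80% of all traffic.
    for name, volume in sorted(scenario_volumes.items(),
                               key=lambda kv: kv[1], reverse=True):
        selected.append(name)
        cumulative += volume / total
        if cumulative >= 0.80:
            break

    print("Scenarios to performance test:", selected)

On these numbers, login, search_product, and view_product alone account for over 90% of the traffic, so they would be the first candidates; critical or error-prone scenarios such as checkout would still be added on top of the volume-based selection.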

MYTH 5: The response time goal should be as per industry standards.
FACT: There is no industry standard for response time. The response time goal needs to be derived from the hardware and software capacity available to the system, taking the end users’ tolerance limit into account.


MYTH 6: Instead of investing in performance testing activities, it is better to procure high-capacity hardware, as hardware is cheap.
FACT: Industry studies show that hardware price-performance improves at around 40% per annum, whereas the demand for IT resources increases at around 60% per annum, so extra hardware alone cannot keep pace. The actual cause of a performance issue needs to be identified through performance testing.
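
A back-of-the-envelope calculation makes the gap visible. This Python sketch simply compounds the two growth rates quoted above over five years; the rates are the cited figures, while the five-year horizon is an arbitrary illustration.

    # Compound the quoted annual rates: hardware capacity per dollar vs demand.
    hardware_gain = 1.40   # ~40% more capacity per dollar each year
    demand_growth = 1.60   # ~60% more resource demand each year

    capacity, demand = 1.0, 1.0
    for year in range(1, 6):
        capacity *= hardware_gain
        demand *= demand_growth
        print(f"Year {year}: capacity x{capacity:.2f}, "
              f"demand x{demand:.2f}, gap x{demand / capacity:.2f}")

By year five, demand has grown to roughly 10.5 times the starting level while a fixed budget buys only about 5.4 times the capacity, a near 2x gap; throwing hardware at the problem therefore only postpones the need to find the real bottleneck.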

MYTH 7: Performance testing can be planned in parallel with functional testing.
FACT: Performance testing needs to be planned on a stable system, only after the completion of the system testing phase.

MYTH 8: An application needs performance testing only once in its lifetime, irrespective of how many of its modules are revamped over a period of time.
FACT: Performance testing is best done as a continuous process of measuring, analyzing, and tuning system performance. System performance needs to be revalidated whenever code is changed or added, or whenever the hardware changes.
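
As one way to picture this continuous revalidation, the following Python sketch compares a hypothetical baseline run against a re-run after a code change and flags a regression when the 90th-percentile response time degrades by more than 10%. The sample data and the 10% threshold are assumptions for illustration, not recommendations.

    # Hypothetical response times in seconds from two test runs.
    baseline = [1.1, 1.3, 1.2, 1.4, 1.2, 1.5, 1.3]   # before the code change
    current  = [1.2, 1.6, 1.5, 1.8, 1.4, 1.9, 1.6]   # after the code change

    def percentile_90(samples):
        """Return the (approximate) 90th-percentile value of a sample list."""
        ordered = sorted(samples)
        return ordered[int(0.9 * (len(ordered) - 1))]

    before, after = percentile_90(baseline), percentile_90(current)

    # Flag a regression if the 90th percentile degrades by more than 10%.
    if after > before * 1.10:
        print(f"Regression: 90th percentile rose from {before:.2f}s to {after:.2f}s")
    else:
        print("Performance within tolerance; no retuning needed")

The same comparison can be repeated after every code or hardware change, which is exactly the measure-analyze-tune loop described above.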

MYTH 9: Performance bottlenecks can be identified by conducting one test run.
FACT: Performance bottlenecks cannot be identified by a single round of testing; isolating bottlenecks in complex systems is very difficult. Multiple rounds of isolation tests need to be run to identify the performance bottlenecks, depending on the type of bottleneck.

MYTH 10: Performance testing can be done only if the system is properly tuned.
FACT: Performance testing can be conducted on any system, irrespective of its condition, in order to identify its bottlenecks.

MYTH 11: The performance tester is solely responsible for detecting performance problems and tuning the system.
FACT: Detecting a performance bottleneck and tuning the system to meet its performance goals is usually a joint effort. Advice from subject matter experts, DBAs, and system administrators adds great value in identifying and isolating the bottleneck.

MYTH 12: Performance testing can be carried out on the regular development / test environment.
FACT: Performance testing needs to be planned on an isolated environment, so that performance bottlenecks can be isolated easily.

Performance Process Maturity Model

Michael Maddox proposes a five-level maturity model for the performance test process. All the levels share the following common characteristics:

1. The levels are cumulative. The performance activities and processes practiced at level 2 are retained and enhanced at level 3 and so on through higher levels.
2. Different applications may exhibit different maturity levels.
3. Some level of learning and feedback is applied as work progresses. Organizations at higher levels of maturity apply more effective, more strategic feedback.
