How to Performance Test an Application – LoadRunner Training Tutorial Part 2

This is the 2nd tutorial in our Performance Testing with LoadRunner training series. In it, we walk through the exact performance test process, which will make it easier to follow the Load Testing with HP LoadRunner tutorials in the rest of this series.

Check out the first tutorial in this series here: Performance Testing Introduction.

Performance Testing Goals:

Performance testing is conducted to accomplish the following goals:

  • Verify the application's readiness to go live.
  • Verify that the desired performance criteria are met.
  • Compare the application's performance characteristics/configurations against the standard.
  • Identify performance bottlenecks.
  • Facilitate performance tuning.

Key Activities in Performance Testing:

Performance Test workflow

#1. Requirement Analysis/Gathering

The performance team interacts with the client to identify and gather requirements – both technical and business. This includes getting information on the application's architecture, the technologies and database used, intended users, functionality, application usage, test requirements, hardware & software requirements, etc.

#2. POC/Tool selection

Once the key functionality is identified, a POC (proof of concept – a demonstration of the real-time activity, but in a limited sense) is done with the available tools. Tool selection depends on the cost of the tool, the protocol the application uses, the technologies used to build the application, the number of users to be simulated in the test, etc. During the POC, scripts are created for the identified key functionality and executed with 10-15 virtual users.

#3. Performance Test Plan & Design

Based on the information collected in the preceding stages, test planning and design are carried out.

Test planning covers how the performance test will take place – the test environment for the application, workload, hardware, etc.

Test design is mainly about the type of tests to be conducted, the metrics to be measured, metadata, scripts, the number of users, and the execution plan.

During this activity, a Performance Test Plan is created. It serves as an agreement before moving ahead and as a roadmap for the entire activity. Once created, this document is shared with the client to establish transparency on the type of application, test objectives, prerequisites, deliverables, entry and exit criteria, acceptance criteria, etc.

Briefly, a performance test plan includes:

a) Introduction (Objective and Scope)
b) Application Overview
c) Performance (Objectives & Goals)
d) Test Approach (User Distribution, Test data requirements, Workload criteria, Entry & Exit criteria, Deliverables, etc.)
e) In-Scope and Out-of-Scope
f) Test Environment (Configuration, Tool, Hardware, Server Monitoring, Database, test configuration, etc.)
g) Reporting & Communication
h) Test Metrics
i) Role & Responsibilities
j) Risk & Mitigation
k) Configuration Management

#4. Performance Test Development

  • Use cases are created for the functionality identified in the test plan as the scope of PT.
  • These use cases are shared with the client for approval, to make sure the scripts will be recorded with the correct steps.
  • Once approved, script development starts: the steps in the use cases are recorded with the performance test tool selected during the POC (Proof of Concept), and the scripts are enhanced through correlation (to handle dynamic values), parameterization (value substitution), and custom functions as the situation requires. More on these techniques in our video tutorials.
  • The scripts are then validated with different users.
  • In parallel with script creation, the performance team also works on setting up the test environment (software and hardware).
  • The performance team will also take care of metadata (back-end) through scripts if this activity is not handled by the client.
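
The correlation and parameterization techniques mentioned above can be sketched outside LoadRunner. The Python sketch below is purely conceptual (it is not VuGen code, and all names and values in it are hypothetical): correlation captures a dynamic value from a server response so later requests can reuse it, and parameterization substitutes different data values for each virtual user.

```python
import re

# --- Correlation: capture a dynamic value from a server response ---
# In LoadRunner this is typically done with web_reg_save_param using left/right
# boundaries; here a regex over a made-up login response grabs the session id.
login_response = '<input name="sessionId" value="AB12-XY99">'

def correlate(response, left='value="', right='"'):
    """Extract the text between the given boundaries (first match)."""
    match = re.search(re.escape(left) + "(.*?)" + re.escape(right), response)
    return match.group(1) if match else None

session_id = correlate(login_response)

# --- Parameterization: substitute a different value per virtual user ---
# In LoadRunner the values would come from a parameter (.dat) file; a simple
# list stands in for it here.
usernames = ["vuser01", "vuser02", "vuser03"]

def build_request(vuser_index):
    user = usernames[vuser_index % len(usernames)]  # cycle through the data
    return f"GET /account?user={user}&session={session_id}"

print(build_request(0))  # each virtual user sends its own data
```

The same idea scales up in a real script: every request that echoes a server-generated value (session ids, view states, tokens) must be correlated, or replay fails once the recorded value expires.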

#5. Performance Test Modeling

A Performance Load Model is created for the test execution. The main aim of this step is to validate whether the performance metrics (provided by the client) are achieved during the test. There are different approaches to creating a load model; Little's Law is used in most cases.
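
Little's Law relates concurrency, throughput, and the time each user spends per iteration: N = X × (R + Z), where N is the number of concurrent users, X the throughput (transactions per second), R the response time, and Z the user think time. A minimal sketch of how a load model might apply it (the numbers are made up for illustration):

```python
def users_needed(throughput_tps, response_time_s, think_time_s):
    """Little's Law: N = X * (R + Z).

    throughput_tps  -- target transactions per second (X)
    response_time_s -- average response time in seconds (R)
    think_time_s    -- average user think time in seconds (Z)
    """
    return throughput_tps * (response_time_s + think_time_s)

# Hypothetical target: 10 transactions/s, 2 s response time, 8 s think time.
n = users_needed(10, 2, 8)
print(f"Virtual users required: {n:.0f}")  # 10 * (2 + 8) = 100
```

In practice the target throughput and think times come from the requirement-gathering stage, and the result tells you how many virtual users the scenario needs to generate the agreed load.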

#6. Test Execution

The scenario is designed according to the Load Model in the Controller or Performance Center, but the initial tests are not executed with the maximum number of users from the Load Model.

Test execution is done incrementally. For example, if the maximum number of users is 100, the scenarios are first run with 10, 25, and 50 users, eventually moving up to 100 users.
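
The incremental ramp-up described above can be expressed as a simple schedule. In the Controller this is configured through the scenario scheduler; the sketch below just generates the user counts for each run, with step fractions chosen to match the example (they are illustrative, not a LoadRunner default):

```python
def ramp_schedule(max_users, fractions=(0.10, 0.25, 0.50, 1.00)):
    """User counts for each incremental test run, as fractions of peak load."""
    return [round(max_users * f) for f in fractions]

print(ramp_schedule(100))  # runs at 10, 25, 50, then 100 users
```

Running the smaller steps first makes it much easier to tell at which load level response times or error rates start to degrade.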

#7. Test Results Analysis

Test results are the most important deliverable for the performance tester. This is where we can prove the ROI (Return on Investment) and productivity that a performance testing effort can provide.

LoadRunner Test Results Analysis

Some best practices that help the result analysis process:

a) Give a unique and meaningful name to every test result – this helps in understanding the purpose of the test.
b) Include the following information in the test result summary:

  • Reason for the failure(s)
  • Change in the application's performance compared to the previous test run
  • Changes made in the test, the application build, or the test environment

c) It is good practice to write a result summary after each test run, so that the analysis does not have to be compiled again every time the test results are referred to. PT generally requires many test runs to reach the correct conclusion.
d) It is good to have the following points in the result summary:

  • Purpose of the test
  • Number of virtual users
  • Scenario summary
  • Duration of the test
  • Throughput
  • Graphs
  • Graph comparisons
  • Response time
  • Errors that occurred
  • Recommendations

There might be recommendations for configuration changes before the next test. Server logs also help in identifying the root cause of problems (such as bottlenecks); deep-diagnostic tools are used for this purpose.
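
Two of the metrics listed above – throughput and response time – are straightforward to derive from raw transaction timings. LoadRunner Analysis computes these for you; the sketch below only shows the underlying arithmetic, using made-up data:

```python
def percentile(sorted_values, pct):
    """Nearest-rank percentile of an already-sorted list of values."""
    k = max(0, int(round(pct / 100 * len(sorted_values))) - 1)
    return sorted_values[k]

# Hypothetical transaction response times (seconds) from a 60-second run.
timings = [0.8, 0.9, 1.1, 1.2, 1.3, 1.5, 1.8, 2.0, 2.4, 3.9]
duration_s = 60

throughput = len(timings) / duration_s      # transactions per second
avg_response = sum(timings) / len(timings)  # average response time
p90 = percentile(sorted(timings), 90)       # 90th percentile response time

print(f"Throughput: {throughput:.2f} tps")
print(f"Avg response: {avg_response:.2f} s, 90th percentile: {p90:.1f} s")
```

Note how a single slow outlier (3.9 s here) barely moves the average but shows up clearly in the 90th percentile – which is why percentiles, not just averages, belong in the result summary.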

In the final report, all the test summaries are consolidated.

#8. Report

Test results should be simplified so that the conclusion is clear and needs no further interpretation. The development team, however, needs more information on the analysis, comparison of results, and details of how the results were obtained.

A test report is considered good if it is brief, descriptive, and to the point.

The following guidelines will smooth this step out:

  • Use appropriate headings and summaries.
  • The report should be presentable so that it can be used in management meetings.
  • Provide supporting data to back up the results.
  • Give meaningful names to table headers.
  • Share the status report periodically, including with the client.
  • Report issues with as much information and evidence as possible, to avoid unnecessary correspondence.

The final report to be shared with the client has the following information:

  • Executive Summary
  • System Under Test
  • Testing Strategy
  • Test Summary
  • Results Summary
  • Problems Identified
  • Recommendations

Along with the final report, all the deliverables listed in the test plan should be shared with the client.


We hope this article has given you process-oriented, conceptual, and detailed information about how performance testing is carried out from beginning to end.

In the next performance testing tutorial, we will provide a list of all the LoadRunner video tutorials with practical load testing examples. It's important, so don't miss it.

See Also => HP LoadRunner in-depth Video Tutorials

Please post your comments and questions below.

24 thoughts on “How to Performance Test an Application – LoadRunner Training Tutorial Part 2”

  1. I just started working as a performance test analyst and found this article useful. Where I work we use Silk Performer and DynaTrace to conduct our performance tests. I am still learning these tools and how to analyse the results they generate. I also still need to understand the load model and benchmarking.

    I believe the LoadRunner tutorials will be just as useful.

    Thanks for this great work.

  2. Hi

    Can you please explain how to do performance monitoring?
    Do we need to use any extra tool for that, or is the Controller enough?


  3. @Chinna krishna redyy – There are many tools available for performance monitoring. Another option, when a monitoring tool is not available, is to capture perfmon (i.e. Performance Monitor) data from the servers for the duration of the test run and import it through Analysis. LoadRunner is capable of generating graphs from perfmon data. Anybody who can access the servers can easily provide the perfmon data.

  4. Thanks for the article.. Very useful…

    Can you suggest any freeware tool for testing web services and XML?
    For functional testing we are using SoapUI.

  5. Hi,
    Really interesting Article which is very clear and real time.
    @chetan: could you please share some ideas about performance testing in an agile methodology?

  6. Hi Chetan sir, I am a manual tester and now I want to learn performance testing. Can you please tell me about a career in PT – should I go for it or not?

  7. Hello, I am a beginner in LoadRunner. Could anyone let me know the best tutorial, how to use LoadRunner, and which version we have to install?

  8. Hi,

    Let us say we have noticed from the execution reports that the given application crashes at 800 VUs. What hardware or software tuning do we need to do to increase the capacity to 2000 VUs? First of all, who has to handle this performance engineering or tuning role at the server or database level?
