How to Performance Test an Application – LoadRunner Training Tutorial Part 2

This is the 2nd tutorial in our Performance Testing with LoadRunner training series. In it, we walk through the exact performance test process step by step, so that the upcoming Load Testing with HP LoadRunner tutorials will be easier to follow.

Check out the first tutorial in this series here: Performance Testing Introduction.

Performance Testing Goals:

Performance testing is conducted to accomplish the following goals:

  • Verify the application’s readiness to go live.
  • Verify whether the desired performance criteria are met.
  • Compare the application’s performance characteristics or configurations against accepted standards.
  • Identify performance bottlenecks.
  • Facilitate performance tuning.

Key Activities in Performance Testing:

Performance Test workflow

#1. Requirement Analysis/Gathering

The performance team interacts with the client to identify and gather requirements, both technical and business. This includes gathering information on the application’s architecture, the technologies and database used, the intended users, functionality, application usage, test requirements, hardware and software requirements, etc.

#2. POC/Tool selection

Once the key functionalities are identified, a POC (proof of concept: a demonstration of the real activity, but in a limited sense) is done with the available tools. The choice of performance test tool depends on the cost of the tool, the protocol the application uses, the technologies used to build the application, the number of users to be simulated for the test, etc. During the POC, scripts are created for the identified key functionalities and executed with 10–15 virtual users.

#3. Performance Test Plan & Design

Based on the information collected in the preceding stages, test planning and design are carried out.

Test planning covers how the performance test will take place: the test environment for the application, the workload, hardware, etc.

Test design mainly covers the types of tests to be conducted, the metrics to be measured, metadata, scripts, the number of users, and the execution plan.

During this activity, a Performance Test Plan is created. It serves as an agreement before moving ahead and as a road map for the entire activity. Once created, this document is shared with the client to establish transparency on the type of application, test objectives, prerequisites, deliverables, entry and exit criteria, acceptance criteria, etc.

Briefly, a performance test plan includes:

a) Introduction (Objective and Scope)
b) Application Overview
c) Performance (Objectives & Goals)
d) Test Approach (User Distribution, Test data requirements, Workload criteria, Entry & Exit criteria, Deliverables, etc.)
e) In-Scope and Out-of-Scope
f) Test Environment (Configuration, Tool, Hardware, Server Monitoring, Database, test configuration, etc.)
g) Reporting & Communication
h) Test Metrics
i) Roles & Responsibilities
j) Risks & Mitigation
k) Configuration Management

#4. Performance Test Development

  • Use cases are created for the functionality identified in the test plan as the scope of PT.
  • These use cases are shared with the client for approval, to make sure the scripts will be recorded with the correct steps.
  • Once approved, script development starts: the steps in the use cases are recorded with the performance test tool selected during the POC (proof of concept), and the scripts are then enhanced by performing correlation (for handling dynamic values), parameterization (value substitution), and custom functions as the situation requires. More on these techniques in our video tutorials.
  • The scripts are then validated with different users.
  • In parallel with script creation, the performance team also works on setting up the test environment (software and hardware).
  • The performance team will also take care of metadata (back-end) through scripts, if this activity is not taken up by the client.
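To make the two main enhancement techniques concrete, here is a minimal language-neutral sketch in Python. This is not LoadRunner code (in LoadRunner, correlation is typically done with `web_reg_save_param` and parameterization with the parameter list / data file feature); the boundary strings, session value, and login request below are all invented for illustration.

```python
import re

def correlate(response_body, left="sessionId=", right="&"):
    """Correlation: capture a dynamic value from a server response.
    Returns the text between the left boundary and the right delimiter,
    or None if the boundary is not found."""
    m = re.search(re.escape(left) + "([^" + re.escape(right) + "]+)", response_body)
    return m.group(1) if m else None

def parameterize(request_template, row):
    """Parameterization: replace {placeholders} recorded as literals
    with per-virtual-user values from a data table row."""
    return request_template.format(**row)

# Invented response: the server issued a dynamic session id
response = "HTTP/1.1 200 OK ... sessionId=A93X7&lang=en"
session = correlate(response)  # captured dynamic value: "A93X7"

# Invented recorded request, with literals replaced by placeholders
template = "POST /login user={username}&pass={password}&sid={sid}"
for row in [{"username": "vuser01", "password": "p1"},
            {"username": "vuser02", "password": "p2"}]:
    print(parameterize(template, {**row, "sid": session}))
```

Each virtual user thus sends its own credentials (parameterization) while reusing the session id the server actually issued (correlation), instead of replaying the hard-coded values captured at recording time.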

#5. Performance Test Modeling

A performance load model is created for the test execution. The main aim of this step is to validate whether the performance metrics (provided by the client) are achieved during the test. There are different approaches to creating a load model; “Little’s Law” is used in most cases.
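As a quick illustration of how Little’s Law drives a load model, here is a minimal sketch (the numbers are invented for the example): the required number of concurrent virtual users N equals the target throughput X multiplied by the time each user spends per iteration, i.e. response time R plus think time Z.

```python
def users_needed(throughput_per_sec, response_time_s, think_time_s=0.0):
    """Little's Law for load modeling: N = X * (R + Z),
    where X is target throughput, R is response time, Z is think time."""
    return throughput_per_sec * (response_time_s + think_time_s)

# Example: the client wants 10 transactions/sec; each transaction
# responds in about 2 s and users pause about 8 s between actions.
print(users_needed(10, 2, 8))  # 100.0 virtual users
```

Working backwards the same way, if only 50 virtual users are available, the model tells you the throughput they can realistically generate.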

#6. Test Execution

The scenario is designed according to the load model in the Controller or Performance Center, but the initial tests are not executed with the maximum number of users from the load model.

Test execution is done incrementally. For example, if the maximum number of users is 100, the scenario is first run with 10, 25, and 50 users, and so on, eventually moving on to 100 users.
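The incremental approach above can be sketched as a simple schedule computation. This sketch uses evenly spaced steps for illustration; in practice the levels (such as the 10/25/50 in the example) come from the test plan.

```python
def ramp_schedule(max_users, steps):
    """Evenly spaced user levels ending at max_users, one per test run."""
    return [round(max_users * (i + 1) / steps) for i in range(steps)]

for users in ramp_schedule(100, 4):
    print(f"Run scenario with {users} virtual users")
```

Running each level as a separate test (rather than jumping straight to the maximum) makes it much easier to see at which load a bottleneck first appears.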


#7. Test Results Analysis

Test results are the most important deliverable for the performance tester. This is where we can prove the ROI (Return on Investment) and productivity that a performance testing effort can provide.

LoadRunner Test Results Analysis

Some of the best practices that help the result analysis process:

a) Give every test result a unique and meaningful name – this helps in understanding the purpose of the test.
b) Include the following information in the test result summary:

  • Reason for any failures
  • Change in the application’s performance compared to the previous test run
  • Changes made to the test in terms of the application build or test environment

c) It is good practice to write a result summary after each test run, so that the analysis does not have to be compiled afresh every time the test results are referred to. PT generally requires many test runs to reach the correct conclusion.
d) It is good to have the following points in the result summary:

  • Purpose of the test
  • Number of virtual users
  • Scenario summary
  • Duration of the test
  • Throughput
  • Graphs
  • Graph comparisons
  • Response time
  • Errors occurred
  • Recommendations

There might be recommendations, such as configuration changes, for the next test. Server logs also help in identifying the root cause of a problem (such as a bottleneck); deep-diagnostics tools are used for this purpose.
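Two of the numbers that appear in almost every result summary – percentile response time and throughput – can be computed as follows. This is a minimal sketch with invented sample timings; LoadRunner Analysis produces these figures for you.

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(samples)
    rank = math.ceil(pct * len(ordered) / 100)  # 1-based rank
    return ordered[rank - 1]

def throughput(transactions, duration_s):
    """Completed transactions per second over the test duration."""
    return transactions / duration_s

# Invented sample: response times (s) from a short test run
times = [0.8, 1.2, 0.9, 3.5, 1.1, 0.7, 1.0, 2.0, 1.3, 0.95]
print("90th percentile response time:", percentile(times, 90))  # 2.0 s
print("Throughput:", throughput(1200, 600), "tx/s")             # 2.0 tx/s
```

The 90th percentile is usually a better headline number than the average, because a few very slow transactions (like the 3.5 s outlier above) can hide behind a healthy-looking mean.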

In the final report, all the test summaries are consolidated.

#8. Report

Test results should be simplified so that the conclusion is clear and needs no further derivation. The development team, however, needs more detail: the analysis, comparisons between results, and how the results were obtained.

A test report is considered good if it is brief, descriptive, and to the point.

The following guidelines will smooth this step out:

  • Use appropriate headings and summaries.
  • The report should be presentable so that it can be used in management meetings.
  • Provide supporting data for the results.
  • Give meaningful names to table headers.
  • Share the status report periodically, including with the client.
  • Report issues with as much information and evidence as possible, to avoid unnecessary back-and-forth.

The final report to be shared with the client has the following information:

  • Executive Summary
  • System Under Test
  • Testing Strategy
  • Test Summary
  • Results Summary
  • Problems Identified
  • Recommendations

Along with the final report, all the deliverables specified in the test plan should be shared with the client.

Conclusion

We hope this article has given you process-oriented, conceptual, and detailed information on how performance testing is carried out from beginning to end.

In the next performance testing tutorial, we will provide the list of all LoadRunner video tutorials with practical load testing examples. It’s an important one. Don’t miss it.

See Also => HP LoadRunner in-depth Video Tutorials

Please post your comments and questions below.




13 comments

#1 Suresh on 01.09.14 at 8:00 am

This is perfect process explanation. thank you so much for sharing. Looking for the loadrunner tutorials.

#2 udara on 01.09.14 at 9:14 am

Thank u for sharing it…

#3 Femi on 01.09.14 at 9:26 am

I just started working as a performance test analyst and found this article useful. where I work we use Silk Performer and DynaTrace to conduct our performance test. I am still learning this tools and how to analyse the result they generate. I also still need to understand the load model and benchmarking.

I believe the loadrunner tutorial will be as useful.

Thanks for this great work.

#4 Mamatha on 01.09.14 at 9:27 am

Hi

Nice article, if possible please provide about VSTS load testing also.

Thanks
Mamatha

#5 chinna krishna redyy on 01.09.14 at 10:56 am

Hi

Can you please explain the how to do performance monitoring
Do we need to use any extra tool for that or else controller is enough?

Thanks,
Krishna

#6 Chetan Kaushal on 01.09.14 at 3:30 pm

@all Thanks All for your support and appreciation…

#7 Chetan Kaushal on 01.09.14 at 3:37 pm

@Chinna krishna redyy – There are many tools available for performance monitoring. Another option, when a monitoring tool is not available, is to capture perfmon (i.e. Performance Monitor) data from the servers for the duration of the test run and import it into Analysis. LoadRunner is capable of generating graphs from the perfmon data. Anybody who can access the servers can easily provide the perfmon data.

#8 Bamidele on 01.09.14 at 4:04 pm

This article on performance testing is phenomenal. I found it very interesting. Thanks.

#9 Piyush Jain on 01.13.14 at 1:18 pm

Thanks for the article.. Very useful…

Can you suggest any free ware tool for testing of the webservice and xml’s.
For functional Testing we are using SOAP ui.

#10 Chetan Kaushal on 01.13.14 at 4:36 pm

@Piyush Jain-You can try Jmeter.

#11 Piyush Jain on 01.15.14 at 6:25 am

Thanks Chetan for your help. Will check with the Jmeter.

#12 Prince on 01.16.14 at 12:34 pm

Nice article. If possible, could you please provide report sample?

#13 sanjeev on 03.13.14 at 7:32 am

Good, its really helpful and clear writing to the point.
