Lite Test Plan

As testing becomes more agile, the large test plan has often become a distant relic. We still need to capture non-functional requirements; we just focus our attention on the critical detail.

So how can we condense the large test plan into something small and concise?

NB - Much of this detail can be captured in your test management software, but it still makes sense to present your test plan so that the audience views and understands it.

One Page Performance Test Plan

Consider creating a table (3 x 2) and capturing the key details. This could be on a Wiki, in a spreadsheet or directly in an email. Whatever the delivery method, ensure that all of the key stakeholders gain visibility.

In this example we have managed to condense the following onto one page:

  • Introduction - provide a project overview to ensure the audience understand the context.

  • In Scope / Out of Scope - clarify what will and won’t be tested.

  • Risks / Assumptions - call out any risks or assumptions.

  • Environment / Tools - state the target environment and the test tools to be used.

  • Resources / Time scales - name the person responsible for the testing and time scales.

  • Test Activities - state the testing activities including scripts, test types, test data, volumes, monitoring & reporting.

  • Acceptance Criteria - state the measurements for test success/failure.

You could always split out further detail to separate files and link to them either directly on the page or as part of the delivery method.

Several Page Performance Test Plan

If there is more detail than you can physically fit on to one page, condense the information down to a small number of pages.

Look to create a slide deck; the space restriction per slide should help keep the content focused and concise.

Again, where extra detail is required split this out to separate documents and provide links.

NB - Try not to shrink the font to a tiny size just so you can copy and paste in the original plan.

Static HTML page / Wiki

Alternatively, why not create a simple HTML table? This could be on a Wiki or a self-contained HTML page that can be distributed.
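For example, a minimal sketch of the one-page plan as a self-contained HTML table. The section names mirror the bullet list earlier; all cell content is placeholder:

```html
<!-- Minimal one-page performance test plan; cell text is placeholder only -->
<table border="1">
  <tr><th>Introduction</th><td>Project overview and context.</td></tr>
  <tr><th>In Scope / Out of Scope</th><td>What will and won't be tested.</td></tr>
  <tr><th>Risks / Assumptions</th><td>Known risks and assumptions.</td></tr>
  <tr><th>Environment / Tools</th><td>Target environment; test tools (e.g. JMeter).</td></tr>
  <tr><th>Resources / Time scales</th><td>Who is testing; when.</td></tr>
  <tr><th>Test Activities</th><td>Scripts, test types, test data, volumes, monitoring, reporting.</td></tr>
  <tr><th>Acceptance Criteria</th><td>Measurements for test success/failure.</td></tr>
</table>
```

The template below expands each of these cells into a fuller section.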

Purpose
The purpose of this section is to provide a high-level overview of the performance testing approach to be followed for the <PROJECT> project. This must be presented to all the relevant stakeholders and should be discussed in order to gain consensus.

Introduction
As part of the delivery of <PROJECT>, the solution is required to meet the acceptance criteria in both functional and non-functional areas. The purpose of this document is to provide an outline for non-functional testing of the <PROJECT> solution.

This document covers the following:
Entry and Exit Criteria
Environmental Requirements
Volume and Performance Testing Approach
Performance Testing Activities
Entry Criteria
The following work items should be completed/agreed beforehand in order to proceed with the actual performance testing activities:

Non-functional test requirements document provided by <CLIENT>, with quantified NFRs where possible
The critical use-cases should be functionally tested and without any critical bugs outstanding
Design Architectural Diagrams approved and available
Key use-cases have been defined and scoped
Performance test types agreed
Load injectors setup
Any data setup needed, e.g. an appropriate number of users created in <DATASTORE>
Exit Criteria
The performance testing activity will be completed when:

The NFR targets have been met and performance test results have been presented to the team and approved.
Environments
The performance tests will be run against a stable version of the <PROJECT> solution (one that has already passed the functional tests), on a dedicated production-like environment (pre-prod?) assigned to performance testing, with no deployments to that environment during the course of the performance testing.

Load Injectors
There will be one or more dedicated “load injectors” set up to initiate the required load for performance testing. A load injector could be a single VM or multiple VMs, each running an instance of JMeter to initiate the requests.

Test Tools
Test tools used for Volume and Performance testing will be:

JMeter
An open-source load testing tool. Predominantly used for volume and performance testing
Volumes
The <PROJECT> solution should be performant enough to manage the following load criteria.

N.B. The numbers in the following table are samples only - real values should be inserted once finalized in the <CLIENT> NFR document.

Target Service Volumes
Hourly targets are taken from the current solution for [Y2020]; the other ‘example’ values from the plan template have been cleared. Since the hourly peak values are not high, they will be used as the target for the fixed-load test. The scaling factor is TBD right now.

Number of Users
Performance testing will run with a maximum of 1000 users. The users will be created in <DATASTORE> beforehand and be accessible via the <PROJECT> Login API. Each request will log in with a different userID.
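A common way to drive this in JMeter is a CSV Data Set Config, with each virtual user reading a different row. A minimal sketch for generating such a file — the column names and credential format are assumptions; the real fields depend on the <PROJECT> Login API:

```python
import csv

def write_user_csv(path, count):
    """Generate a users file for a JMeter CSV Data Set Config.

    Each virtual user reads one row, so every request logs in with a
    different userID. Column names and values here are illustrative.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["userID", "password"])
        for i in range(1, count + 1):
            writer.writerow([f"perfuser{i:04d}", "Passw0rd!"])

# Matches the 1000-user maximum stated above
write_user_csv("users.csv", 1000)
```

In the JMeter test plan the CSV Data Set Config would then expose `userID` and `password` as variables for the login request.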

Assertions
The JMeter tool will be used to execute the performance testing scripts. Within the scripts, assertions will be stated to check for the above metrics, as well as some basic functional checks to ensure a correct response is received for each request.
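In plain code, the kind of checks those JMeter assertions express looks roughly like this — a sketch of the logic, not JMeter itself; the threshold, status code and expected body text are illustrative values, not the real NFRs:

```python
def assert_response_ok(status_code, elapsed_ms, body,
                       max_ms=2000, expected_text='"status":"OK"'):
    """Rough equivalent of a JMeter Response + Duration assertion pair.

    Checks the request succeeded, answered within the (example) NFR
    target, and returned a functionally correct body.
    """
    assert status_code == 200, f"unexpected status {status_code}"
    assert elapsed_ms <= max_ms, f"too slow: {elapsed_ms}ms > {max_ms}ms"
    assert expected_text in body, "response body missing expected content"
    return True

# Example: a fast, correct response passes all three checks
assert_response_ok(200, 850, '{"status":"OK"}')
```

In JMeter these would typically be a Response Assertion (status and body) plus a Duration Assertion attached to each sampler.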

Load Profiles
Baselining
Load Testing
Stress Testing
Spike Testing
Soak Testing
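The profiles above differ mainly in how the concurrent-user count varies over time. A rough sketch of each shape — the durations, peak values and multipliers are illustrative, not taken from the NFRs:

```python
def profile(kind, peak=1000, steps=10):
    """Return a list of concurrent-user counts over equal time steps.

    Illustrative shapes only:
      baseline - a single user, to establish best-case response times
      load     - ramp up to the expected peak, then hold
      stress   - keep climbing past the peak to find the break point
      spike    - sudden jump to peak, then back to a low level
      soak     - hold a steady moderate load for a long period
    """
    if kind == "baseline":
        return [1] * steps
    if kind == "load":
        half = steps // 2
        return [peak * (i + 1) // half for i in range(half)] + [peak] * (steps - half)
    if kind == "stress":
        return [peak * (i + 1) * 2 // steps for i in range(steps)]  # ends at 2x peak
    if kind == "spike":
        low = peak // 10
        return ([low, peak, peak, low, low] + [low] * steps)[:steps]
    if kind == "soak":
        return [peak // 2] * steps
    raise ValueError(f"unknown profile: {kind}")
```

In JMeter these shapes map onto thread group settings (ramp-up period, hold time, loop duration) rather than explicit lists, but sketching them makes the intent of each test type easy to agree with stakeholders.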
Testing Activities
Performance Test Environment Build
Use-Case Scripting
Test Scenario Build
Test Execution and Analysis
Post-Test Analysis and Reporting
Risks / Assumptions
Call out any risks or assumptions.

Acceptance Criteria
State the measurements for test success/failure.