Non-Functional Testing
“Quality is not an act, it is a habit.” — Aristotle
Testing and continuous monitoring of NFRs are key to assessing production readiness and tracking trends in software qualities such as stability, performance, and security in a given context. Like functional testing, NFR testing should be part of the development phase and planned from the start. Testing internal software quality also plays an important role in achieving external quality and quality in use.
NFR testing and monitoring should run across all phases of development. During the development phase, architecture and code should be evaluated for "design for testability," "duplicate code," "code complexity," "cyclic dependencies," and so on; NFR testing then continues as part of integration testing and system testing.
Like functional requirements, quality requirements must be quantified appropriately and practically to ensure stakeholders' needs are clearly understood and agreed upon by everyone. To achieve quality goals effectively, NFRs need to be well defined, approved, and enforced.
ISO 25020 establishes guidelines and an approach for determining quality measures. Commonly, NFR scenarios are used to capture the stimulus, environment, and response. Appendix B contains an example refinement scenario.
Another interesting way to define NFRs comes from the Scaled Agile Framework. In step 1, define the NFR's quality name, scale, and method of measurement; in step 2, quantify the NFR's measurable values, including the baseline (current measured value), the target (value to achieve), and the constraint (value beyond which the NFR becomes unacceptable). A simple template can be used to capture NFRs in this form.
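The two-step SAFe approach can be sketched as a small data structure. A minimal sketch in Python; the class name, fields, and example values are illustrative assumptions, but the fields map directly to the quality name, scale, method, baseline, target, and constraint described above:

```python
from dataclasses import dataclass

@dataclass
class NfrSpec:
    """One quantified NFR, following the two-step SAFe-style approach."""
    name: str          # quality attribute, e.g. "Checkout API latency"
    scale: str         # unit of measure, e.g. "ms, p95"
    method: str        # how it is measured, e.g. "load test, 100 concurrent users"
    baseline: float    # current measured value
    target: float      # value to achieve
    constraint: float  # value beyond which the NFR is unacceptable

    def is_acceptable(self, measured: float) -> bool:
        # For a "lower is better" metric such as response time
        return measured <= self.constraint

# Hypothetical example values for a latency NFR
latency = NfrSpec(
    name="Checkout API latency",
    scale="ms, p95",
    method="load test, 100 concurrent users",
    baseline=480.0,
    target=300.0,
    constraint=600.0,
)
print(latency.is_acceptable(520.0))  # True: within the constraint, though above target
```

Quantifying all three values up front makes it unambiguous whether a measured result is acceptable, an improvement, or a violation.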
In the Agile landscape, these NFR requirements may be part of user story acceptance criteria, with independent user/technical stories to test the NFRs as part of "system quality tests." Overall, NFR requirements must be bounded, independent, negotiable, and testable.
NFR Test Execution Requirements
- Targeted product’s NFR Specification
- Targeted product’s Stable release
- Test environment to perform NFR tests
It is essential to perform NFR testing in a stable and fixed environment to ensure repeatability and reproducibility. Setting up the proper environment is very important, as a flawed environment may invalidate the testing or create doubt about the test results. Consider the points below when setting up the NFR testing environment.
- Production-like environment setup, which includes configuration, data, resources, and monitoring
- The test environment specification should not change between test iterations (e.g., changes in hardware specification or dataset). Such changes make it difficult to compare test results and identify trends.
- Any environment-specific deviation should be documented and taken into account while executing the tests.
- Only a functionally stable version of the software should be used for NFR testing
- External interfaces are often unavailable for NFR testing. In such cases, a mock/simulator system should be used and noted in the testing report.
- Inform third parties when NFR testing is done against an actual third-party interface.
- Document the test execution environment setup (configuration, dataset, scenarios) to preserve repeatability.
Depending on the specific context, the following implicit or explicit constraints may apply to NFR testing:
- Specific stakeholder needs
- Resource availability
- Schedule constraints
- Cost limitations
- Environment
- Tools and methodology
- Reporting
- Knowledge database for NFR testing
The team must evaluate these constraints during the NFR test planning and, if applicable, an action plan should be prepared to handle them.
NFR test cases must be derived from real scenarios and well defined using proper acceptance criteria. A test case should always cover a use case reference, test scope, test metrics, and dependencies. Depending on its type, a test case should also include specifics; for example, a performance test should state the desired response time under a given concurrent load for given resources (CPU, memory, etc.). Each test case should also provide traceability back to the NFR requirement. The team should also identify which test cases can be automated to reduce recurring work.
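Such a test case definition can be sketched as follows. The field names, IDs, and sample measurements are hypothetical, but they cover the elements listed above: use case reference, scope, metric, acceptance criteria, and traceability to the NFR:

```python
# Hypothetical performance test case; the structure mirrors the fields described
# above: use case reference, scope, metric, acceptance threshold, NFR traceability.
test_case = {
    "id": "PERF-TC-01",
    "nfr_ref": "NFR-PERF-003",          # traceability to the NFR requirement
    "use_case": "UC-12 Checkout",
    "scope": "order-service /checkout endpoint",
    "metric": "p95 response time (ms)",
    "concurrent_users": 100,
    "acceptance_ms": 300,
}

def run_performance_check(samples_ms, case):
    """Evaluate measured response times against the test case's acceptance criteria."""
    samples = sorted(samples_ms)
    p95 = samples[int(0.95 * (len(samples) - 1))]
    return {"case": case["id"], "p95_ms": p95, "passed": p95 <= case["acceptance_ms"]}

# Illustrative measured response times (ms) from one test run
result = run_performance_check(
    [120, 140, 180, 210, 250, 260, 280, 290, 310, 295], test_case
)
# result: p95 = 295 ms, which is within the 300 ms acceptance threshold
```

Because the acceptance threshold lives in the test case itself, pass/fail is decided mechanically rather than by judgment after the run.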
Categorize NFR test cases into full and lightweight test suites. Teams can run the lightweight suite frequently to get early feedback, while the full suite can run against a release or significant delivery. Stakeholders should decide the frequency of these tests, weighing the resources and time needed to run them. Ideally, these tests should be part of a continuous integration pipeline to provide early feedback.
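One lightweight way to implement this split is to tag each test and filter by tag at run time. A minimal sketch, assuming a homegrown tag registry rather than any particular test framework (frameworks such as pytest offer markers for the same purpose):

```python
# Tag NFR tests so a "lightweight" subset can run frequently (e.g. per commit)
# while the "full" suite runs per release. Tag names and the registry are
# illustrative, not a specific framework's API.

SUITE = []

def nfr_test(*tags):
    """Decorator that registers a test function under one or more suite tags."""
    def register(fn):
        SUITE.append((fn, set(tags)))
        return fn
    return register

@nfr_test("lightweight")
def smoke_latency_check():
    return "latency ok"

@nfr_test("full")
def soak_test_24h():
    return "soak ok"

def run_suite(tag):
    """Run only the tests carrying the given tag."""
    return [fn() for fn, tags in SUITE if tag in tags]

print(run_suite("lightweight"))  # runs only the fast checks
```

The CI pipeline can then call `run_suite("lightweight")` on every build and reserve the full suite for release candidates.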
NFR test planning
NFR test planning should consider prioritized NFRs, the development cycle, and resources such as personnel, measurement automation, and software and hardware environments. To reduce the risk of errors and the planned effort, the test plan should consider at least the following:
- Budget
- Priority and strictness of quality attributes
- Schedule and resources involved
- Application of the measurement results
- The relevance and importance based on the quality requirements and the context of use
In the agile world, an incremental story-by-story path is suggested, where user stories carry quality requirements in their acceptance criteria, complemented by focused user stories for specific quality requirements. Keeping quality requirements in the acceptance criteria of the associated functional user stories helps the team focus on them from the start. In Sprints where an NFR is the focus, or where the majority of user stories affect that NFR, the Sprint goal should also include the quality goals.
NFR test planning should allocate resources for automating NFR tests to ensure that repetitive testing steps are automated. Automated tests let the team execute NFR tests whenever required. The automation script should include phases for preparation, execution, cleanup, and post-test reporting. Automation also reduces the manual effort and the mistakes made when executing NFR tests manually.
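The four phases named above can be sketched as a simple driver. The phase functions are placeholders for real tooling (deployment, load generation, monitoring), and all names and values are assumptions:

```python
# Sketch of the four automation phases: preparation, execution, cleanup, reporting.
# Each step function is a stand-in for real tooling in an actual pipeline.

def prepare():
    # e.g. deploy the build, seed test data, start monitoring agents
    return {"env": "perf-lab", "build": "1.4.2"}

def execute(ctx):
    # e.g. drive load against the system and collect raw measurements
    return {"p95_ms": 290, "error_rate": 0.001}

def cleanup(ctx):
    # e.g. stop load generators, reset test data
    pass

def report(ctx, results):
    # e.g. write the record and notify the team
    return f"build {ctx['build']} on {ctx['env']}: p95={results['p95_ms']}ms"

def run_nfr_test():
    ctx = prepare()
    try:
        results = execute(ctx)
    finally:
        cleanup(ctx)  # always clean up, even if execution fails
    return report(ctx, results)
```

Running cleanup in a `finally` block keeps the environment stable between iterations, which is exactly what the repeatability requirements above demand.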
NFR testing is an iterative process. Keeping a log of NFR testing records helps to:
- Define a baseline
- Capture the latest state of product quality
- Identify focus areas for optimization
- Identify trends
- Give early feedback to development when quality is deteriorating
Keep testing records in a defined template. These records should contain the measured values as well as the environment variables under which testing was performed; if this data is not recorded, it becomes difficult to validate results, compare runs, and find the quality trend. Records should also keep evidence such as graphs and logs from the application and tools, and should provide a summary of the test. NFR testing results should be presented to the team frequently, for example in Sprint demos, to ensure critical NFRs are closely monitored.
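A record template of this kind can be as simple as a serializable dictionary. All field names and values below are illustrative assumptions; the point is that each record captures measurements, environment, evidence, and a summary, and that the history of records makes the trend extractable:

```python
import json

# Illustrative record template: measured values, environment, evidence, summary.
record = {
    "date": "2023-01-15",
    "test_case": "PERF-TC-01",
    "build": "1.4.2",
    "environment": {"cpu": "8 vCPU", "memory": "16 GiB", "dataset": "prod-snapshot"},
    "measurements": {"p95_ms": 290, "throughput_rps": 850},
    "evidence": ["grafana/latency.png", "logs/order-service.log"],
    "summary": "p95 within target; throughput stable.",
}

history = [record]  # append one record per test iteration

def latency_trend(history):
    """Extract the p95 series across iterations to identify the trend."""
    return [r["measurements"]["p95_ms"] for r in history]

# Records serialize cleanly, so they can be committed alongside the test assets
archived = json.dumps(record, indent=2)
```

Because every record names its environment and build, any two entries in the history can be compared with confidence that like is being compared with like.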
So, in brief:
- Non-functional requirements are key to application usability
- They should be tracked and monitored from the initial development stages
- Automate, automate, for better execution results
- Document it!!
In the next part, I will try to cover commonly used tools for non-functional requirement testing. Till then, ciao ciao!!