31 August 2018

Requirements Analysis – Testing Beyond Clicking


If you are a software tester, how often have you heard that your job is just clicking around in an application that is being tested? Has it ever happened that someone, based on that assumption, described your occupation as less important than a programmer’s? Being a valuable tester takes more than just “clicking” and following test scenarios – just as being a valuable developer takes more than adjusting and implementing solutions from Stack Overflow. But this article is not about making that comparison. It’s about showing how broad the job of a tester is. It’s something more than just saying “it works”. And one aspect of that is caring about requirements.

Be a guardian of proper documentation

Documentation Guardian

While thinking about what a tester can do beyond clicking, you may have thought of the various ways they can contribute to automation. Sure, that’s one of many directions, and the market shows a real demand for people with this skill. Today, it seems natural that a tester should learn at least a scripting language – even if only to level up their own work. But usually, to automate something, they need to receive something from the developer first – a GUI or an API. So… what if development hasn’t started yet? Write test cases and just wait for a build? Wrong. Testing starts with testing requirements.

Don’t confuse this with the job of a project manager, product owner or any other analyst involved. Their task is (among many others) to collect requirements and decide when they should be implemented. The job of a tester is to make sure those requirements are of high quality and testable. At the beginning of a project, this might be hard. Ideas are still coming in, the client is not sure what the app is going to look like – it’s not easy to predict obstacles. Many projects are Agile-oriented, though. Requirements come in not all at once but sprint after sprint – and they change a lot. If that’s the case, after some time the tester should already know the goal everyone wants to achieve.

A requirement can be defined (in simplified terms) as a condition needed by a stakeholder to solve a problem or achieve an objective – and it must be documented. Verifying the latter is a good way to start. Work on a project is often organized in issue tracking software, where requirements should be documented. But sometimes decisions are made in meetings or in long email exchanges between small groups of people. Make sure that there are no misalignments between these sources. It would be a waste of time if someone put their effort into a task with an outdated description only to discover that they were left out of the email loop. It seems trivial, but such mistakes slip in easily.

Keep a list of criteria to establish a standard for the project

There are a lot of criteria for validating requirements, and not all of them can be verified by a tester or developer. One of them is as simple as the necessity of a requirement. But who is to decide that a requirement is not necessary? Sometimes the development team might hold such an opinion, especially if there is a lot of work to be done for little value, yet that’s something to be resolved by the analysts as they collect requirements – and by the time development starts, it’s usually too late for that. But when it comes to setting a sprint backlog, some things can still be influenced, and proper requirement quality can be achieved.

Feasibility

Feasibility means that a requirement can be implemented within known capabilities. This can be of special interest in mobile application projects, where a tester may come across generalizations like “works on all Android devices” – which is most probably impossible. Even if you know which devices or operating system versions are supported at this point, make sure that this information is written down for the benefit of future team members. Supported devices may change over time, and someone revisiting the ticket in search of information might be confused. When development targets Android, iOS and Windows, it may happen that one platform’s way of designing the application view will not be applicable to another. Make sure that such a requirement is split per operating system. This brings a lot of benefits – and if a separate test case for each ticket is a project standard, you will gain a lot of clarity by not mixing things up.
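
As a rough illustration (the ticket keys, summaries and version lists below are made up, not taken from any real tracker), splitting one cross-platform requirement into per-platform sub-tickets might look like this in Python:

    from dataclasses import dataclass

    # Hypothetical sketch: one cross-platform requirement split into
    # per-platform sub-tickets, each carrying its own supported-version list.

    @dataclass
    class Ticket:
        key: str
        summary: str
        supported_versions: list[str]

    parent = "APP-120: Redesigned settings screen"
    subtickets = [
        Ticket("APP-121", "Settings screen (Android)", ["Android 8.0+"]),
        Ticket("APP-122", "Settings screen (iOS)", ["iOS 12+"]),
        Ticket("APP-123", "Settings screen (Windows)", ["Windows 10"]),
    ]

    for t in subtickets:
        print(f"{t.key} | {t.summary} | supports: {', '.join(t.supported_versions)}")

Each sub-ticket can then carry its own test case, and nobody has to guess which platform a given acceptance criterion applies to.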

Unambiguity

This means that a requirement can be interpreted in only one way. Imagine a situation in which a tester creates a test case before the developer starts working on a ticket. Later, it turns out that one of them misunderstood the criteria simply because they weren’t clear enough. This could easily have been avoided with proper wording of the requirement. When things do get messed up, at least one of these people will have to spend additional time reworking what has been done. Imagine what can happen when there is a requirement with three developers assigned to it – each for a different mobile platform. If everyone works according to their own guessed interpretation, things will get complicated fast.
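
To make this concrete, here is a minimal sketch (the endpoint, response shape and numbers are all invented for illustration) of how an unambiguous criterion maps directly to assertions, while a vague one gives a tester nothing to check:

    import time
    import requests  # assumed HTTP client; the endpoint below is hypothetical

    # Ambiguous criterion: "the product list loads quickly" - nothing to assert.
    # Unambiguous rewrite: "GET /products responds within 2 seconds and returns
    # at most 50 items per page" - every clause becomes a checkable assertion.

    def test_product_list_meets_criteria():
        start = time.monotonic()
        response = requests.get("https://api.example.com/products?page=1")
        elapsed = time.monotonic() - start

        assert response.status_code == 200
        assert elapsed < 2.0                        # explicit time bound
        assert len(response.json()["items"]) <= 50  # explicit page size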

Singularity

Make sure that requirements don’t contain information that can be separated. This is beneficial from the standpoint of writing test cases and maintaining their traceability. The more atomic requirements are, the easier it will be to measure their coverage during development and the testing campaign. Don’t be scared that more tickets will appear – it pays off eventually. If you see the potential to make things simpler (not to get rid of them but to organize them better), talk about it with the project manager or product owner. If the answer is no, it doesn’t mean that the project will collapse. But it’s always good to strive for better efficiency.

Consistency

Requirements can’t conflict with each other. The risk of conflict can be especially high when you are working on a product that is part of a broader environment in which all of the applications have to cooperate correctly. It can be hard for managers to catch all the nuances, but when doing integration testing, you might start to recognize some patterns within the environment. The conflict doesn’t have to be in functionality itself; it can be as simple as inconsistent product nomenclature. Therefore, make sure that a unified style is maintained across requirements.
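
Even a naming-consistency check can be scripted. A toy sketch (the product name, its variants and the requirement texts are all invented for illustration):

    # Hypothetical sketch: flagging inconsistent product nomenclature
    # across requirement summaries pulled from an issue tracker.

    canonical = {"PayGate": ["Pay Gate", "Paygate", "pay-gate"]}

    requirements = {
        "REQ-7": "PayGate must retry failed transactions",
        "REQ-8": "Pay Gate sends a receipt after each transaction",
    }

    for req_id, text in requirements.items():
        for proper, variants in canonical.items():
            for variant in variants:
                if variant in text:  # case-sensitive, so the canonical spelling is not flagged
                    print(f"{req_id}: found '{variant}', expected '{proper}'")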

Traceability

In simple terms, traceability means links between requirements and other project artefacts – user stories, bug reports, test cases, test campaign executions and so on. Imagine a situation in which the product manager asks about the status of test coverage during a meeting. How do you answer that without using general, abstract words that can mean everything and nothing? If you make sure that traceability is maintained, this will be an easy task – many issue tracking systems include a feature that shows the exact percentage of coverage. Not having traceability may lead to imprecise estimates – especially when requirements change often and reworks are needed. With traceability in place, it will also be easier to identify the areas affected by particular changes in the software, so test campaigns will be more efficient and easier to prepare – you will know exactly which tests to include.
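
The arithmetic behind that coverage percentage is simple. A minimal sketch, with made-up requirement and test case identifiers standing in for data that would normally come from a tracker’s API:

    # Hypothetical sketch: computing requirement coverage from
    # requirement-to-test-case links.

    requirement_links = {
        "REQ-1": ["TC-10", "TC-11"],  # covered by two test cases
        "REQ-2": ["TC-12"],
        "REQ-3": [],                  # no linked test case: a coverage gap
    }

    covered = [req for req, tests in requirement_links.items() if tests]
    coverage = 100 * len(covered) / len(requirement_links)

    print(f"Requirement coverage: {coverage:.0f}%")  # -> 67%
    print("Uncovered:", [r for r, t in requirement_links.items() if not t])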

Testability

When reading acceptance criteria, you may notice things that are not testable – or that will be testable only under specific conditions that are difficult to reproduce. Inform managers about such situations. Maybe something can be done to provide that testability – something that wasn’t explicitly requested, even something as simple as additional logging to a file. This may add only a bit to the programmer’s workload but can bring benefits in the future. If everyone agrees in the end that a requirement is not testable (its proper implementation will eventually be verified by full regression testing), it might be good for the QA team to label the ticket as not required to be tested and exclude it from test coverage tracking.
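
For instance, here is a tiny sketch of the kind of file logging a developer might add (the function, event wording and file path are invented; nothing here comes from a real application):

    import logging

    logging.basicConfig(
        filename="app.log",
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
    )

    def sync_user_profile(user_id: int) -> None:
        # Business logic would run here; without the log line below, a tester
        # could not confirm the sync ran, since nothing changes in the GUI.
        logging.info("profile sync completed for user %s", user_id)

    sync_user_profile(42)

    # The tester can then verify the behaviour from the log file:
    with open("app.log") as log:
        assert "profile sync completed for user 42" in log.read()

One extra log line for the developer turns an invisible background behaviour into something a tester can check deterministically.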

There are a lot of benefits to caring about requirements. Most importantly, the requirements themselves will be of high quality. You avoid potential messes and time wasted close to the end of a sprint. You might be appreciated for your diligence when you notice that a requirement is outdated, missing or creating inconsistency. Ultimately, you prove that testing is something beyond clicking, and that you’re valuable to the team.



Author
Jakub Bielawiec
QA Tech Lead