This is a guest post by Peter G Walen.
You get handed an assignment of strategic significance to the organization. You have never worked on this application before, and the only information you have is what this change will do. There is nothing on how the system or app currently works or how this change impacts other systems and apps that touch it. And no one can answer your questions. What do you do next?
This is one of those things that “should never happen,” and somehow it seems to happen at a lot of shops. A big, important project, and somehow no one can provide information about it. Or it could be a smaller project that is “a minor change.” Your tester sense kicks in, because too many times “minor” changes bring havoc down on all involved.
In a well-run, high-functioning environment, people know that when things go wrong, it is not the fault of the person or people doing the testing. Still, situations like this do happen, and they can happen in any shop.
There is a project that needs attention and it lands on your desk. It is kind of a shambles with no real project documents. Requirements, such as they are, are a list of acronyms and buzzwords scribbled on a napkin or post-it note.
It could be a legacy system running since the late 1970s. Tweaks, changes, and improvements have been made. Somehow, “system documentation” dates to the original effort and “project documentation” for the changes is non-existent, as if the people who worked on it in the past would rather not have anyone know.
I have worked in all those situations and had people expect solid and thorough testing each time. Here is where I found test ideas.
Talk to People

This one sounds easy, but I rarely find it to be as easy as I’d like. Each organization has a “meeting vibe.” Sometimes people are “not available” unless you schedule a meeting. Or they don’t like meetings and don’t want anything “on the record” showing they talked about the project. They could be pressed for time and trapped in high-stakes conversations all the time.
What I have found to work well is to figure out the meeting vibe of the people you think might know something, then conform to it. Informal might be best for most. Getting a coffee with Bob or Laura and “getting away from the desk” for a few minutes might yield some insight.
It might help you, and them, open up about what you need to know. Check their energy level and see if small talk to ease into the questions you need to ask might be in order. Sometimes getting straight to the point is more what they want or need.
Each person you want to speak with, from whom you need to gain information, has their own needs, problems, and things to get done. Work to build a rapport with them; build the trust needed for people to talk freely. Some are more open than others and will tell you their life story at the drop of a hat.
Each of these stories contains nuggets of information about the person, their experience, and their background. That can help you learn what they need, so you can help them while they are helping you. The stories may also give you insight into the system itself. Pay attention.
Much of the time, people will respond positively to your questions about the system if you make it clear you are merely trying to understand it better. Make it clear you are not expecting them to take on more work (if this is true!). After all, most people are well laden with tasks and don’t really need any more, right?
What approach do I find works best? None of them. And all of them. Build a relationship where you can have a conversation and get their understanding of the system, even if it has changed since they worked on it. Then ask them.
Dig Through Old Files and Notes

When you talk with people, you may find they have gems of information in files they kept from when they worked on the system. When I was a developer, I followed the lead of the seniors who first trained me: I saved everything around a project, including notes written on napkins and scratchpads from conversations and from working out coding problems.
Why is this important?
Some organizations are better than others about saving project documentation. Much of it is of little real use in learning how the system currently works or how the proposed changes might impact that behavior and usage.
Formal artifacts and developer notes will likely be out of date from a technology perspective. Still, they can give crucial insight into how the system was developed and what drove the changes and enhancements.
They can also help us understand how the system has evolved over time, and how perceptions of the system, and people’s interactions with it, have changed along with it.
All of this gives us a better understanding of how we can exercise the system, before and after the changes, and helps us build a meaningful approach to testing from this history.
Watch People Use the Software

Another technique I have used is to watch people actually use the software. Not in training, not in a demo, but watching them use it as they would to do what they want or need to do.
I know this is easier with software developed for use within the organization. For commercial software, it makes a pretty compelling argument to “eat your own dog food” as much as possible. Why do I say that?
People will develop their own rhythm of working. They likely won’t follow the “official” path or process very much, if they can find a path that is easier, less complex and gets the job done faster. Few people will “follow the rules” when it comes to making software.
When making software, corners will be rounded a bit. Documentation might be “filled in later.” Teams will push forward with what seems a reasonable course when questions aren’t answered. Since we know this to be the case, why do we presume that people using the same software will use it precisely as the training material tells them to?
How many people using software in an office environment are really “trained” in the software, unless they were there when the software was introduced? More often, they will be “shown” how to use the software in a manner aimed at “getting the job done” instead of “this is how the software expects you to do things.”
One other aspect of observing people with the software: it shows how people take advantage of a “bug,” perhaps one allowed to go into production because it could be “fixed later.”
If the fix changes, or “breaks,” how people use the software, is it really fixing anything? Are you fixing problems or causing them if you force paying customers to change how they use software or how they run their business?
Learn how people use the software, not how internal experts think they use their software. Then use that information to help model your testing.
Mine the Logs

I find this set of ideas the murkiest to navigate, depending on the organization.
System and application logs, call center and help desk logs, and customer change requests can help identify vulnerable areas within the software.
If you can identify key phrases to search for in the technical logs, a good research tool to pick out messages might help (I’ve used Splunk with good success). The call logs and change requests can give insight into what people don’t understand or don’t like within the software.
From this collection, you might be able to sift out some sense of how stable the system is or how changes can unexpectedly impact other areas. This can provide a reference to how deeply you might need to engage in various integration tests to look for unexpected touchpoints.
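The search itself can be simple even when the analysis is not. As a minimal sketch of the key-phrase idea (the phrases and log lines below are invented for illustration; a tool like Splunk does this at scale with far richer query support):

```python
from collections import Counter

# Hypothetical key phrases drawn from past incidents -- adjust for your system.
KEY_PHRASES = ["timeout", "deadlock", "retry exceeded"]

def tally_phrases(log_lines, phrases=KEY_PHRASES):
    """Count how often each key phrase appears in the log lines (case-insensitive)."""
    counts = Counter()
    for line in log_lines:
        lowered = line.lower()
        for phrase in phrases:
            if phrase in lowered:
                counts[phrase] += 1
    return counts

# Example run against a few made-up log lines.
sample_log = [
    "2021-03-02 14:11:09 ERROR OrderService: database timeout on commit",
    "2021-03-02 14:11:10 WARN  OrderService: retry exceeded for order 4417",
    "2021-03-02 14:12:41 ERROR Billing: timeout waiting for gateway",
]
for phrase, count in tally_phrases(sample_log).most_common():
    print(f"{phrase}: {count}")
```

Even a crude tally like this can point you at the subsystems that generate the most trouble, which is often enough to decide where testing attention should go first.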
Putting It Together
You now have a massive number of potential areas to consider when developing a test approach to this system. No single piece will give you everything you need to know.
It is possible the collection of ideas will fall short of giving you a solid plan for testing.
However, you likely have more information about the system than any other single person in the organization: how it works, how people use it, and how the change is likely to impact them.
Make your plans and proceed with confidence. You are now the expert on the system.
Peter G. Walen has over 25 years of experience in software development, testing, and agile practices. He works hard to help teams understand how their software works and interacts with other software and the people using it. He is a member of the Agile Alliance, the Scrum Alliance, and the American Society for Quality (ASQ), an active participant in software meetups, and a frequent conference speaker.