Prego ran an advertising campaign for its spaghetti sauce in the ’80s and ’90s. In the commercial, someone is stirring a pot of sauce and another person walks up asking “What about garlic? What about onions? What about tomatoes?” The answer was always “It’s in there!” Everything someone wanted to know about Prego was listed on the bottle; the ingredients were real. In testing, we have thick documents that outline each step of the testing process and who does what. When someone asks about our testing strategy, we point to the strategy document and say “it’s in there.” But the real testing work is usually missing from that document.
Regression, Story Tests, Automation, Pyramid … It’s in There
Rather than pointing to an old document no one cares about, let’s learn to talk about what’s really in our test strategy.
I’ll use the project I’m on now as an example. My side of the work is acting as a sort of one-person software testing show. Each day, I run a UI automation suite and analyse the results. Every day there are also tests to build, bad tests to debug, and product problems to explore. Embedded in all of that, every day, is regression testing, stories, automation, and a testing pyramid.
Regression testing is handled nightly by a combination of an automated test suite and focused exploration by me. The automated suite runs overnight on four different environments and sends results to a few email addresses. In the morning I read that email and look at the log files produced by my automation framework and by the software I am testing. If a developer made a change that broke something — for example, a page that will no longer save because of a recent performance refactor — my test suite will catch it and send out an alert.
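That nightly flow — run the suite per environment, collect results, email a summary, and flag failures — can be sketched in a few lines. This is a hypothetical illustration, not the author’s actual framework: the environment names, the `run_suite` stub, and the pass counts are all invented for the example.

```python
# Hypothetical sketch of a nightly regression driver. In a real setup,
# run_suite would drive the actual UI automation and the summary would
# be sent via email rather than printed.
from dataclasses import dataclass

ENVIRONMENTS = ["staging", "qa1", "qa2", "perf"]  # assumed names


@dataclass
class SuiteResult:
    environment: str
    passed: int
    failed: int


def run_suite(environment: str) -> SuiteResult:
    # Stand-in for the real UI test run; here we pretend one test
    # fails on the "perf" environment to show the alert path.
    failed = 1 if environment == "perf" else 0
    return SuiteResult(environment, passed=42 - failed, failed=failed)


def nightly_summary(results):
    # Build the body of the morning email and choose a subject line
    # that makes a broken build impossible to miss.
    lines = [f"{r.environment}: {r.passed} passed, {r.failed} failed"
             for r in results]
    alert = any(r.failed for r in results)
    subject = "REGRESSION ALERT" if alert else "Nightly run: all green"
    return subject, "\n".join(lines)


results = [run_suite(env) for env in ENVIRONMENTS]
subject, body = nightly_summary(results)
print(subject)
print(body)
```

The design point is the morning triage: a human still reads the summary and the logs, so the automation’s job is to surface failures clearly, not to decide what they mean.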
Stories come at an unpredictable pace and in different shapes. Sometimes developers build new features and send them my way with a little information about the change. Other times they make wide-ranging performance updates so that data renders a little faster. Story testing is generally wrapped up in my automation efforts. I look at the ticket and try to understand the change, then talk to developers and product managers. After that I’ll spend some time working with the change to get a feel for things, and usually discover bugs while doing so. Building UI automation is yet another bug-finding exercise. The result of all this is new test automation, some bug fixes, and maybe a few changes to the feature.
We can see a version of the test automation pyramid when we step back and look at all of that. The architecture of this product is a little different, so our pyramid has a different shape. Unit testing is difficult because of the underlying technology stack; we have some unit tests, but they aren’t the basis of our automation efforts. At the bottom of our testing pyramid is UI automation, on top of that is exploration by a person, and on top of that are unit tests and a handful of service-layer tests. Exploration sits in the middle in this model because I believe it is impossible to build good automation without a person exploring the product at the same time.
It’s in there!
Once every few months, the development team and I have a meeting to talk about test coverage — what happens in the nightly test run, what is missing or could use some refactoring, what do I need to focus exploration around, and what is in the pipeline. There is no document, but everyone on our small development team understands what the test strategy is.
Your Testing Recipe
Text in a plan never quite describes the work we do. Throughout the day we shift almost seamlessly from exploration to heavy tool usage to pairing with a developer to adding completely new test layers. Keeping a plan updated for our changing approaches could be a full-time job, and even then it would get some things wrong.
Here’s a tip: Build a test strategy if you must, but keep it short — somewhere around one page. The real way to communicate your test strategy is by working closely with your team. Take time to pair with developers, ask product managers what the customer values in a new feature, and ask your support people what problems they have been hearing about recently. Show them exactly how you are testing, and what that has helped you learn about your product. Through all of this, your team will learn more about how you test, and they may even have suggestions for new strategies to add, new tools to use, and new places in the product to focus.
This is a guest posting by Justin Rohrman. Justin has been a professional software tester in various capacities since 2005. In his current role, Justin is a consulting software tester and writer working with Excelon Development. Outside of work, he is currently serving as President on the Association for Software Testing Board of Directors, helping to facilitate and develop various projects.