Testers execute tests, right?

Of course we do. However, assuming that is all we do leads to all kinds of problems. Let’s take a deeper look at a wide range of testing responsibilities and opportunities.

Most of the members of an enterprise assume that developers create applications, salespeople bring in the money, executives inspire the troops, and testers help smooth out the rough edges of applications when programmers are nearly done. Left to themselves, all the other team members hold fast to this simple idea. Without any particular intent to do so, they’ll help keep us in a “testing box” where testers receive untested software just before it’s released, pass back bug reports, and, well … that’s it. In this approach, the entire testing story is limited to helping the programmers write the code twice: once with a defect, and once without that defect. Yet we can do so much more.

Take the initiative

Our goal here is to improve product quality by expanding the role of testing beyond the “testing box”.  Waiting for someone to ask, “How do these requirements look to you?” is a great way to stay inside the box. Take initiative. It’s good practice for all the other times you’ll need to speak up so a testing perspective can be heard.

Taking the initiative doesn’t mean being disruptive, distrusting others, or boring them. Often what couldn’t be more obvious to testers is invisible to those whose focus lies elsewhere. They need you to contribute what you know.

This can be as simple as reviewing the backlog in a tracking tool such as Jira or Rally before the stories are ready for development and test. Many companies define acceptance tests before coding begins, and testers may be uniquely qualified to make sure those acceptance tests are meaningful and detailed enough to provide a real pass or fail.

Making sure it is possible for the programmers to write automated checks against those acceptance test ideas can mean higher code quality when the feature is ready for specialized testing, and fewer regressions over time.
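
As a rough illustration, here is what one of those automated checks might look like. This is a minimal sketch in Python with pytest; the story, the pricing module, and the expected values are hypothetical placeholders, not anything pulled from a real backlog.

    # test_discount_story.py -- hypothetical acceptance check for a backlog story:
    # "Orders of $100 or more receive a 10% discount."
    # The pricing module and apply_discount function are placeholders for whatever
    # the team actually builds; the point is that the acceptance criterion is
    # concrete enough to pass or fail automatically.
    import pytest

    from pricing import apply_discount  # hypothetical module under test


    @pytest.mark.parametrize(
        "subtotal, expected_total",
        [
            (99.99, 99.99),    # just under the threshold: no discount
            (100.00, 90.00),   # at the threshold: 10% off
            (250.00, 225.00),  # well over the threshold: 10% off
        ],
    )
    def test_order_discount_acceptance(subtotal, expected_total):
        assert apply_discount(subtotal) == pytest.approx(expected_total)

If a check like this cannot be written, that by itself is useful feedback: the acceptance criteria are not yet specific enough.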

Travel the high road

Imagine an ambitious mobile-oriented project. Some time after it launches, a project manager comes to you and says the organization has committed to delivery on handsets from 14 different Android vendors. She notices the look on your face and hurries to reassure you: “It’s OK! We’ve already budgeted for it; Randy’s going to run down to the shopping center and pick up one of each device this weekend.”

Take a breath. Keep the word “opportunity” in mind. Remember that this is a chance to get across the fundamentals about how the overt capital expense pales in comparison to the human effort of essentially performing the same tests on 14 different devices. If you prefer, you can emphasize a counterproposal: Sure, if the deadline is fixed, your team will finish testing three installations by then, rather than all 14. Perhaps you test one popular configuration especially well, two more modestly well, and perform “a light dusting” of core features on the rest. At every turn, deliver high-quality strategic responses, rather than retreating to “you all are crazy” reactions — however tempting the latter may be.
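
One way to make that counterproposal concrete is to encode the tiers directly in the test configuration. The sketch below assumes a pytest suite; the device names and tier assignments are invented for illustration, and a real project would choose them from data about which handsets customers actually use.

    # conftest.py -- hypothetical sketch of tiered device coverage in pytest.
    import pytest

    DEVICE_TIERS = {
        "popular_handset": "full",    # one popular configuration, tested especially well
        "handset_b": "standard",      # two more, tested modestly well
        "handset_c": "standard",
    }                                 # everything else gets a light dusting of core features


    def pytest_addoption(parser):
        parser.addoption("--device", action="store", default="unknown",
                         help="device configuration under test")


    def pytest_collection_modifyitems(config, items):
        device = config.getoption("--device")
        tier = DEVICE_TIERS.get(device, "smoke")
        for item in items:
            # Smoke-tier devices run only the tests marked as core features.
            if tier == "smoke" and "core" not in item.keywords:
                item.add_marker(pytest.mark.skip(reason=f"{device}: smoke coverage only"))
            # Standard-tier devices skip the slow, in-depth checks.
            elif tier == "standard" and "deep" in item.keywords:
                item.add_marker(pytest.mark.skip(reason=f"{device}: standard coverage only"))

A configuration like this also keeps the tradeoff visible to the project manager: moving a device between tiers is a one-line edit with an obvious cost in coverage.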

Encourage incremental automation

A particularly valuable contribution to emphasize is what I call “incremental automation,” or “documentation is automation,” as Tom Limoncelli styles it in his recent article for the ACM. While his remarks target systems administrators, I’ll repurpose them for testing teams.

One limit of the perspective outsiders often have of testing is the assumption that our automation is binary. Other team members think either that testing is fully automated, with tests running in a distant lights-out datacenter while testers sit around idly, or that it isn’t automated at all, and testers simply bang on keys like monkeys until something breaks.

Both extremes reinforce the view that testing demands little meaningful skill.

Limoncelli nicely models how much territory lies between those two extremes. When we receive a new product artifact, we might stumble our way through a first run slowly and painfully. As we document the behavior, though, we automate more and more of the product’s exercise. Testing is a highly skilled pursuit, and a crucial part of that skill is the incremental automation of testing the products for which we’re responsible.
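
A minimal sketch of that progression, with invented step names and script paths: each documented step starts as a prompt to a human and gets replaced by code as soon as someone automates it.

    # smoke_run.py -- hypothetical runbook that starts as documentation and becomes
    # automation one step at a time. Steps nobody has automated yet simply pause
    # and tell the tester what to do by hand.
    import subprocess


    def manual(instruction):
        """A documented but not-yet-automated step."""
        input(f"MANUAL STEP: {instruction}\nPress Enter when done...")


    def deploy_build():
        # Automated on a later pass through the runbook; the script path is a placeholder.
        subprocess.run(["./scripts/deploy_latest.sh"], check=True)


    def run_core_checks():
        subprocess.run(["pytest", "-m", "core"], check=True)


    if __name__ == "__main__":
        deploy_build()                                              # was a manual step in run 1
        run_core_checks()                                           # was a manual step in run 2
        manual("Verify the dashboard renders on a real handset.")   # still waiting its turn

Each run of the script is a little faster and a little more repeatable than the last, which is exactly the middle ground outsiders tend not to see.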

Test more than just software

A final value a testing background brings is the ability to test more than just software. Many of the same principles that apply when you test applications can also help verify hardware, security policies, documentation, requirements drafts, workflows, tooling, and data, along with other organizational assets.
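
As one small illustration, the same pass-or-fail discipline can be pointed at a requirements draft. The check below is only a sketch; the file name and the one-requirement-per-“REQ-” format it assumes are invented for the example.

    # test_requirements_doc.py -- hypothetical check that every requirement in a
    # draft carries at least one acceptance criterion ("AC:" line).
    from pathlib import Path


    def test_every_requirement_has_acceptance_criteria():
        lines = Path("docs/requirements_draft.txt").read_text().splitlines()
        current_req, missing = None, []
        for line in lines:
            if line.startswith("REQ-"):
                if current_req is not None:       # previous requirement had no criteria
                    missing.append(current_req)
                current_req = line.split()[0]
            elif line.startswith("AC:") and current_req is not None:
                current_req = None                # this requirement is covered
        if current_req is not None:
            missing.append(current_req)
        assert not missing, f"Requirements without acceptance criteria: {missing}"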

Be on the lookout for all the ways verification helps the organization, and speak up as you notice them.

In summary, follow these five steps:

  • Make sure all requirements are testable
  • Take the initiative
  • Look for opportunities to elevate the conversation
  • Automate continuously, even if partially
  • Apply the testing mentality broadly

Do these things, and any responsive organization will catch on that you provide plenty of value beyond just sitting at your desk, holding up approval for product releases while your tests finish.

To explore the features of Ranorex Studio, download a free 30-day trial today, no credit card required.

About the Author

Cameron Laird is an award-winning software developer and author. Cameron participates in several industry support and standards organizations, including voting membership in the Python Software Foundation. A long-time resident of the Texas Gulf Coast, Cameron's favorite applications are for farm automation.
