Sample Test Cases – Airplane Ticket Reservation
Learn how an optimized set of scenarios for a ticket reservation application is efficiently generated in DesignWise.
What are we testing in this example?
What are our testing objectives?
We know that testing each item in our system once is not sufficient; we know that interactions between the different things in our system could well cause problems. As thoughtful test designers, we want to be smart about testing for potential problems caused by interactions without going off the deep end and trying to test every possible combination.
DesignWise makes it quick and simple to select an appropriate set of tests, whatever time pressure the project is under and whatever thoroughness requirements we might have. DesignWise-generated tests automatically maximize variation, maximize testing thoroughness, and minimize wasteful repetition.
What interesting test design considerations are raised in this particular sample plan?
The drop-down menu above shows 6 different quantities. The drop-down list under Children also includes “zero” as an option.
- An obvious solution would be to include separate Values for each of the numbers.
- If we did that, though, we would create a plan with at least 6 × 7 = 42 tests.
- An alternative option that we’ve decided to use here is the idea of Equivalence Classes.
- Here’s the basic idea behind equivalence classes: whether there are 2 tickets or 6 tickets probably won’t change how the system behaves. Or so everyone tells us. So we describe our system using a smaller number of ticket quantity Values (in our case: 0, 1, and “more than 1”). The equivalence class approach keeps the number of tests much lower without sacrificing meaningful testing coverage.
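The arithmetic behind that trade-off can be sketched in a few lines. This is an illustrative stand-in, not DesignWise output; the representative class labels are assumptions chosen to match the ranges above:

```python
from itertools import product

# Exhaustive: every adult quantity (1-6) crossed with every child quantity (0-6)
exhaustive = list(product(range(1, 7), range(0, 7)))
print(len(exhaustive))  # 6 x 7 = 42 combinations for these two parameters alone

# Equivalence classes collapse each range to a few representative Values
adults = ["1", "more than 1"]         # stands in for 1..6
children = ["0", "1", "more than 1"]  # stands in for 0..6
reduced = list(product(adults, children))
print(len(reduced))  # 2 x 3 = 6 combinations
```

The same system behavior is still exercised, but with far fewer ticket-quantity combinations to multiply into the rest of the plan.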
It is often useful to start by identifying a verb and a noun for the scope of our tests
Designing powerful software tests requires people to think carefully about potential inputs into the system being tested. As described in this blog post, we strongly encourage test designers to ask the newspaper questions: who? what? when? where? why? how? and how many?
- What class of ticket is being reserved? (Coach, Business, First)
- Where is the flight leaving from? (India, the Philippines, the United States)
- Where is the flight going to? (the United States, the Philippines, India)
- How many adult tickets? 1 to 6 → (1, more than one)
- How many children’s tickets? 0 to 6 → (0, 1, more than one)
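Modeling the five parameters above makes it easy to see where the full combinatorial space comes from. A minimal sketch (the parameter names are assumptions for illustration):

```python
from itertools import product

# The five parameters and their equivalence-class Values from the list above
parameters = {
    "Class": ["Coach", "Business", "First"],
    "From": ["India", "the Philippines", "the United States"],
    "To": ["the United States", "the Philippines", "India"],
    "Adults": ["1", "more than 1"],
    "Children": ["0", "1", "more than 1"],
}

# Every possible scenario is one combination of one Value per parameter
all_scenarios = list(product(*parameters.values()))
print(len(all_scenarios))  # 3 * 3 * 3 * 2 * 3 = 162
```

Those 162 scenarios are the total possible combinations; the point of the tool is to cover the important interactions with far fewer tests than that.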
Variation Ideas entered into DesignWise’s Parameters screen
Once we enter our parameters into DesignWise, we simply click on the “Scenarios” link in the left navigation pane.
DesignWise helps us identify a set of high priority scenarios within seconds
DesignWise gives test designers control over how thorough they want their testing coverage to be. While this is a very simple plan with only 162 total possible scenarios, in many cases DesignWise allows testers to quickly generate dozens, hundreds, or thousands of tests using its “coverage dial.”
Selecting “3-way interactions” generates a longer set of tests which cover every single possible “triplet” of Values
If a tester spent a couple of hours trying to select, by hand, a set of tests that achieved 100% coverage of every possible “triplet” of Values (e.g., (i) from the United States, (ii) flying First Class, and (iii) with more than 1 child), the following results would probably occur:
- It would take far longer for a tester to try to select a set of tests that would be nearly as thorough.
- To achieve this extremely thorough 3-way coverage goal, the tester selecting tests by hand would need far more than the 34 tests in the optimized solution shown above.
- Almost certainly, if the tester tried to achieve this coverage goal in 50 or fewer tests, there would be many, many gaps in coverage (e.g., 3-way combinations of Values that the tester accidentally forgot to include).
- Unlike the DesignWise-generated tests, many of the tester’s hand-selected scenarios would probably be highly repetitive from one test to the next; that wasteful repetition would result in significant wasted effort in the test execution phase.
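The gap-finding idea behind these claims can be made concrete. Here is a small, generic checker (not DesignWise's algorithm) that, given a parameter model and a list of tests, reports which n-way value combinations no test exercises:

```python
from itertools import combinations, product

def uncovered_tuples(tests, parameters, n=3):
    """Return the n-way value combinations not exercised by any test.

    `tests` is a list of dicts mapping parameter name -> chosen value;
    `parameters` maps parameter name -> list of possible values.
    """
    names = list(parameters)
    # Every n-way combination of Values that full n-way coverage requires
    required = set()
    for group in combinations(names, n):
        for values in product(*(parameters[p] for p in group)):
            required.add(tuple(zip(group, values)))
    # Every n-way combination actually present in the given tests
    covered = set()
    for t in tests:
        for group in combinations(names, n):
            covered.add(tuple((p, t[p]) for p in group))
    return required - covered
```

Running this with n=2 against a candidate plan is essentially what the matrix chart visualizes: an empty result set means every pair has been tested together at least once.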
Auto-scripting allows you to turn scenario tables (from the “Scenarios” screen) into detailed test scripts
You document a single test script in detail from beginning to end. As you do so, you indicate where your variables (e.g., “the United States,” “India,” or “the Philippines”) appear in each sentence. That’s it. You’re ready to export all of your tests.
From there, DesignWise automatically modifies your sample/template test script and inserts the appropriate Values into every test in your plan (whether your plan has 10 scenarios or 1,000).
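Conceptually, this works like template substitution: one script with placeholders, filled in once per scenario. A minimal sketch with a hypothetical template (the wording and placeholder names are assumptions, not DesignWise's format):

```python
# Hypothetical template; the {Placeholders} mark where each scenario's
# parameter Values would be substituted
template = (
    "Reserve a {Class} class ticket from {From} to {To} "
    "for {Adults} adult(s) and {Children} child(ren), "
    "then confirm the reservation summary."
)

# One generated scenario from the plan
scenario = {
    "Class": "First",
    "From": "India",
    "To": "the United States",
    "Adults": "1",
    "Children": "0",
}

script = template.format(**scenario)
print(script)
```

Repeating the substitution for every row of the scenarios table yields one detailed script per test, whether there are 10 or 1,000.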
You can even add simple Expected Results to your detailed test scripts
DesignWise’s Expected Results feature makes test sets easier to maintain because Expected Results update automatically as test sets change over time.
Coverage charts allow teams to make fact-based decisions about “how much testing is enough?”
With DesignWise’s matrix reports, we can even tell which specific pairs have not yet been tested after any given test in the plan
If you open up this sample plan and look at the matrix chart view when number of tests is set to 10, you will see that the entire chart is green (because all pairs of values have been tested together at least once in the plan’s 10 tests).
Before exporting our tests to discuss them with stakeholders, we can review the Test Plan Scorecard
Mistake Identification: We have helped testers build test plans for years and have seen the most common test design mistakes that new users make. The Scorecard includes automated warnings that alert testers when something looks wrong with their set of tests.
Feature Usage Ideas: The scorecard identifies features that have not yet been used in your plan.
Exports from DesignWise can look like this…
As shown below, detailed test scripts (complete with tester instructions and Expected Results) can also be exported.
Other export formats include test data tables in CSV or Excel format, or even Gherkin-style formatting.
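A test data table export is straightforward to picture: one header row naming the parameters, then one row per generated test. A sketch of that shape (the column names and sample rows are illustrative assumptions, not an actual DesignWise export):

```python
import csv
import io

# Hypothetical exported plan: header row plus one row per generated test
header = ["Class", "From", "To", "Adults", "Children"]
tests = [
    ["Coach", "India", "the United States", "1", "0"],
    ["First", "the Philippines", "India", "more than 1", "more than 1"],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerows(tests)
csv_text = buf.getvalue()
print(csv_text)
```

The same table, consumed row by row, is what a data-driven test runner or a Gherkin `Scenario Outline` examples table would iterate over.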