What is data-driven testing?
The ability to do data-driven testing is one of the key benefits of test automation. In data-driven testing, an automated test case retrieves input values from a data source such as an Excel spreadsheet or a database file. The test case is repeated automatically for each row of data in the data source. So, instead of 10 testers each manually executing test cases for 10 different data values and checking whether each one succeeded, a single automated test run can execute all 100 test cases.
In a test case for a user registration process, the data source might contain columns and rows like those shown below (the sample rows here are purely illustrative):

| First Name | Last Name | Desired User Name | Desired Password | Expected Result |
|------------|-----------|-------------------|------------------|-----------------|
| Maria      | Garcia    | mgarcia           | S3cure!pw        | Registration succeeds |
| John       | Smith     | mgarcia           | S3cure!pw        | Registration rejected |
If the external source also contains validation values, then the data-driven test can compare the results of the test to the validation value to determine whether the test case passes. For example, for a test of the “multiply” function in a calculator application, the data table might look something like the following:
| Input 1 | Input 2 | Expected Result |
|---------|---------|-----------------|
| 2       | 3       | 6               |
| -4      | 5       | -20             |
| 0       | 7       | 0               |
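As a sketch of this pattern, the multiply example can be driven by a simple loop over the data rows. The `multiply` function, the in-memory `test_data` rows, and the `run_data_driven_test` helper are all illustrative stand-ins for a real application and test framework:

```python
def multiply(a, b):
    # Stand-in for the application function under test
    return a * b

# Each tuple mirrors a table row: (Input 1, Input 2, Expected Result)
test_data = [
    (2, 3, 6),
    (-4, 5, -20),
    (0, 7, 0),
]

def run_data_driven_test(rows):
    """Run the test case once per data row; return (input1, input2, passed) per row."""
    results = []
    for input1, input2, expected in rows:
        actual = multiply(input1, input2)                     # execute the test step
        results.append((input1, input2, actual == expected))  # compare to expected
    return results

for i1, i2, passed in run_data_driven_test(test_data):
    print(f"multiply({i1}, {i2}): {'PASS' if passed else 'FAIL'}")
```

Adding a new test case is then just a matter of adding a row to `test_data`; the test logic itself never changes.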
In both examples, the actual result of the test can be compared to the expected result to determine whether or not the test case succeeded.
Benefits of data-driven testing
Reduced execution time
One of the most obvious benefits of data-driven testing is that automation makes it possible to rapidly execute a large volume of test cases, especially repetitive ones that cover positive and negative test cases, or corner, edge and boundary cases.
Improved accuracy
Even the most careful tester can make errors when manually entering large amounts of data. With data-driven testing, you can be certain that exactly the data values specified in the Excel spreadsheet or database are used to execute the test case.
Improved use of system and human resources
Automated data-driven test cases can execute overnight, when test servers would otherwise be idle. Freed from entering repetitive test data, manual testers can focus on more challenging exploratory and user experience testing.
Reduced test case maintenance
Separating test data from test code reduces maintenance in several ways. First, you can add or remove test cases simply by changing the test data, without touching the test code itself. Well-designed data-driven tests with error handling and conditional execution can also reduce the need for redundant test cases. For example, instead of maintaining separate test cases to check different types of valid and invalid passwords for a welcome screen, with data-driven testing you have just a single test case to maintain.
Better test data storage
Data-driven testing allows you to store test data in a central repository, whether that’s an Excel spreadsheet, CSV file or database file. This makes the data easier to share, re-use, back up, and maintain.
Supports more than just testing
In addition to validating that your application works as expected, data-driven tests can also be used to simulate data entry for load and performance testing. It’s also possible to use a data-driven test case to populate your production database.
Best practices for data-driven testing
Separate test data from the test code whenever possible
Use data tables to provide input and validation values for your automated tests rather than storing data values directly in your test cases. Hard-coding values makes your test cases more difficult to maintain and also more difficult for another tester to read. For the same reason, you should use external data tables to supply values for test environment settings such as system variables. Read more about how to use parameters to set system variables for Ranorex test runs in the Ranorex User Guide.
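As a minimal illustration of keeping data out of the test code, the rows for the earlier multiply example could live in a CSV file that the test loads at run time. The column names and the `load_rows` helper are assumptions for this sketch, and the CSV text is inlined only to keep the example self-contained:

```python
import csv
import io

# In a real project this text would live in an external .csv file,
# edited by testers without touching the test code.
csv_text = "input1,input2,expected\n2,3,6\n10,0,0\n"

def load_rows(fileobj):
    """Parse CSV rows into (input1, input2, expected) tuples of ints."""
    return [
        (int(r["input1"]), int(r["input2"]), int(r["expected"]))
        for r in csv.DictReader(fileobj)
    ]

# The test hard-codes no values; it simply iterates whatever rows exist.
rows = load_rows(io.StringIO(csv_text))
for a, b, expected in rows:
    assert a * b == expected
```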
Use realistic data
Ensure that the data in your data table provides adequate coverage of your test scenarios. Include both positive data values that should succeed, and negative data values that should return an error. Boundary value analysis can be helpful in producing data to ensure coverage of both positive and negative test cases. For example, if a number field accepts values only between 1 and 100, your positive test cases would include 1, 2, 99 and 100. It’s not necessary to test all of the numbers in between if the test case succeeds for these numbers. Negative test cases would include -1, 0, 101 and 102. Another approach to data-driven testing is to execute test cases against a subset of your production data, which helps ensure that your tests cover the types of data that the application actually processes.
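The 1-to-100 example above can be generalized into a small helper that generates boundary values for any inclusive range. The function name and the choice of two values on each side of every boundary are illustrative:

```python
def boundary_values(low, high):
    """Boundary-value-analysis data for an inclusive [low, high] range.

    Positive cases: the boundaries and the values just inside them.
    Negative cases: the values just outside each boundary.
    """
    positive = [low, low + 1, high - 1, high]
    negative = [low - 2, low - 1, high + 1, high + 2]
    return positive, negative

pos, neg = boundary_values(1, 100)
print(pos)  # [1, 2, 99, 100]
print(neg)  # [-1, 0, 101, 102]
```

The generated lists match the field example in the text: positive cases 1, 2, 99, 100 and negative cases -1, 0, 101, 102.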
Use setup/teardown modules
Each test case should configure the test environment that it needs, including test data, and clean up afterward. So, if your test case reads a number of rows from an Excel spreadsheet and inserts them into your application, the test case should include a teardown step to delete the records that it created. This practice keeps your test cases independent of each other and increases the chances that the entire test run will succeed.
Configure error handling
Include logic in your data-driven test to determine what should happen if one test case fails. For example, Ranorex supports the following options for handling a validation error:

- “continue with iteration”: continue the test case with the next row of test data
- “continue with sibling”: end the current test case and continue with the next one at the same level
- “continue with parent”: end the current test case and continue with the next parent test case
- “stop”: abort the test run entirely
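Outside of any particular tool, the same idea can be sketched in plain code. This models only the two simplest policies (continuing with the next data row versus stopping the run); the policy strings mirror the Ranorex option names, while `run_with_policy` and `check_positive` are illustrative:

```python
def run_with_policy(rows, check, on_error="continue with iteration"):
    """Run check() for each data row, applying the chosen error-handling policy."""
    failures = []
    for row in rows:
        try:
            check(row)
        except AssertionError as exc:
            failures.append((row, str(exc)))
            if on_error == "stop":
                break  # "stop": abort the whole test run
            # "continue with iteration": move on to the next data row
    return failures

def check_positive(n):
    # Stand-in validation step
    assert n > 0, f"{n} is not positive"

print(run_with_policy([1, -2, 3, -4], check_positive))
print(run_with_policy([1, -2, 3, -4], check_positive, on_error="stop"))
```

With the default policy, both failing rows are recorded; with “stop”, the run ends at the first failure.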