Data-driven testing is all about making test automation more efficient.
Test automation is critical for efficient software delivery. Automated tests run 24 hours a day, 7 days a week, meaning they can get through huge numbers of tests relative to manual testing. However, test automation can be inefficient if it's done badly. Automated tests are often repetitive, with just minor changes in test data. This is where data-driven testing comes in.
What is data-driven testing?
Many automated tests involve repeating the same test steps with different variations of data and expected outcomes. For instance, imagine testing the address fields on a registration form. If your application needs to work in multiple regions, you need to be able to test a huge range of different valid and invalid addresses. This means selecting each region in turn, checking that the correct fields are displayed, and then entering test data in each field to check that it works.
Data-driven testing simplifies this significantly. Rather than create a different test for each region, you create one test. Then you use a data source to set the test data and validation criteria for each region. This has the key advantage of separating the test logic from the test data.
How to implement data-driven testing
Data-driven testing works slightly differently from standard test automation. In a standard test, you include any required data in the test itself. In data-driven testing, you connect your test to an external data source. You can use many different data sources, from simple CSV files through XML to full-featured databases like MySQL.
Choosing your data source
For simple scenarios, a structured text or CSV file is enough. This works well if, say, you want to test your login form with a set of username and password pairs. More complex tests might need XML, so that you can attach extra information to the test data. In large automated test suites, you may need a proper relational database like MySQL. This is particularly useful when you want to orchestrate your tests.
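As a minimal sketch, here is what a CSV-based data source for the login example might look like in Python. The file contents, field names, and expected-outcome labels are illustrative assumptions, not a real API:

```python
import csv
import io

# Illustrative CSV test data: each row is one username/password case
# plus the outcome we expect when we try to log in with it.
CSV_DATA = """username,password,expected
alice,correct-horse,success
alice,wrong-password,failure
,no-username,failure
"""

def load_test_cases(text):
    """Parse CSV test data into a list of dicts, one per test case."""
    return list(csv.DictReader(io.StringIO(text)))

cases = load_test_cases(CSV_DATA)
for case in cases:
    print(case["username"], "->", case["expected"])
```

In a real suite, the data would live in a separate file so testers can add rows without touching the test code, which is the whole point of separating logic from data.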
Connecting the data source
Once you have a data source, you need to link it to your test. With Selenium this can be quite easy. Say you are writing a Selenium test in Python: you can simply include a step to import your test data from a CSV file, then create a loop that runs through each entry in the data source. For more complex scenarios, it may be better to use XML as a data source. Again, you can import this into your script and parse it to extract the data and the expected result. If you need a database source, expect more setup work, such as managing connections and queries.
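The import-and-loop pattern can be sketched as follows. The `attempt_login` function here is a hypothetical stand-in for the Selenium steps that would fill in and submit a form; everything else is the data-driven scaffolding:

```python
import csv
import io

# Hypothetical stand-in for the Selenium steps (find fields, type,
# click submit). A real test would drive a WebDriver instance here.
def attempt_login(username, password):
    valid = {"alice": "correct-horse"}
    return "success" if valid.get(username) == password else "failure"

CSV_DATA = """username,password,expected
alice,correct-horse,success
alice,wrong-password,failure
"""

results = []
for row in csv.DictReader(io.StringIO(CSV_DATA)):
    # One loop iteration per data row: run the steps, compare outcomes.
    actual = attempt_login(row["username"], row["password"])
    results.append(actual == row["expected"])

assert all(results)
```

The test logic is written once; adding a new case is just adding a new row of data.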
Analyzing the result
One of the big challenges is analyzing the result. Sometimes there is only a binary outcome, but often there are many possible outcomes. One approach is a case statement that compares the actual outcome with the expected outcome. If the result is more variable, you might use XML to provide a richer description of the expected outcome. At the end of the day, this is where your test automation engineers show off their skill.
When should I use data-driven testing?
Data-driven testing is ideal in any scenario where the same test steps need to be repeated with different data. Typically, these are tests of application logic. For instance, you might want to test the registration flow as described above. Or you may be testing that your shopping cart logic is correct. Let's look at a few scenarios that are ideal for DDT.
Shopping cart logic. eCommerce sites employ complex shopping cart logic. You need to test this carefully since any errors could be expensive. Firstly, the cart must correctly calculate the total cost. This means you need to test with multiple combinations of items. Secondly, you need to test the logic around special offers. For instance, “buy one get one free”. Thirdly, you need to test voucher codes properly, including any logic relating to validity. Finally, there’s usually some complex logic relating to delivery costs.
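A data-driven check of cart totals might look like the sketch below. The price table, the `compute_total` function, and the "buy one get one free" offer rule are all illustrative assumptions:

```python
# Hypothetical price table and cart logic, for illustration only.
PRICES = {"tea": 2.50, "mug": 8.00}

def compute_total(items, offer=None):
    total = sum(PRICES[i] for i in items)
    if offer == "bogo-tea":
        # Buy one get one free on tea: every second tea is free.
        free = items.count("tea") // 2
        total -= free * PRICES["tea"]
    return round(total, 2)

# Each tuple is one data row: cart contents, offer code, expected total.
CASES = [
    (["tea", "mug"], None, 10.50),
    (["tea", "tea"], "bogo-tea", 2.50),
    (["tea", "tea", "tea"], "bogo-tea", 5.00),
]

for items, offer, expected in CASES:
    assert compute_total(items, offer) == expected
```

Voucher codes and delivery charges would be further columns in the same table, not further tests.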
User management. You need to test your user management flows thoroughly. For instance, you need to test any user registration flow with valid and invalid data. You also need to test changes and updates to user data. Finally, you may need to test permissions. For instance, does a given user have permission to perform an action? If you change their permissions, does the system update correctly?
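Permission checks lend themselves to a data table of role, action, and expected result. The role table and the `has_permission` helper below are hypothetical:

```python
# Hypothetical role-to-permissions mapping, for illustration only.
ROLE_PERMISSIONS = {
    "admin": {"read", "write", "delete"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

def has_permission(role, action):
    return action in ROLE_PERMISSIONS.get(role, set())

# Data rows: role, action, expected result.
CASES = [
    ("admin", "delete", True),
    ("editor", "delete", False),
    ("viewer", "read", True),
]

for role, action, expected in CASES:
    assert has_permission(role, action) is expected
```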
Localization. Most sites need to work across multiple regions with different languages and cultural norms. Even simple things like numbers vary significantly. For instance, in much of Europe, you write numbers and currencies in a completely different format (so $1,231.99 becomes 1.231,99$). And address formats change in every country as we mentioned at the start. This is known as localization. You need to test all these differences thoroughly.
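The number-format example can be turned into a data-driven check directly. This hand-rolled formatter is a self-contained sketch (a real suite might lean on a localization library such as babel instead):

```python
def format_currency(amount, region):
    """Render 1231.99 as '$1,231.99' (US) or '1.231,99$' (much of Europe)."""
    us = f"{amount:,.2f}"  # e.g. '1,231.99'
    if region == "US":
        return f"${us}"
    # Swap the separators: ',' becomes '.', '.' becomes ','.
    eu = us.replace(",", "_").replace(".", ",").replace("_", ".")
    return f"{eu}$"

# Data rows pair a region with its expected rendering of the same value.
CASES = [
    ("US", "$1,231.99"),
    ("EU", "1.231,99$"),
]

for region, expected in CASES:
    assert format_currency(1231.99, region) == expected
```

Address formats, date formats, and translated labels would all follow the same pattern: one formatter under test, one row of data per region.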
Is data-driven testing the right choice?
Here are some simple guidelines for deciding whether you should use DDT.
- Do you repeat the same test steps several times in different tests? Typically, this is the case for many user flows within your UI.
- Are there obvious happy and sad paths you are trying to test? For instance, will one set of data trigger an error message and a different set not?
- Do you need to test multiple variations of the same data? When you are testing application logic, you need to be able to vary the input data.
- Will different data generate different outcomes for a test? For instance, login flows, which will either log in the user or not.
- Are multiple tests run in parallel on the same system? If so, you may want to ensure you use different data for each test.
If you can answer yes to most of these, then you should consider looking at data-driven testing.
When to avoid data-driven testing
Of course, there are some scenarios when data-driven testing isn’t suitable. These fall roughly into two groups.
Frequency of testing. If you are only repeating a test occasionally, it may not be worth setting up data-driven testing. Equally, if you usually use a single set of data but only occasionally vary it, then data-driven testing may not be worth doing.
Complexity of the result. Data-driven testing needs you to be able to specify the expected outcome for each set of test data. In many scenarios, the outcome varies so much that it becomes difficult to do this. This is especially true if the different outcomes bring up completely different page views.
There are many other reasons not to use data-driven testing. Your team may lack the skills and experience to set it up. You may be just starting to create tests for a new product. You may decide that the effort saved is not sufficient to make it worth doing. Or the tests you have simply may not benefit from this approach! At the end of the day, you have to assess tests case-by-case to see if it's worth implementing data-driven testing. Alternatively, just use Functionize, which incorporates data-driven testing thanks to its advanced test data management capabilities. To find out more, book a demo now.