We live in an agile world, where features are developed and released incrementally based on growing customer needs. Automated tests have become a key component in this process to ensure defects are found as soon as a new feature is developed and the application code is checked in. These automated tests could be a mixture of unit, API and UI-level tests. They are kicked off automatically as part of the CI/CD pipeline, which has become mandatory in environments with rapid release cycles.
Most organizations realize the importance of automated tests, and a considerable amount of time and effort is spent hiring skilled engineers to write them. But once the tests are written, teams struggle to figure out how to integrate them into their existing CI/CD pipeline. They need to identify which tests to run, when to run them and how to structure them to get immediate feedback on the features being developed.
One approach that helps is to categorize the tests and run them at defined points during the different phases of the development process.
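One way to implement such categorization (a minimal sketch, assuming a Python test suite run with pytest; the marker names are illustrative) is to register custom markers that the pipeline can later use to select the right subset of tests:

```python
# conftest.py -- register custom markers so pytest recognizes them
# and does not emit "unknown marker" warnings.
def pytest_configure(config):
    config.addinivalue_line("markers", "smoke: quick checks run on every code check-in")
    config.addinivalue_line("markers", "regression: end-to-end flows run at least once a day")
    config.addinivalue_line("markers", "sprint: new tests written for the current sprint")
```

The pipeline can then pick a subset with pytest's marker expressions, for example `pytest -m smoke` on every check-in and `pytest -m "regression or sprint"` for the nightly job.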
The breakdown below can help testers understand how automation fits into the various phases of the agile process.
There are typically three types of automated tests that run in a CI/CD pipeline.
High-level smoke tests
These high-level automated tests run on every code check-in to ensure the critical functionality of the system is still working as expected. They could be a mixture of UI, API and unit tests. The objective is to get quick feedback about the system, so they should usually finish running within five to ten minutes.
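As an illustration (a minimal sketch, not a prescribed implementation: the base URL, endpoints and credentials below are hypothetical, and it assumes pytest plus the requests library), an API-level smoke check is kept deliberately small so the check-in stage stays fast:

```python
import pytest
import requests

BASE_URL = "https://staging.example.com"  # hypothetical test environment


@pytest.mark.smoke
def test_health_endpoint_responds():
    # Fast critical-path check: the service is up and answering.
    response = requests.get(f"{BASE_URL}/health", timeout=5)
    assert response.status_code == 200


@pytest.mark.smoke
def test_login_returns_token():
    # Critical functionality only; detailed flows belong in the regression suite.
    response = requests.post(
        f"{BASE_URL}/api/login",
        json={"username": "smoke-user", "password": "not-a-real-secret"},
        timeout=5,
    )
    assert response.status_code == 200
    assert "token" in response.json()
```

The check-in job would run these with `pytest -m smoke`, failing the build quickly if a critical path breaks.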
Daily regression tests
Daily regression tests ensure that newly added functionality did not break existing functionality. They are more detailed than smoke tests in that they cover end-to-end flows of the system. They are usually run at least once a day and often multiple times before a release.
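A regression test typically walks through a longer end-to-end flow. The sketch below (same assumptions as above; the ordering endpoints are hypothetical) shows the shape of such a test, which a scheduled nightly job would select with `pytest -m regression`:

```python
import pytest
import requests

BASE_URL = "https://staging.example.com"  # hypothetical test environment


@pytest.mark.regression
def test_order_lifecycle_end_to_end():
    # End-to-end flow: create an order, pay for it, then confirm it is ready to ship.
    created = requests.post(
        f"{BASE_URL}/api/orders", json={"sku": "ABC-123", "qty": 1}, timeout=10
    )
    assert created.status_code == 201
    order_id = created.json()["id"]

    paid = requests.post(
        f"{BASE_URL}/api/orders/{order_id}/pay", json={"method": "card"}, timeout=10
    )
    assert paid.status_code == 200

    final = requests.get(f"{BASE_URL}/api/orders/{order_id}", timeout=10)
    assert final.json()["state"] == "ready_to_ship"
```

Scheduling is a pipeline concern: most CI servers can trigger this job nightly and again before a release.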
Sprint-level tests
These are new automated tests written as part of the sprint. When a new test is completed during the sprint, it starts running alongside the daily regression tests, and it is formally merged into the regression test suite, usually at the regression testing phase.
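One lightweight way to handle this promotion (again a sketch with pytest; the feature and marker names are hypothetical) is to tag in-sprint tests separately and re-tag them once they are ready to join the regression suite:

```python
import pytest


def apply_discount(total: float, code: str) -> float:
    """Hypothetical stand-in for the feature built in the current sprint."""
    return total * 0.9 if code == "SPRING10" else total


@pytest.mark.sprint
def test_discount_code_reduces_cart_total():
    # Written alongside the feature during the sprint. Once the feature ships,
    # re-tag this test with @pytest.mark.regression so the daily run picks it up.
    assert apply_discount(total=100.0, code="SPRING10") == 90.0
```

During the sprint the team can run just the new checks with `pytest -m sprint`; promotion into the regression suite is then a one-line marker change.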
In the regression testing phase, all of the above automated tests continue to run in parallel with manual testing. In the acceptance testing phase, teams execute high-level acceptance test plans to ensure the critical functionalities of the system still work as expected after the new features have been merged into the main code branch. The smoke and regression tests are also run again in parallel.
Throughout this process, teams continue to do manual scripted testing, exploratory testing and risk-based testing in conjunction with automation, depending on the project timeline and the resources available.
Writing automated tests is just one aspect of test automation. Teams need to think on a much broader scale about how to get maximum coverage and faster feedback from the time and effort spent writing these tests.
All the approaches described above become more relevant as teams try to release faster by implementing a seamless CI/CD pipeline.
Some teams are bigger than others, some applications are more complex, and organizations range from huge companies to startups. But no matter what, the process remains the same: teams need to collaborate and settle on a cadence for organizing and running the automated tests as part of their CI/CD pipeline.
Test automation has many more facets than just writing code. Apply these principles based on your own context.