Automated UI testing is one of the biggest investments companies make as part of the testing process. The investment tends to pay off, though: with the help of various tools, these tests can be created by both technical and nontechnical testers, they simulate real user journeys through the user interface, and they validate functionality end to end.
But many companies are hesitant to implement automated UI tests because suites tend to become slower and more brittle as the number of tests grows. How can you make the test suite faster, leaner and more stable?
Here are five practices that can build a robust, reliable, and faster suite of UI automated tests.
1. Use explicit waits
There are different ways to make a UI test wait for a particular element, such as Thread.Sleep() or implicit, explicit and fluent waits. A good practice is to use explicit waits wherever possible: the test pauses only until a specific condition is met, such as an element becoming visible or clickable. This speeds up tests because they never wait longer than necessary, unlike Thread.Sleep(), which blocks for a fixed amount of time regardless of when the element appears.
Conditional waits like explicit waits help to make the test smarter and much faster.
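In Selenium this pattern is provided by WebDriverWait together with expected conditions; the minimal sketch below implements the same idea generically, without a browser, so the mechanics are visible. The `wait_until` helper and the simulated element are illustrative names, not part of any library.

```python
import time

def wait_until(condition, timeout=10.0, poll_interval=0.5):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    This mirrors what an explicit wait does: the test resumes the moment
    the condition is met, instead of sleeping a fixed amount of time.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise TimeoutError(f"Condition not met within {timeout} seconds")

# Simulate an element that only "appears" after 0.2 seconds.
appears_at = time.monotonic() + 0.2
element = wait_until(
    lambda: "submit-button" if time.monotonic() >= appears_at else None,
    timeout=2.0,
    poll_interval=0.05,
)
```

The test proceeds as soon as the element is available (after roughly 0.2 seconds here) rather than burning the full timeout, which is exactly why conditional waits outperform fixed sleeps.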
2. Run tests in parallel
Organizations want quick feedback about the application under test before it is released. In the past, teams had one server machine and ran all their tests on that single server. Times have changed, and there are now various options for running more tests, faster and in parallel.
There are open source solutions like Selenium Grid and third-party tools such as Sauce Labs, BrowserStack and Ranorex that can run multiple tests in parallel and in the cloud. There is no need for a physical machine anymore.
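The payoff of parallelism is easy to demonstrate. The sketch below stands in for a test suite with stub tests (the browser work is simulated by a short sleep) and runs them concurrently with Python's standard thread pool; real suites would use their framework's parallel runner (e.g., Selenium Grid nodes) instead.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_test(name):
    """Stub for one UI test; the sleep stands in for real browser work."""
    time.sleep(0.2)
    return (name, "passed")

test_names = [f"test_{i}" for i in range(8)]

start = time.monotonic()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_test, test_names))
elapsed = time.monotonic() - start
```

Run serially, these eight tests would take about 1.6 seconds; with four workers the wall-clock time drops to roughly 0.4 seconds, and the same ratio scales to real suites spread across grid nodes or cloud browsers.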
3. Create granular tests
Every test you create should have a single responsibility and test a single feature. This is one of the best practices for writing stable tests. Trying to validate multiple functionalities within one test is bad test design: it bloats the tests and makes troubleshooting and maintenance a nightmare in the long run.
Structure your tests so that the object definitions and the implementations are separated. For example, you can use the Page Object Model to separate the IDs of elements from the actual tests and call the IDs during run time. It makes the tests more reusable and maintainable.
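A minimal sketch of the Page Object Model, using a fake driver so it is self-contained: the locators live in the page class, and tests only call its methods, so a changed ID is fixed in one place. `FakeDriver`, `LoginPage`, and the element IDs are illustrative assumptions, not from any real application.

```python
class FakeDriver:
    """Stand-in for a real WebDriver so the sketch runs without a browser."""
    def __init__(self):
        self.actions = []

    def find_element(self, by, value):
        self.actions.append((by, value))
        return self  # pretend every lookup succeeds

    def send_keys(self, text):
        self.actions.append(("send_keys", text))

    def click(self):
        self.actions.append(("click",))

class LoginPage:
    """Page object: element locators are defined here, not in the tests."""
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("id", "login-button")

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()

driver = FakeDriver()
LoginPage(driver).log_in("alice", "s3cret")
```

Tests now read as intent ("log in as alice") rather than as a list of element lookups, which is what makes them reusable and maintainable.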
If you’re using Ranorex Studio to automate your tests, the Object Repository handles this for you. To learn how it works, watch our video tutorial below.
4. Focus on the best location strategy
One of the biggest reasons UI tests are brittle is that teams do not pay attention to the location strategy they use to find elements on the page. The easiest way to locate an element is with an XPath, but it is also one of the worst location strategies: XPath expressions break whenever the DOM structure changes, so they tend to be unreliable and should be used only when there is no other option.
A good rule of thumb is to use the below locators, which are more stable than XPath:
- ID (but avoid using dynamic IDs)
- Class name
- Tag name
- Link text
- Partial link text
- CSS selector
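One way to enforce this preference in a framework is a small helper that, given the locators recorded for an element, always picks the most stable one available and falls back to XPath only as a last resort. The function name and the data shape below are assumptions for the sketch, not a real library API.

```python
# Preference order matching the list above, with XPath as the last resort.
LOCATOR_PREFERENCE = [
    "id", "class name", "tag name", "link text",
    "partial link text", "css selector", "xpath",
]

def best_locator(available):
    """Pick the most stable locator type available for an element.

    `available` maps locator type -> value,
    e.g. {"xpath": "/html/body/div[3]/form/button", "id": "submit"}.
    """
    for kind in LOCATOR_PREFERENCE:
        if kind in available:
            return kind, available[kind]
    raise ValueError("No supported locator available")

locator = best_locator({
    "xpath": "/html/body/div[3]/form/button",
    "id": "submit",
})
```

Here the helper returns the ID locator even though an XPath was recorded, so a reshuffled DOM does not break the test.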
Ranorex Studio uses a stable, proprietary location strategy, the RanoreXPath, which offers a robust, flexible, and reliable way to locate elements.
5. Remember that not everything has to go through the UI
The more times you open up a UI to run different test scenarios, the longer it takes for tests to complete and to get feedback on the application. Not everything has to go through the UI; you can interact with the page under the hood without rendering it in a browser.
There are tools like Headless Chrome, PhantomJS, Zombie.js, HtmlUnit, Watir-webdriver and many more to validate page elements through the DOM without needing to visually launch the web browser. This drastically reduces the execution time of web UI tests. Try to have a combination of UI and headless tests in your overall UI testing strategy.
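Tools like Headless Chrome render the page and execute JavaScript; for static markup, even a plain parser can assert DOM structure with no browser at all. The sketch below uses Python's standard-library `html.parser` to check that expected elements exist in a page; the sample markup and class name are illustrative assumptions.

```python
from html.parser import HTMLParser

class ElementCollector(HTMLParser):
    """Collect element IDs from markup so page structure can be asserted
    without ever launching a browser."""
    def __init__(self):
        super().__init__()
        self.ids = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "id":
                self.ids.add(value)

page = """
<html><body>
  <form id="login-form">
    <input id="username"><input id="password">
    <button id="submit">Log in</button>
  </form>
</body></html>
"""

collector = ElementCollector()
collector.feed(page)
```

Checks like this run in milliseconds, so pushing structural validations out of the browser and reserving full UI runs for true end-to-end scenarios is what makes a mixed strategy pay off.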
Following the above strategies will help to make your UI tests much faster, leaner and more stable.