Advanced image-based automation
There are situations where image-based automation is essentially the right solution to a testing challenge, but fails in practice unless it is specified and configured in more detail.
If the image to be identified is graphically (optically) modified in any way, image-based automation with default settings and configuration will fail. The modification can be caused by screen activation (i.e. highlighting) or any other form of graphic change. Another challenge is that the reference image may not be unique, so that alternative (i.e. wrong) images are identified instead of the reference image.
This section shows how to configure and specify image-based automation so that it still works in these situations. Whereas text-based automation is clear and simple to apply, image-based automation sometimes requires creativity and the knowledge to solve problems as they arise.
In this chapter
Image not found improvement
Sometimes, minor changes in UI-elements, such as a differently colored background or font color, may lead to test run failures. The challenge is that a UI-element needs to be identified correctly regardless of minor changes in its graphical representation. This section outlines and explains how to achieve this.
Test example recording
Follow the instructions below to record a test with a ‘built-in’ test run failure.
Image not found – test example preparation
Track & identify an example date in the calendar view
See the resulting action table with the recorded action item
Test run & test result
Now let’s see the result of a ‘normal’ test run and the result when the corresponding UI-element is slightly changed graphically.
Image not found – test run
Success test run
- The test run with the non-modified calendar view leads to the predicted successful test run
- See the success test report with the correctly identified UI-element
Failure test run
- The UI-element is slightly changed graphically – clicking it gives it a grey background
- The UI-element no longer looks as it did when it was originally tracked and identified; therefore, the test run fails
- With the default image-based settings, Ranorex cannot identify the changed UI-element
Initially tracked & identified UI-element and changed UI-element
Failure message in test report
- Changed image with grey background cannot be found
- Test run is aborted
It is necessary to change the image-based settings to solve the problem. To do so, open the action properties and follow the instructions below.
Opening and changing image-based property settings
Open the drop-down list of the Image-based properties
Setting image-based properties
- The preprocessing options implement several graphics filters
- These filters make UI-element detection robust against changes in color, sharpness, and other aspects of graphic appearance
- Select the filters as shown in the image
- Pre-processing steps remove some of the image differences, but not all of them
- Therefore, it is also necessary to lower the default 100% similarity requirement
- Set the similarity factor as shown in the image
The various image-based properties like similarity factor and graphic filters are introduced and explained in detail in > Ranorex Studio advanced > Image-based automation > ⇢ Image-based properties.
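The interplay between a preprocessing filter and the similarity factor can be sketched in a few lines of Python. This is a conceptual illustration only, not Ranorex’s internal algorithm or API: images are modeled as flat lists of RGB pixels, and the threshold filter stands in for the preprocessing options selected above.

```python
# Conceptual sketch (not Ranorex code): how a preprocessing filter
# and a similarity factor interact when a UI-element's background
# changes from white to grey.

def luminance(pixel):
    """Perceived brightness of an (r, g, b) pixel."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def threshold(image, cutoff=128):
    """Preprocessing filter: binarize pixels so light greys and
    white collapse to the same value (1), dark pixels to 0."""
    return [1 if luminance(p) > cutoff else 0 for p in image]

def similarity(reference, candidate):
    """Fraction of pixels that are identical in both images."""
    matches = sum(1 for a, b in zip(reference, candidate) if a == b)
    return matches / len(reference)

# Reference image recorded on a white background ...
reference = [(255, 255, 255)] * 9 + [(0, 0, 0)] * 3
# ... and the same element at test-run time with a grey highlight.
candidate = [(204, 204, 204)] * 9 + [(0, 0, 0)] * 3

similarity(reference, candidate)                          # 0.25 – fails the default 100% match
similarity(threshold(reference), threshold(candidate))    # 1.0  – matches after filtering
```

After the threshold filter, the white recorded background and the grey highlight collapse to the same value, so the element is found again; residual differences that a filter cannot remove are then covered by lowering the similarity factor.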
Improved test run
Now re-run the test and see the improved result with the successful test report.
Improved test run with successful test report
Modified calendar view with date and grey background
Success test report with successfully identified UI-element
Wrong image found improvement
In some cases, image-based automation leads to curious effects. One of these, and its solution, is introduced and discussed here.
Test example preparation & recording
Assume the tracking and identification of a calendar date in the calendar view of the demo application. Further assume that the initially tracked and recorded UI-element can no longer be found, and that an alternative, wrong UI-element is found instead.
Wrong image found – test example preparation
Track & record a calendar date in the current calendar view of the demo application
See the image-based selection of the date with the surrounding pink frame
The result is an action item in the action table
Resulting test run
Let’s assume that the initially recorded UI-element is graphically different from the UI-element to be identified during the test run.
Wrong image found – test run
May 7th is the initially tracked & identified UI-element of the calendar view. Compared to the recording, it now has a grey background and therefore cannot be identified by Ranorex
Instead, Ranorex searches for similar UI-elements and finds an occurrence at the bottom of the calendar view, within the line Today: 5/7/2018
The solution is to refine the image-based object selection. This is done by means of the built-in image editor.
Image-based object selection refinement
Default image-based selection: The default image-based selection usually surrounds the object very tightly, which is why the alternative match is found.
Object selection refinement: Manually refining the image-based selection helps identify the desired object and eliminates matches with alternative, similar UI-elements.
The image-based selection can be refined and manually set by means of the built-in editor.
Image-based selection refinement with image editor
Select the action item and click the button for opening the image editor
Refine and modify the image-based selection manually
Click OK to confirm the selection
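The effect of enlarging the selection can be illustrated with a small sketch. This is not Ranorex code: the screen is modeled as rows of text rather than pixels, but the matching logic is the same – a tightly cropped reference matches both the calendar date and the Today line, while a selection that includes surrounding context matches only once.

```python
# Conceptual sketch (not Ranorex code): why refining (enlarging) the
# image-based selection removes alternative matches.

screen = [
    "Mo Tu We Th Fr",
    " 5  6  7  8  9",       # calendar row containing the date "7"
    "Today: 5/7/2018",      # status line that also contains a "7"
]

def find_all(screen, template):
    """Return (row, column) of every occurrence of the template."""
    hits = []
    for row_index, row in enumerate(screen):
        col = row.find(template)
        while col != -1:
            hits.append((row_index, col))
            col = row.find(template, col + 1)
    return hits

find_all(screen, "7")         # two hits – the tight selection is ambiguous
find_all(screen, " 6  7  8")  # one hit – the enlarged selection is unique
```

The tight template "7" corresponds to the default selection that surrounds the object very closely; the enlarged template corresponds to the manually refined selection confirmed in the image editor.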
Automated testing based on the detection of UI-elements by means of text or other verifiable attributes is straightforward. Image-based automation, on the other hand, is sometimes experimental and requires a good deal of experience, knowledge, and creativity in solving problems and facing challenges. Image-based automation works well if you follow a few simple recommendations:
Try to identify an image or sub-image that is as unique as possible, with no or few possible alternative matches or similar sub-images.
Make your reference image as small as possible and as large as necessary. This may require some testing and experimenting.
Never apply a similarity factor of 1.0 unless you really want to test for an exact match! Lower the factor toward 0.95 only as far as necessary, and keep it as close to 0.99 as possible. In fact, a similarity factor of 0.98 or 0.99 should work for most of your test challenges.
Improve your image-based selection mechanisms with pre-processing properties.
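The last two recommendations can be illustrated with a short sketch (again plain Python, not Ranorex code): a similarity factor of 1.0 fails on a single changed pixel, while a slightly relaxed factor still identifies the element.

```python
# Conceptual sketch (not Ranorex code): why an exact-match similarity
# factor of 1.0 is fragile.

def similarity(reference, candidate):
    """Fraction of pixels that are identical in both images."""
    matches = sum(1 for a, b in zip(reference, candidate) if a == b)
    return matches / len(reference)

def is_match(reference, candidate, factor):
    """A candidate matches if its similarity reaches the factor."""
    return similarity(reference, candidate) >= factor

reference = [255] * 100            # 10x10 reference image, flattened
candidate = [255] * 99 + [254]     # same image with one anti-aliased pixel

is_match(reference, candidate, 1.0)   # False – a single pixel breaks the exact match
is_match(reference, candidate, 0.98)  # True  – similarity is 0.99
```

A single anti-aliased or re-rendered pixel is enough to break an exact match, which is why a factor of 0.98 or 0.99 is usually the more robust choice.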