July 16th, 2006

UI Testing Concerns

The Visual C++ Integrated Development Environment (IDE) includes an assortment of features: the VC++ Project System, browsing and IntelliSense, the debugger, the Resource Editor, code and project wizards, and more.  These features are exposed to the user through the UI and/or through backend engine-level components.  In testing them, QA must employ various techniques to cover both the UI and object model areas of feature exposure.  In this post I’m going to discuss the many issues QA faces when developing and using automation frameworks for testing the UI portion of our features.

 

In the Developer Division, much emphasis is put on high quality and on advanced QA testing techniques and tools.  Many of our QA tools have large numbers of contributors and employ software development practices rivaling those of many consumer software companies.  In developing these large-scale testing tools, extensive consideration must be given during design and development to the wide range of concerns the tool will address.  Testing UI through automation poses many issues that must be addressed in order to have a reliable and useful test system.  The concerns can be divided into several major areas:

 

Automation layer development and maintenance:

 

Consideration must be given to how long the tool and/or its libraries will be expected to stay in use.  Many “great” automation support layers have not made it past one release of the product.  Maintenance is also a large concern: how well will this automation run on the next version of Windows?

 

The UI of the product can be very volatile during development; how well will the automation support layers respond to this?  Writing support layers on top of the core automation components helps keep the changes required in the automation to a minimum when the product changes, as in the sketch below.
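To illustrate the idea (a simplified C++ sketch, not our actual framework), a thin wrapper can give tests a named helper for each piece of UI, so a caption or class-name change is absorbed in one place.  The namespace, helper name, and dialog caption below are assumptions made for the example:

// Minimal sketch: tests call a named helper instead of hard-coding
// captions, so a UI change touches one place only.
#include <windows.h>

namespace IdeUi
{
    // Hypothetical: the caption lives here; if the dialog is renamed in
    // a later build, only this constant changes.
    const wchar_t* kNewProjectCaption = L"New Project";

    // Find the top-level "New Project" dialog, or NULL if it is not open.
    HWND FindNewProjectDialog()
    {
        // "#32770" is the standard Win32 dialog window class.
        return ::FindWindowW(L"#32770", kNewProjectCaption);
    }
}

// A test then writes IdeUi::FindNewProjectDialog() rather than scattering
// FindWindowW(L"#32770", L"New Project") through every script.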

 

Speed and ease of test development.  The automation testing framework must be easy to use and provide sufficient documentation for anyone in the QA organization to readily ramp up and write tests.

 

Support layers must be designed so that when a test fails, the cause of the failure is easy to reproduce and diagnose through logs and/or by running the test under the debugger.
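As a purely illustrative sketch (not our real logging library), timestamping every UI action makes a failure log point at the exact step that went wrong:

// Log every UI action with a timestamp so a failing run can be traced
// back without immediately re-running under the debugger.
#include <cstdio>
#include <windows.h>

void LogStep(const char* action, const char* detail)
{
    SYSTEMTIME st;
    ::GetLocalTime(&st);
    std::printf("[%02d:%02d:%02d.%03d] %s: %s\n",
                st.wHour, st.wMinute, st.wSecond, st.wMilliseconds,
                action, detail);
}

// Example: LogStep("ClickButton", "New Project dialog / OK");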

 

Environments where the automation will run:

 

Testing through the UI presents a much larger test matrix than command-line or engine-based tests do.

 

Product SKUs/Editions.  An automated test needs to run against every SKU in which the product ships.  This became a large issue for us during the last product cycle when we added the Express SKU: the UI layout differed in some cases, along with other details such as the process name and the exe name.
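One way to keep such per-SKU differences out of individual tests is to centralize them in a table the framework consults.  The sketch below is illustrative only; the exact exe names and captions in it are assumptions, not the framework’s real data:

// Hypothetical sketch: keep per-SKU differences (exe name, main window
// caption) in one table instead of branching inside every test.
struct SkuInfo
{
    const wchar_t* name;        // SKU name used by the test harness
    const wchar_t* exeName;     // process image the test launches/finds
    const wchar_t* mainCaption; // substring expected in the main window title
};

// Values are illustrative, not the real shipping names.
const SkuInfo kSkus[] =
{
    { L"Standard", L"devenv.exe",     L"Microsoft Visual Studio" },
    { L"Express",  L"vcexpress.exe",  L"Visual C++ 2005 Express Edition" },
};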

 

Product language.  Visual Studio ships in many languages, and our automation runs on all of them.  Extensive use of resource string extraction is required here.
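For example (a rough sketch, with a placeholder DLL path and string ID rather than real values), the localized text can be pulled from the product’s resource DLL with LoadLibraryEx/LoadString instead of being hard-coded in English:

// Pull the localized caption out of a resource DLL rather than
// hard-coding the English text in the test.
#include <windows.h>
#include <string>

std::wstring LoadUiString(const wchar_t* resourceDllPath, UINT stringId)
{
    // LOAD_LIBRARY_AS_DATAFILE: we only want the resources, not code.
    HMODULE hMod = ::LoadLibraryExW(resourceDllPath, NULL,
                                    LOAD_LIBRARY_AS_DATAFILE);
    if (!hMod)
        return L"";

    wchar_t buffer[512] = {0};
    ::LoadStringW(hMod, stringId, buffer, 512);
    ::FreeLibrary(hMod);
    return buffer;
}

// A test can then do (path and ID are placeholders):
//   std::wstring caption = LoadUiString(L"<ide resource dll>", 1234);
//   HWND dlg = ::FindWindowW(L"#32770", caption.c_str());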

 

Various processor architectures (x86, x64, IA64) and OSes (WinXP, Win2k3, etc.).  Many issues can arise here, ranging from changes in paths (C:\Program Files vs. C:\Program Files (x86)) to differences in the OS window-drawing components.
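A small sketch of the path issue: asking the shell for the folder, rather than concatenating "C:\Program Files" by hand, keeps the same test working across architectures.  This is an illustrative helper, not part of our framework:

// Ask the shell where Program Files is instead of hard-coding the path.
#include <windows.h>
#include <shlobj.h>   // SHGetFolderPathW, CSIDL_PROGRAM_FILES
#include <string>

std::wstring GetProgramFilesDir()
{
    wchar_t path[MAX_PATH] = {0};
    // A 32-bit test process on a 64-bit OS gets "Program Files (x86)"
    // here, which is where the 32-bit IDE is installed anyway.
    if (SUCCEEDED(::SHGetFolderPathW(NULL, CSIDL_PROGRAM_FILES, NULL,
                                     SHGFP_TYPE_CURRENT, path)))
        return path;
    return L"";
}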

 

The install of the product you are testing can vary.  During the last product cycle of Visual Studio we had a standard or “layout” install and also a batch install (primarily used by developers).  Install paths and file locations differed between the two installs, causing more potential issues in automation.

 

Test runtime:

 

Automation must run in an extremely robust fashion if test run results are to be trusted and given validity in judging the quality of the product.  The single largest runtime concern in producing reliable tests is timing.  Making sure an automated test will run equally well on an extremely slow machine and on a cutting-edge fast machine is a very difficult task.  The key here is to create test waiting methods that don’t wait any longer than necessary, as we are always limited by hardware quantity and time.
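A minimal sketch of that waiting pattern, with illustrative names: poll the condition and return the moment it holds, rather than sleeping for a fixed worst-case interval:

// Wait for a condition, but never longer than necessary.
#include <windows.h>

typedef bool (*Condition)();   // e.g. "the Find dialog is visible"

// Returns true as soon as cond() holds, false only after timeoutMs.
bool WaitFor(Condition cond, DWORD timeoutMs, DWORD pollMs = 100)
{
    DWORD start = ::GetTickCount();
    for (;;)
    {
        if (cond())
            return true;                  // fast machine: returns quickly
        if (::GetTickCount() - start >= timeoutMs)
            return false;                 // slow machine: bounded wait
        ::Sleep(pollMs);
    }
}

// Usage:
//   bool opened = WaitFor(IsFindDialogVisible, 30000);  // up to 30 s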

 

Another major concern during runtime is asynchronous handling of dialogs.  As our lab machines are in constant use, at any given time the network IT software might pop up a dialog box forcing an OS update, or, on debug versions of the product, assert dialogs might come up.  The test needs to handle these unexpected dialogs gracefully, dismiss them or take the necessary action, and then continue with the test.
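As an illustrative sketch only (a real watcher would also log, take a screenshot, and check captions against a list of expected dialogs before acting), a background thread can scan for stray top-level dialogs and dismiss them:

// A "dialog watcher" thread that closes unexpected top-level dialogs so
// the test can keep running.
#include <windows.h>

static BOOL CALLBACK DismissIfDialog(HWND hwnd, LPARAM)
{
    wchar_t cls[64] = {0};
    ::GetClassNameW(hwnd, cls, 64);
    // "#32770" is the standard dialog class; a real watcher would also
    // compare the caption against the dialogs the test expects to see.
    if (::IsWindowVisible(hwnd) && ::lstrcmpW(cls, L"#32770") == 0)
        ::PostMessageW(hwnd, WM_CLOSE, 0, 0);   // politely close it
    return TRUE;                                // keep enumerating
}

static DWORD WINAPI WatcherThread(LPVOID)
{
    for (;;)
    {
        ::EnumWindows(DismissIfDialog, 0);
        ::Sleep(2000);                          // check every 2 seconds
    }
}

// Started once at the beginning of the run:
//   ::CreateThread(NULL, 0, WatcherThread, NULL, 0, NULL);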

  

As one can see, there are many concerns in writing effective and robust automation that runs through the UI.  We spend a lot of time and effort creating and maintaining QA tools to aid in UI test development.  I’m always happy to discuss UI automation ideas and techniques.  If the demand is there, I could continue to post on the ways we have gone about solving each of the above problems in our automation frameworks.

 

Forest W. Gouin

Visual C++ IDE

forest.gouin@microsoft.com
