Wednesday 25 June 2008

Test framework brief description

Here is a brief description of what happens when running tests with my test automation framework (and some future plans).

Data model:
Each test suite has one or more test scenarios, each of which has one or more test cases.
When a test suite is run, a unique test run is created, which has one or more test case execution queues.
All data is stored in an MS SQL Server database.
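The hierarchy can be sketched like this (a minimal JavaScript sketch; the names are illustrative only, not my actual table or column names):

```javascript
// Minimal sketch of the data model hierarchy (illustrative names only).
const testSuite = {
  name: "Regression",
  scenarios: [
    { name: "Login",  cases: [{ name: "Valid login" }, { name: "Invalid password" }] },
    { name: "Search", cases: [{ name: "Simple search" }] }
  ]
};

// Starting a suite creates a unique test run holding execution queues.
function createTestRun(suite) {
  const queue = [];
  for (const scenario of suite.scenarios) {
    for (const testCase of scenario.cases) {
      queue.push({ testCase: testCase.name, status: "Queued" });
    }
  }
  return { suite: suite.name, queues: [queue] };
}

const run = createTestRun(testSuite);
console.log(run.queues[0].length); // 3 queued test cases
```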

In automation GUI:
1. Select run mode, New or Resume (if Resume, select a test run that is not yet completed).
2. If New, select one or more test suites.
3. If New, for each test suite select test level, test duration in days, test environment and test tool.
4. For each test tool instance, select global run options like reporting, e-mail etc.
5. Press the run button for each test tool instance which is not already running.
6. A test run is started for each test tool instance via Scheduled Tasks.

For each test tool:
7a. Execute each test case in the queue until all are executed or cancelled (a cancellation can happen if a previous nested test case fails).
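The execution loop can be sketched like this (simplified JavaScript; in the real framework the cases are run through the selected test tool and every result is logged to SQL Server):

```javascript
// Sketch of the queue execution loop (simplified; illustrative names).
function executeQueue(queue, runCase) {
  let cancelled = false;
  for (const item of queue) {
    if (cancelled) {
      item.status = "Cancelled";
      continue;
    }
    item.status = runCase(item) ? "Passed" : "Failed";
    // If a nested test case fails, the rest of the queue is cancelled.
    if (item.status === "Failed" && item.cancelsRestOnFail) {
      cancelled = true;
    }
  }
  return queue;
}

const queue = [
  { testCase: "Setup", cancelsRestOnFail: true },
  { testCase: "Nested step" },
  { testCase: "Teardown" }
];
// Simulate the first case failing: everything after it is cancelled.
executeQueue(queue, item => item.testCase !== "Setup");
console.log(queue.map(i => i.status)); // [ 'Failed', 'Cancelled', 'Cancelled' ]
```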

In test run report GUI:
7b. Follow test run progress during run-time (reporting is now also available at step level).

In test case dashboard GUI:
7c. The latest test case results are shown during run-time for each version and configuration.

Known limits:
Parallel execution is not implemented (you cannot use multiple test tool instances for a single test run in order to decrease test execution time).

Supported tools:
QTP
TestBatchRunner (home-made tool for running Java batch jobs)
TestAPIRunner (home-made tool for Web Service, XML API and HTTP requests)

Future plans:
Implement support for more tools (thinking of trying Selenium).
Learn Java and OO (a natural language choice, since our AUT is written in Java and it allows for more collaboration with Development).
Rewrite the test framework using OO concepts in Java (it is currently in VBScript and JavaScript) if Selenium works out.

Wednesday 18 June 2008

Testing dashboard

I like dashboards...both using them as well as developing them. If based on proper metrics they give a good picture of the current status in a single view. However, finding good metrics in testing is not a trivial task since counting test cases is not a good idea (especially for those of us who mainly do exploratory testing instead of executing scripted test cases).

I took the "Rapid Software Testing" course by James Bach about a year ago and he introduced me to "The Dashboard Concept" (see slide 138 in RST Slides). At the time we had a similar concept in an Excel sheet, where colors were changed based on certain keywords, but Excel has some limits when it comes to concurrent usage.
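The keyword-to-color idea can be sketched like this (the keywords and colors here are illustrative, not the ones we actually used):

```javascript
// Hypothetical keyword-to-color mapping (keywords and colors are illustrative).
const statusColors = { "blocked": "red", "in progress": "yellow", "done": "green" };

function colorFor(keyword) {
  // Unknown keywords fall back to a neutral color.
  return statusColors[keyword.toLowerCase()] || "grey";
}

console.log(colorFor("Done"));    // green
console.log(colorFor("unknown")); // grey
```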

Now we are evaluating the use of my latest creation: AUTDashboard.zip



I have removed the Effort column from James' dashboard and I will probably add some extra columns for latest changed date and other things later on.

The dashboard aims at showing the test and quality status for an AUT, based on subjective conclusions for all test activities in a release/project.

If you want to try it out, download it and place the files in C:\ if you don't want to edit the html file. You also have to answer Yes to all security warnings in order to get it to work.

Friday 13 June 2008

Test automation report - source code

Here you can find the source code for my node report (see previous post): wwwroot.zip

I have enclosed an Excel file instead of my database, but if you take a look at the sheets you will get some clues about what my test case data model looks like.

Place the files under c:\Inetpub\wwwroot\ if you want to run the report immediately; otherwise you have to alter some hardcoded file paths in the html files.

If you have local IIS you open the report by this URL:

http://localhost/Index.html?1

where the number after the ? is an input parameter that selects which test run report to show.
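Reading the parameter can be sketched like this (an illustrative snippet; the actual html files may parse it differently):

```javascript
// Sketch of reading the test run id from the query string;
// location.search for "Index.html?1" is "?1".
function getTestRunId(search) {
  return parseInt(search.replace("?", ""), 10);
}

console.log(getTestRunId("?1"));  // 1
console.log(getTestRunId("?42")); // 42
```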

If the installation is successful, the page should look like this:


The report uses both VBScript and JavaScript (client side, which is why security warnings will occur) and the enclosed components are free to use.

Good luck! I am off sailing this weekend so don't expect any support for a few days :-)

Wednesday 4 June 2008

Test automation report

Ever since my first QTP crash back in early 2000, I have disliked the QTP native report. The layout and content are OK, but the fact that the result is not available until after the test run (and only if QTP did not crash) does not satisfy my requirements for automation.

So about a year ago I began to log test results outside QTP so that they become available during run-time. When switching from Excel to SQL Server I built some web GUIs to make it easier to follow the test progress...and now my latest report is looking quite good, I think (it reminds me a bit of QTP's native report):



The concept is simply this:


1. Read selected data from db
2. Render frames based on the data using JavaScript and VBScript (the pie chart requires Flashplayer)
3. Press update or refresh to get the latest results (if the test is running)
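The three steps above can be sketched like this (illustrative JavaScript; in the real report the data comes from SQL Server and the rendering builds html frames):

```javascript
// Sketch of the read-render-refresh cycle (illustrative names).
function renderReport(fetchResults, renderFrame) {
  const results = fetchResults(); // 1. read selected data from the db
  renderFrame(results);           // 2. render frames based on the data
  return results;
}

// 3. Pressing update/refresh simply runs the cycle again, picking up
// any new results written while the test is still running.
let db = [{ testCase: "Login", status: "Running" }];
const frames = [];
renderReport(() => db, r => frames.push(r.map(x => `${x.testCase}: ${x.status}`)));
db = [{ testCase: "Login", status: "Passed" }]; // test run finishes
renderReport(() => db, r => frames.push(r.map(x => `${x.testCase}: ${x.status}`)));
console.log(frames[1][0]); // "Login: Passed"
```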


It is based on my test case data model and I use a JS Tree to generate test case nodes:


If you click on a test case, all related data is shown in the main frame; the same happens when you click on a test suite node (one report can contain one or more test suites).

Does this look interesting? I will make a de-identified copy of the code and publish it later. The report does not require QTP and can be used with any tool that allows custom reporting. Most of the work is to implement good custom reporting (preferably to a database) from your test tool and to have a good test case data model.