
Automated Testing System - Statistics

I decided to gather some statistics about the automated testing system. These statistics were collected on Wednesday, May 6, 2009 at 4:00 AM GMT. Automatic generation of these statistics, along with analysis, is a feature I have in mind for ATS 2.0. I appreciate donations to the chipin (right), as this project requires a lot of development time.

From the data you can see that the test slaves have been running tests for the equivalent of 200 days. The system itself has only been running for 192 days, and not all of the data was included since some of it is inaccurate. That means the system has saved 200 days of developers' time! It is clear that the ATS is a vital part of test-driven development. Additionally, the time that would have been spent fixing regressions and new bugs has been drastically lowered.

Item                        Function    Value
Time testing                SUM         17,310,047 seconds (~288,500 minutes, ~200 days)
Test run (test suite)       COUNT       42,351
                            MAX         3,620 seconds (~60 minutes)
                            MIN         17 seconds
                            AVG         804 seconds (~13 minutes)
                            STDDEV_POP  783 seconds (~13 minutes)
Test (patch, times tested)  COUNT       6,953
                            MAX         86
                            MIN         1
                            AVG         10
                            STDDEV_POP  15
Test pass count             MAX         11,453
                            MIN         0
                            AVG         4,265
                            STDDEV_POP  4,910
Test fail count             MAX         6,989
                            MIN         0
                            AVG         9
                            STDDEV_POP  155
Test exception count        MAX         813,795
                            MIN         0
                            AVG         160
                            STDDEV_POP  9,893

One item you may notice is the maximum test exception count of 813,795. The patch that caused that many exceptions proved that our system is scalable! The patch is much appreciated. :)

I also saved the current test result breakdown, shown below.

Result distribution

The average test run length for each of the active test slaves can be seen below. This data only looks at the latest test run for each patch in the system.

Test slave  Average test length*
4           730 seconds
5           1,753 seconds
7           1,352 seconds
8           576 seconds
9           2,438 seconds
10          1,942 seconds
12          1,161 seconds
16          217 seconds

* Excludes test runs that do not pass the initial checks and fail before running the test suite.

Average Test Length

Automated Testing System 2.0 - New Features - Part 1

Over the next few weeks I plan to make a number of posts about the new features provided by ATS 2.0 and the benefits to the community. Currently, the system is in the final stages of deployment, but is not yet active. Please be aware that these features will be available once ATS 2.0 has been deployed. I appreciate donations to the chipin (right), as this project requires a lot of development time.

Server management

One of the major restraints holding back the expansion of the system has been the need to manually oversee the array of testing servers. The new system contains a number of enhancements that not only make it easier to manage the network, but also automate the task of adding new clients.

Client enable process
Upon a request to enable a client, a set of error cases is sent to the client along with the expected results.

The most important addition that makes all this possible is automatic client testing. Clients are automatically tested to ensure they are functioning properly. This is done through a set of tests that are sent to each test client along with an expected result. The results the client sends back are compared with the expected results, and that information is used to determine whether the client is functioning properly. Clients are tested on a regular basis to ensure that they continue functioning as expected.
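
To illustrate the idea (the function and array names below are made up, not the actual PIFR code), the comparison on the server boils down to something like:

<?php
// Hypothetical sketch of the automatic client check: the server compares the
// results reported by the client against the known expected results.
function ats_client_check(array $expected, array $reported) {
  foreach ($expected as $case_id => $expected_result) {
    // A missing or mismatched result means the client is not functioning properly.
    if (!isset($reported[$case_id]) || $reported[$case_id] != $expected_result) {
      return FALSE;
    }
  }
  return TRUE;
}
?>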

Another helpful change has been re-working the underlying architecture to use a pull-based protocol instead of a push-based protocol. This alleviates the issues caused when a client is unreachable for a period of time or is removed without notice.
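
As a rough sketch of what the pull model looks like from the client's side (the endpoint paths and the ats_client_run_test() helper are hypothetical, and the drupal_http_request() call uses the Drupal 6 argument order):

<?php
// Hypothetical pull loop: the client asks the server for work on its own
// schedule, so an unreachable or retired client simply stops asking.
function ats_client_poll($server_url, $client_key) {
  $response = drupal_http_request($server_url . '/next?key=' . $client_key);
  if ($response->code == 200 && !empty($response->data)) {
    $result = ats_client_run_test(unserialize($response->data));
    // Report the result back to the server.
    drupal_http_request($server_url . '/report?key=' . $client_key, array(), 'POST', serialize($result));
  }
}
?>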

Public server queue

Add client
Very simple screen that allows users to add a test client to the network.

Another improvement that will facilitate a larger testing network is the public server queue. Allowing anyone to add a server to the network is possible since the clients are automatically tested as described above.

The interface has been designed so that users may control the set of machines that they have added to the network. The system automatically assigns the client a key that must be stored on the client and is used for authentication. The process of adding a client to the master list is very simple and should provide an easy way for users to donate servers.
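
For illustration only, the key handling could be as simple as the following sketch (the function names are made up):

<?php
// Illustrative only: the master generates a random key for each client and
// later checks that requests carry the key it has on record.
function ats_client_key_generate() {
  return md5(uniqid(mt_rand(), TRUE));
}

function ats_client_key_valid($client_id, $key, array $known_keys) {
  return isset($known_keys[$client_id]) && $known_keys[$client_id] === $key;
}
?>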

If the system detects any issues with the client down the line, such as it becoming out of date, it will notify the server administrator of the problem and disable the test client. The system will continually re-test the client and re-enable it automatically if it passes inspection. Alternatively, the server administrator may request that the client be tested immediately after fixing the issue.

Multiple database support

The new system has been abstracted to allow for the support of PostgreSQL and SQLite in addition to MySQL. This is vital to ensure that Drupal 7 properly supports all three databases. Just as patches are not committed until they pass all the tests, patches will not be committed until they pass all the tests on all three databases (five environments with the database variations).
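
To give an idea of the abstraction involved (the interface and class names below are illustrative, not the actual PIFR code), each supported database implements the same small set of methods:

<?php
// Illustrative sketch: every supported database implements the same interface,
// so the rest of the system does not care which one it is talking to.
interface ats_database {
  public function create_database($name);
  public function drop_database($name);
}

class ats_database_mysql implements ats_database {
  public function create_database($name) {
    // $name is assumed to come from trusted configuration, not user input.
    db_query('CREATE DATABASE ' . $name);
  }
  public function drop_database($name) {
    db_query('DROP DATABASE ' . $name);
  }
}
?>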

SimpleTest 6.x-2.8 - fresh backport

I maintain a backport of Drupal 7 SimpleTest for use with Drupal 6 as the SimpleTest module 2.x branch. A fair amount of work goes into maintaining the code even though it is a backport.

Tonight I finished up a fresh backport of Drupal 7 and made a release along with a number of other changes. Please update to SimpleTest 6.x-2.8 and report any issues you find.

Note to developers

Since this release includes a fresh backport, it also contains an API change that you need to be aware of. All getInfo() methods need to be changed to static before using this release. If you are a module developer or maintain tests internally, please make sure you update them.

Old

<?php
function getInfo() {
  return array(
    'name' => t('[name]'),
    'description' => t('[description]'),
    'group' => t('[group]'),
  );
}
?>

New

<?php
public static function getInfo() {
  return array(
    'name' => t('[name]'),
    'description' => t('[description]'),
    'group' => t('[group]'),
  );
}
?>

Automated Testing System 2.0 - Final Steps

During the last several months I put a substantial amount of work into improving the Automated Testing System. Future posts will describe the exciting new features and the benefits to the community. If interested, a brief overview of some of the requirements can be found in the PIFR and PIFT issue queues.

For additional background, the original thoughts can be found at the following links:

Final steps
There are a number of steps that need to be completed before the Drupal community can reap the benefits of the new system.

  1. Security review of rewritten Project Issue File Test (PIFT) module that integrates with the project module on drupal.org.
  2. Someone familiar with SQLite, possibly one of the D7 maintainers, needs to write a PIFR DB driver to implement the required methods. MySQL and PostgreSQL have already been completed and can be used as examples. The driver is relatively simple, but will require manually connecting to SQLite since PIFR runs on D6, which does not support SQLite (see the rough sketch after this list).
  3. Update testing client setup/installation script where necessary.
  4. Deploy the current development system to project.drupal.org and create a parallel testing client network.
  5. Freeze the current test client network and extract the test ID map for use in the drupal.org upgrade.
  6. Upgrade and finalize test client network and test server (testing.drupal.org). Possibly move testing.drupal.org under the drupal.org infrastructure.
  7. Confirm upgraded testing network is functional.
  8. Plan for approximately 15 minutes of downtime on drupal.org.
  9. Update the PIFT code and run the data update on drupal.org during the downtime, using the test ID map extracted in #5.
  10. Watch deployed system closely and solicit community feedback and bug reports.
  11. Request additional hardware to use as community test clients (to allow for future expansion into testing contributed modules).
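
As a rough sketch for step 2, assuming PDO is available on the test client (the class and method names are illustrative; the real driver would mirror the existing MySQL and PostgreSQL drivers):

<?php
// Drupal 6 has no SQLite layer, so the PIFR driver connects through PDO directly.
class pifr_database_sqlite {
  protected $connection;

  public function connect($file) {
    $this->connection = new PDO('sqlite:' . $file);
    $this->connection->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
  }

  public function drop($file) {
    // An SQLite database is just a file, so dropping it means deleting it.
    $this->connection = NULL;
    if (file_exists($file)) {
      unlink($file);
    }
  }
}
?>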

Future
Once the second generation framework is in place and running smoothly I will begin work on finishing the last pieces required to allow for testing of contributed modules (D6 and D7) and Drupal 6 core. I will be writing more on the new features and UX improvements to look for during the upcoming deployment.

Summer of Code 2009 - Usability Testing Suite - Accepted

Google announced the accepted student proposals today, and my proposal was among them. The project will finalize the Usability Testing Suite, which was started during Summer of Code 2008. At the conclusion of this year's Summer of Code I hope to have the project ready for prime-time use, which will make it easier to conduct a number of different usability studies.

For a bit more detail on the project I have included the project abstract.

The Usability Testing Suite that I created during SoC 2008 needs additional work in order to make it ready for use. The existing module provides a powerful API that makes writing data collection plug-ins simple, but the user interface needs refinement and a screen recording plug-in would make the module much more helpful. Through this project I intend to finalize the Usability Testing Suite and make it ready for widespread use.

On another note, my father was also accepted! As confirmed by Károly "chx" Négyesi we are the first father-son pair to be accepted into Summer of Code.

I have attached the full project proposal that I submitted.


Drupalcon DC - Automated testing - Saving webchick time - the saga

I will be presenting Saving webchick time - the saga along with Kieran Lal at Drupalcon DC 2009. To quote the session abstract:

One of the major enhancements made to the Drupal development cycle has been the addition of a fully automated testing bot, built on the testing framework in Drupal 7.

This session will focus on automated testing as it relates to Drupal 7: its history and direction, the automated testing bot: what has gone into it and where the future leads, and most importantly what is the end gain to the Drupal community.

The framework has gone through a rather long and interesting history with a number of road-blocks and challenges that have been overcome. The session will tell the story of the framework and the benefits it provides to the community through the enhanced Drupal 7 development work-flow. Although the session will go into some technical details it will overall tell the story of the framework and where we plan to take it. The presentation should be interesting to most and provide a great time to throw out any comments or concerns.

After the presentation I will stick around to discuss our recent launch of the Boombatower Testing Service (BTS). If you have any questions, comments, or concerns please drop by.

SimpleTest never sleeps

I am sure by now most, if not all, are aware of the drupal.org upgrade to Drupal 6 and may have noticed some of the changes. After the upgrade was completed I went ahead and posted my updated hook_test() patch to Create hook_test(): move SimpleTest getInfo() out of test cases. Upon posting I noticed that the file was uploaded to:

http://drupal.org/files/

instead of

http://drupal.org/files/issues/

That may not sound like much of an issue, but it has a number of trickle-down effects.

  • Files are not renamed properly, meaning that files with the same name may exist in different directories.
  • Inconsistent data would be sent to testing.drupal.org, which would have caused issues.
  • Confusing URLs.

I talked with Chad "hunmonk" Phillips about it in IRC and discovered that it was due to the new File API. Drupal.org was put in maintenance mode, as many of you probably noticed, and Chad dove into the code. After discussing it for a while a "fix" was created.

The moral of the story...yet another bug found due to SimpleTest (the patch I was posting was related to SimpleTest). That of course is not to say it would not have been found soon enough, but SimpleTest indirectly found it first. :)

Abstract SimpleTest browser holds many possibilities: install and update scripts

Situation
The SimpleTest browser that is included in Drupal 7 core is very powerful, and I have found myself hacking it in order to use it outside of SimpleTest on a number of occasions, as have others. In addition, DrupalWebTestCase is becoming rather bloated. There is an issue to create pluggable backends for drupal_http_request, which could be cleanly done by abstracting the SimpleTest browser. The change would create a cleaner and more powerful API, and having a browser built into Drupal core would open up a number of possibilities, so it is something worth working on.

Solution
A while back I did some work towards abstracting the SimpleTest browser: not only moving it out of DrupalWebTestCase, but also improving and cleaning the code and writing it with a pluggable backend layer. The code is quite far along and just needs a bit more refinement before it can replace the existing SimpleTest browser. The next step would be to replace drupal_http_request or make it a wrapper. After that, areas of core that parse HTML and such can be cleaned up and simplified using the browser.
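
To illustrate what a pluggable backend layer could look like (the names below are hypothetical, not the code from the issue), the transport can hide behind a small interface:

<?php
// Hypothetical backend layer: the browser handles requests, redirects, and
// parsing, while the actual transport is swappable.
interface browser_backend {
  public function execute($url, $method = 'GET', array $headers = array(), $data = NULL);
}

class browser_backend_stream implements browser_backend {
  public function execute($url, $method = 'GET', array $headers = array(), $data = NULL) {
    // Plain PHP streams transport; a curl backend could implement the same interface.
    $context = stream_context_create(array('http' => array(
      'method' => $method,
      'header' => implode("\r\n", $headers),
      'content' => $data,
    )));
    return file_get_contents($url, FALSE, $context);
  }
}
?>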

Possibilities
With a browser in core there are a number of powerful tools that can very easily be added, for instance install and update scripts that can trigger remote servers! Think of having a single script on a development machine that runs the update script on all specified sites!
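
As a toy example of the idea, assuming the script runs from a bootstrapped Drupal site so drupal_http_request() is available (the site URLs are placeholders, and a real script would use the abstracted browser to log in and submit the update forms):

<?php
// Hit update.php on every listed site and report the HTTP response code.
$sites = array('http://example.com', 'http://dev.example.com');
foreach ($sites as $site) {
  $response = drupal_http_request($site . '/update.php?op=selection');
  print $site . ': HTTP ' . $response->code . "\n";
}
?>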

Installing via a script can be very useful not only for the obvious reasons, but also for modules like the Usability Testing Suite and Project Issue File Review that have to make separate installations of Drupal and currently must maintain their own install scripts. Having an abstracted installation script is necessary for testing.drupal.org to be able to function with patches that make changes to the installer.

I plan to work on this further, but have been busy with other projects lately and my recent video card experience. I would like to hear some feedback from the community and possibly recruit some help.

Related issues

Automated Testing System Development Clarification

After my most recent post concerning the Testbed design change and development I received a number of comments that led me to believe that the post was misunderstood. The comments suggest that readers believe my primary focus of development will be changing from a push to a pull model. The post seems to have been misleading in that regard.

The change to a pull architecture is a very minor change in terms of coding, as only two functions are even affected, but in terms of managing the network it helps out a lot. The majority of the development, for which I am raising $3,000, will be focused on implementing the ideas described in The Future of Automated Patch Testing. The feature additions will:

  • Give more control to server administrators (donated servers)
  • Make it easier to manage the automated testing framework
  • Allow for testing of multiple environment configurations
  • Open up testing of Drupal 6 core and contrib code.

I hope this has cleared up any misconceptions about the automated testing system as it stands and my plans for its future.

I appreciate any donations.

