Part 1: The woes of the testbot

The intent of this series of posts is not to assign blame, but rather to point out that the testbot needs full-time attention. Integral to this story are the decisions and circumstances that led me to stop working on SimpleTest in core and on the "testbot" that runs the automated tests. I intend to follow up this post with others dealing with the rejuvenation of the testbot and improvements to SimpleTest. I understand some will not agree with my position, but I would like everyone to understand my reasons and intentions, and how we find ourselves in the current state of affairs. After everything is out in the open, my hope is that a useful discussion will ensue and meaningful progress will result.


Four factors led me to stop working on SimpleTest in core and the testbot:

  • I no longer had copious amounts of free time.
  • I now had a need to make a living (and working on the testbot does not generate any income).
  • The core development process, being what it is, led to burnout and a lack of desire.
  • I was asked to stop working on the testbot in conjunction with the Drupal 7 code freeze.

My absence magnified the fact that no one else worked on the testbot and that, going forward, no one stepped up to take my place.


Let's start off with some background about my involvement with the Drupal testing story.

SimpleTest's journey to core

Rewind the clock back to early 2008. I had gotten involved in Drupal through GHOP and became maintainer of SimpleTest. I proceeded to perform a large-scale refactoring and cleaning up of SimpleTest. This, combined with other community efforts, resulted in SimpleTest being added to Drupal 7 core during the Paris Coding Sprint. The rapid pace at which I was able to develop SimpleTest quickly slowed as I no longer had the ability to commit changes nor make design decisions. Instead, even the most trivial changes took days or weeks to get committed. In spite of these additional challenges, I continued to diligently work on SimpleTest in core. To my dismay I discovered on multiple occasions that large changes were virtually impossible to push through the core queue, and I spent countless hours rerolling patches and refactoring code at various developers' whims. In the end, the patches simply died, but not for lack of quality or merit.

SimpleTest Transition to Core Commit Log
The chart shows 37 commits to the SimpleTest project before and after it was added to Drupal core. It is clear the pace of development slowed immediately and lessened further with time.

Changing course, I focused on small changes to SimpleTest in core, but ran into similar throughput issues. For all intents and purposes, my ability to contribute to SimpleTest had ground to a halt. This led me to write a blog post detailing the problem and possible solutions. I was not alone in my conclusions, and many would still like to see the problem resolved. I continued to contribute to core now and then, but I was completely burned out. I even took month-long breaks from Drupal, as attempting any contribution to core simply burned me out further. My burnout was not caused by overwork but by frustration with the exaggerated length of time required to land even a minor commit.

Following up SimpleTest with the testbot

On a parallel track, getting SimpleTest into core turned out to be only half of the battle; actually seeing the tests adopted and maintained remained a challenge. I led the charge to keep the tests in sync (initially doing so almost alone). The effort to create an automated system for running the tests had been underway for quite some time, but lacked the necessary volunteers and commitment to really get off the ground. I was then asked to take over the project, at which point I evaluated its status and decided to start over. I created PIFR, laid out a plan for realizing the goal, and proceeded to make rapid progress. The testing site launched shortly afterward, and testing became an integral part of the Drupal core workflow.

With a working system I then laid plans for a second iteration of the testbot with a number of improvements. After heavy development the second generation of the testing system was launched with a massively improved feature set.

Seeking sponsors

After graduating from high school I was no longer financially able to devote large portions of my time to the testing system or core development, so I sought sponsors to enable me to continue my work. Acquia provided an internship that allowed me to focus on testing again. After successfully completing the internship I found a job that allowed me to spend a portion of my time improving and maintaining the automated testing system, rolling out the initial work for contributed project testing along with a number of other improvements in ATS (PIFR and PIFT) 2.2. Contributed project testing with dependencies was labeled beta because it did not support specific versions and had known issues. The plan was to make a follow-up release to solve those issues.

Code freeze and the request to stop

After deploying PIFR 2.2, I was asked to stop making changes to the testbot to ensure the stability of the testing system during the final stages of Drupal 7 development. I continued to make improvements that I planned to deploy once the freeze was lifted, but the short freeze stretched into month after month. This delay ultimately forced me to stop development before the codebase diverged too far from the active testbot.

PIFR and PIFT commit log
The chart shows my combined commit activity for PIFR and PIFT and indicates the dramatic slowdown that occurred as a result of the freeze placed on the testbot.

During this time I was the only person who worked on the testbot in any significant capacity (or virtually at all). My availability for working on testing dwindled when my time with Examiner ended. This, combined with the stagnation forced upon the testbot, meant things simply ceased moving forward. The completeness of the stagnation is evident in the gap between the 2.2 and 2.3 releases of PIFR, made on January 28, 2010 and March 28, 2011, respectively. During that entire period of more than a year, no changes were made to the testbot. When changes were finally made, they were done merely out of necessity to accommodate the git migration.

Post-freeze undeployed features

Shortly after the 2.2 release I completed a number of improvements before things came to a stand-still. Some of the recent deployments have included functionality that I had completed, most notably:

  • Version control system abstraction and plugins for bazaar, cvs, git, and svn
  • Coder reviews in addition to testing
  • Beta support for contributed project testing with dependencies

Recent changes

As mentioned above, I had already abstracted the version control handling in the testbot and had four plugins (bazaar, cvs, git, and svn). Unfortunately, a number of assumptions had to be made due to limitations in the project module's VCS integration, and those assumptions had to be updated for the shiny new version control API. The changes required were very minor and did not represent any feature improvements; they were simply part of the work necessary to complete the git migration. Randy Fay made the necessary changes, and the testbot saw its first update in a very long time. A few small follow-ups were released as part of the planned phasing out of the old patch format and the like. It is interesting to note that the Drupal Association contracted out the other major components of the migration, but not the automated testing system.
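To illustrate the kind of version control abstraction described above, here is a minimal sketch; the interface and class names are hypothetical and do not reflect PIFR's actual API, only the general idea of per-VCS plugins behind a common interface.

```php
<?php
// Hypothetical abstraction: each VCS plugin knows how to produce the
// command needed to fetch a working copy of a repository.
interface VcsPlugin {
  public function checkoutCommand($url, $branch);
}

class GitPlugin implements VcsPlugin {
  public function checkoutCommand($url, $branch) {
    return sprintf('git clone --branch %s %s', $branch, $url);
  }
}

class SvnPlugin implements VcsPlugin {
  public function checkoutCommand($url, $branch) {
    return sprintf('svn checkout %s/branches/%s', $url, $branch);
  }
}

// The testbot would select the plugin matching the project's VCS type,
// keeping the rest of the system independent of any one system.
$plugin = new GitPlugin();
$command = $plugin->checkoutCommand('http://git.example.com/repo.git', '7.x-1.x');
```

The value of this shape is that swapping bazaar, cvs, git, or svn only requires a new plugin, not changes to the testing workflow itself.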

Jeremy Thorson has recently been working on using the testbot's ability to perform coder reviews to help solve the woefully broken project application process, which he describes in several blog posts. Again we see change coming to the testbot out of necessity rather than from a focused plan for improvement. For those not aware of it, the project application queue has several hundred applications, and it takes months to even receive a review. Jeremy has worked hard on improving the application process, at the heart of which is the ability to perform automated coder reviews. Providing automated reviews has been held back on multiple fronts, not the least of which is finding people to get things done. This is a definite hurdle considering that only three people have ever worked on the testbot code itself, not to mention that there has been an average of less than one active maintainer at any given time.

As mentioned above, I had deployed the first stage of contributed project testing over a year ago, but was forced to shelve the follow-up deployments. The code to properly handle module dependencies fell into disarray with the git migration and required refactoring to work with the version control API. Derek Wright and I spent a lot of time hashing out the details to ensure things were properly abstracted for the project module. I completed the code, but it was never committed and thus was not maintained through the migration. Randy took it upon himself to update the code, but deviated from the agreed upon design. This choice meant the code would not be included in the project module and has a number of other ramifications. The feature was rebuilt in a specific manner that precludes others from taking advantage of the code and eliminates the possibility of exposing the data through the update XML information. Exposing the data in that fashion would mean projects like drush, drush make, Aegir and others could discard code written to recreate this data or would now be able to support proper dependency handling. In addition, the recent deployment of dependency handling has led to large delays and instability in the testbot.


The decision to freeze the testbot in conjunction with the Drupal 7 code freeze made sense at the time. However, the extended freeze of the testbot (due to the extended Drupal 7 code freeze), along with moving SimpleTest into core, had the unintended and disappointing side effect of effectively stagnating the testing system. The only changes to the testbot in the past 20 months have been made out of necessity and annoyance (the git migration and the unfinished testbot integration with the project application process for new developers). During my tenure there, a fair number of changes were made to the testing system but never deployed. The module dependency code had been written over a year ago and finalized shortly thereafter, but it languished and was never deployed. Recently, some of these changes were finally deployed along with the git migration. All the while, I had set forth a detailed roadmap for the testing system.

The testing system had been stable and running for three years. Recent changes (implemented by others) have resulted in the ups and downs of the testing system. The importance of testing to Drupal development, coupled with the recent instability, strongly suggests the testing system requires full-time attention. The lack of feature changes since the 2.2 release of PIFR in January 2010 is a direct result of a lack of financial resources for testing, the lock-down of the testing system components, the burnout caused by the extreme difficulty of making changes, and the extended freeze placed on the testbot.

Various solutions were tried to enable continued work on the testbot, but none represented a viable long-term solution. In the end, my father and I decided the solution was to establish a business to advance testing for the Drupal community and to create an environment where we no longer have our hands tied behind our backs. In the next post, I will share the vision and passion we have for testing, along with several features that could be made available to the community immediately.

General quality assurance roadmap and request for help

As you may have seen in my previous post, I have been getting back into the swing of things after some time away. I have a number of projects that need attention, but as always not enough time to work on them all. Below you will find a list of my most important projects. If you have time available and an interest in helping out, I would appreciate any extra hands. A number of the items do not have links since I have yet to spec them out completely, but I will update this post as I fill them in.

  • Parse project .info files: present module list and dependency information - needed for contrib testing to work 100% of the time and to move it out of beta so anyone can enable testing of their project.
  • PIFT 2.3 (the website side) - In order to make the most of the QA framework, a number of improvements need to be made, mainly around options and information display, so everyone can take advantage of all the features the system already provides.
  • PIFR 2.3 (server and clients) - A fair amount of testing of the current development code base, a few more bug fixes, and a few features are needed to round out the next release.
  • A few remaining cleanup items for SimpleTest in Drupal 7 core.
  • SimpleTest 7.x-2.x - The branch will be used for continued development during the Drupal 7 life cycle and already contains a number of improvements over the base Drupal 7 testing system. It was designed so that it can coexist with the Drupal core framework and allow tests to use either framework (one framework per module). The switch module, which no longer works (I need to look into it), allows you to switch between the contrib and core versions with a single click. The switch module could also be used by other projects attempting to replace core modules, with a small amount of abstraction. With the next PIFT/PIFR release I plan to support projects that wish to use the 7.x-2.x API for testing.
  • Code coverage - Reports for all tests (patches or at least core/contrib commits).
  • Refactor/Rewrite SimpleTest for Drupal 8 - I have an extremely detailed roadmap in my head (and some paper notes) and a small code base that needs to be made public.
  • PIFR 3.x - Same as above, lots of plans, but not very much publicly available yet.
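As a rough sketch of the .info-parsing item above, the following is a simplified standalone parser, not Drupal's own info-file handling; it covers only plain `key = value` lines and repeated `dependencies[] = module` entries.

```php
<?php
// Simplified parser for Drupal-style .info text: handles "key = value"
// lines and repeated "dependencies[] = module" entries, skipping
// comments, blank lines, and anything malformed.
function parse_info_text($text) {
  $info = array('dependencies' => array());
  foreach (preg_split('/\r?\n/', $text) as $line) {
    if (!preg_match('/^\s*([a-z0-9_\[\]]+)\s*=\s*"?([^"]*)"?\s*$/i', $line, $m)) {
      continue;
    }
    if ($m[1] === 'dependencies[]') {
      $info['dependencies'][] = trim($m[2]);
    }
    else {
      $info[$m[1]] = trim($m[2]);
    }
  }
  return $info;
}

$info = parse_info_text("name = Example\ncore = 7.x\ndependencies[] = ctools\ndependencies[] = views\n");
// $info now holds the module's name, core version, and dependency list.
```

A real implementation would also need to handle nested array keys and quoted multi-line values, but the dependency list above is the piece contrib testing needs.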

Feel free to read through the issues, information, and code and let me know if you need any help getting started. Thanks!

Drupalcon SF - Quality assurance thoughts

Before I get to the actual body of this post, I would like to explain my somewhat distant behavior in the month or so since Drupalcon SF and the reason this post was so long in the making. I have been going through some life changes and issues that required most of my attention and left me with little time for the Drupal community. I have resolved the issues that were consuming my time and I am looking forward to picking up where I left off. Hopefully, you will see a lot more activity from me in the near future.

We had an educational discussion during the quality assurance break-out session at the Core Developer Summit. During the session we discussed the following topics.

  • JavaScript testing for Drupal
  • Site-builder testing tools
  • Drupal core performance tests
  • Ensuring it's easy to start testing

I was charged with leading the discussion and taking notes. The following are my notes of the conversation that took place during the session.

  • JavaScript testing for Drupal
    • Use testswarm to crowd source JavaScript testing.
    • Either test HEAD/branches only against tagged versions, or wait for a single browser result to come back and run on patches.
    • Determine the list of browsers/configurations we officially support and that must pass JavaScript tests.
    • Look into themes breaking JavaScript, possibly run core JavaScript tests against contributed themes.
    • Selenium seems limited in where and how it can be run.
  • Site-builder testing tools
    • Provide a base set of tests to ensure a Drupal site is functioning properly.
    • Provide the ability to run tests against non-Drupal sites, which can be useful when porting sites, working with sites that are not entirely written in Drupal, and for checking third-party integrations.
    • Maintain site-builder tools in 7.x-2.x branch of SimpleTest in contrib.
    • Possibly provide a slimmed down version of SimpleTest for use outside of Drupal.
  • Drupal core performance tests
    • Does not have to be complicated, but should simply provide a consistent benchmark.
    • Something like loading several URLs a number of times on the same server and configuration.
    • Have a scripted setup containing lots of content on server.
    • Provides another use-case for extracting the SimpleTest browser for use in core and elsewhere.
    • Simple graph of performance over time.
    • Possible initial performance suite
      • View several anonymous pages
      • Login
      • Create a node
      • Make a comment
      • View several administration screens
      • Load modules page (historically one of the slowest)
      • Logout
  • Ensuring it's easy to start testing
    • Use Selenium IDE combined with simpletest_selenium to make it easy to create basic tests.
    • Submit native Selenium IDE output with bug reports to make it simple for developers to re-create the bug and check if it still exists.
    • Could also be used by experienced developers to create a good basis for a test.
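The performance-test idea from the notes above (repeatedly exercising the same operations and recording a simple, consistent number) can be sketched minimally as follows; the helper is hypothetical and times an arbitrary callable rather than real page loads.

```php
<?php
// Hypothetical micro-benchmark helper: run $action $iterations times
// and return the average wall-clock time in milliseconds.
function benchmark(callable $action, $iterations = 10) {
  $start = microtime(TRUE);
  for ($i = 0; $i < $iterations; $i++) {
    $action();
  }
  return (microtime(TRUE) - $start) / $iterations * 1000;
}

// Stand-in for "load the modules page": simulate ~1ms of work.
$average_ms = benchmark(function () {
  usleep(1000);
}, 5);
printf("average: %.2f ms\n", $average_ms);
```

Graphing such an average per commit on a fixed server and configuration would give the simple performance-over-time view discussed in the session.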


After letting everything digest, I have a number of thoughts regarding the discussion and the ideas presented, as well as a few additional pieces of information. First of all, I want to share my thoughts on JavaScript testing, as I am not sure I was able to properly present this idea in person.

I look at JavaScript testing the same way I look at the current PHP-based testing we do. We assume a number of things work and are tested by other organizations, and so we do not duplicate those testing efforts ourselves, which is a wise decision. What I mean is that we assume the PHP language works as expected, that the SQL language and database engine work, and that a number of other components function. Nowhere in our testing system do we attempt to ensure that PHP language constructs behave as they should. We should treat jQuery as the language that it is and assume, just as with PHP, that it functions properly in all supported environments.

The implications of the above may not be clear. It implies that we should not spend time ensuring that our components and JavaScript interactions function in all browsers, environments, and operating systems. Instead we leave that job to the folks at jQuery, who already do extensive testing. Drupal should focus on ensuring that the widgets/components that core provides, such as the Form API autocomplete and the ctools framework, function properly. This means that we can use a tool like Crowbar or WebKit to interpret the JavaScript/jQuery and run our tests in that manner.

Attempting to test our JavaScript implementation on the infinite number of environments that exist adds a large amount of complexity to our work-flow and, as far as I can tell, very little gain. Unless someone can come up with a solid reason why we need to run our JavaScript tests on many environments, I do not feel the idea is worth any more consideration. Oddly enough, even the proponents of the idea seemed resigned to waiting for a single environment to return before reporting the results. It seems we have a lot of interest in the idea with little concern given to the implementation or the cost versus benefit.

I propose we evaluate JavaScript testing frameworks with this in mind. We also need to be aware that we do not need to re-test the whole of Drupal using a JavaScript testing framework. On the contrary, we need to ensure that our components and interactions work in a generic form and leave the actual testing of the final operations, such as submitting a form, to the already existing PHP tests. Maintaining two suites of tests that cover the same ground would be a big mistake that I hope we do not make.

Selenium vs Drupal Testing

To solidify this point further, let's compare the popular browser testing framework Selenium to our current PHP testing framework. After you boil down the features and purpose of the two systems, you discover that they both focus on the same key ability: submitting forms and performing actions as a user would. The area Selenium lets us test that our current system does not is JavaScript interpretation, while our current system allows us to test the PHP API directly, interact with the database, and even perform unit testing. So in order to give ourselves a fully rounded test framework, we simply need to fill in the small bit that our current system does not cover.

More specifically, we need to be able to test our JavaScript behaviors and components built on top of jQuery. Something more along the lines of QUnit seems appropriate, since it focuses on doing just that. We will most likely develop some wrapper code for Drupal-specific things, but the library provides us with a much closer starting point. There is already a patch that takes us most of the way.


The site-builder tools discussed will be maintained in the SimpleTest 7.x-2.x branch and hopefully committed to Drupal 8 once development has begun. I will continue to improve the testing tools provided to site-builders in the 2.x branch and will also provide back-ports of these tools to the SimpleTest 6.x-2.x branch. Since these tools are relatively new, I appreciate feedback.

Drupal quality assurance requires a decision

Many of you may be familiar with my previous post, Diaries of a Core Developer; sadly, nothing has changed. We had a lot of good discussion in the comments, but as far as I can tell nothing has been done to improve the development workflow.

Drupal QA has had, and continues to have, issues due to a complex problem with the core design of SimpleTest. I documented this problem some time ago and made several attempts to refactor SimpleTest (before and after). The patches were quite large since they did a proper overhaul of the system. I was told to break them into smaller pieces so they would be easier to review, but after discovering how intertwined the code was, and not receiving any other real feedback, I burned out after a week of working on the patch (compounded by the lack of time others were willing to spend).

Just as the database layer and other large changes required large patches and lots of follow-up patches (even SimpleTest itself was introduced in a large patch), so does this refactoring. I briefly described what needs to be done and some of the benefits in the original issue. You can find more details on the refactoring in one of my later consolidation issues, in which I attempted to explain the design and work through the issue.

As I explained in the issue, there are some hackish ways we can work around the problem for the time being. I am happy to implement those fixes, provided I am not wasting my time. I still believe an overall refactoring is in order to fix a number of problems, including proper isolation of test processes and a clean way of gathering errors.

Before I commit a large amount of time, again, to refactoring the system, I would like confirmation that someone will actually review the patch, that those involved are fine with the changes, and that we will see this through to commit (preferably with some assurance from Dries or webchick). Of course, this leads back to the fact that I have no authority as a maintainer to make this crucial decision, and that the core maintainers are bogged down reviewing ALL patches instead of distributing the load to sub-maintainers, with the core maintainers reviewing the sub-maintainers' patches and overall changes.

Fresh SimpleTest backport - 2.9


Given that:

  • the last official backport was April 23, 2009
  • a number of very cool features have been added in core in addition to incremental changes
  • people have been asking
  • Drupal 7 is in "code slush"

it seems appropriate to perform another backport. I finished the bulk of the backport during Drupalcon Paris and have been tweaking and fixing bugs based on feedback since then.

To ensure that all the new features from Drupal 7 were available, it was necessary to create a patch against Drupal 6 core, which needs to be applied before installation, as described in INSTALL.txt.

Please update and report any issues in the queue and have fun with the new features and proper error reporting!

test run

Drupal 7: debug() and SimpleTest->verbose()

Recently I made two major improvements to debugging in Drupal 7: the addition of a general debug() function and a verbose mode for SimpleTest. The two additions make it much easier to debug problems quickly through a consistent method. Take a look at what chx said via Twitter:

Writing #drupal code? Check the new function debug(). Writing #drupal tests? Check $this->verbose(). And debug() works too. AWESOME!

General debug function
The general debug() function can be used at any point after Drupal is bootstrapped, although that limitation may be removed in the future. The function provides a very simple wrapper to dump data through var_export() or print_r(). When used normally, it displays data based on the "Logging and errors" settings provided in Drupal 7 core. When using a development configuration, the debug information is displayed using drupal_set_message(), as shown in the screenshot below.

debug normal

The exciting part about the new debug() function is that it also works during testing. The debug() function can be placed inside the test itself or in any other part of Drupal and it will be picked up and displayed in the test results as shown below.

debug test
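To make the behavior concrete, here is a simplified stand-in for what debug() does at its core; this is an illustrative sketch, not Drupal's actual implementation, which additionally routes the output through the logging and error-display settings.

```php
<?php
// Simplified stand-in for debug(): capture a variable's state as a
// string via var_export() (default) or print_r(), with an optional label.
function debug_dump($data, $label = NULL, $print_r = FALSE) {
  $out = $print_r ? print_r($data, TRUE) : var_export($data, TRUE);
  return $label === NULL ? $out : $label . ': ' . $out;
}

// In real Drupal code one would simply call debug($node_info, 'node info');
echo debug_dump(array('nid' => 1), 'node info'), "\n";
```

The real debug() accepts the same kind of arguments (the data, an optional label, and a flag selecting print_r() output) and, during a test run, its output is collected into the test results instead of the page.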

SimpleTest verbose mode
Another exciting new debugging tool that is extremely useful when writing tests is the new verbose mode for Drupal 7 SimpleTest. The verbose mode can be enabled on the SimpleTest settings page.

verbose setting

Once enabled, SimpleTest will automatically record the page as seen by the SimpleTest browser after each drupalGet() and drupalPost() call. A link is then placed in the test results that displays the page the browser saw along with some metadata related to the request.

verbose link

Page 1

verbose page1

Page 2

verbose page2

Manual verbose
In addition to the automatic messages provided by SimpleTest, custom verbose data may be dumped using DrupalWebTestCase->verbose(), which can be called directly inside a test.
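The original example did not survive, so here is a hedged sketch of typical usage; the test class, path, and message are illustrative, and the snippet assumes a Drupal 7 environment with SimpleTest enabled.

```php
<?php
// Illustrative Drupal 7 test: record custom debug output in the
// verbose results alongside the automatic drupalGet() snapshots.
class ExampleTestCase extends DrupalWebTestCase {
  function testFrontPage() {
    $this->drupalGet('node');
    // Dump arbitrary data into the verbose output for this test run.
    $this->verbose('Raw page content: <pre>' .
      check_plain($this->drupalGetContent()) . '</pre>');
  }
}
```

Each verbose() call adds another numbered link to the test results, in the same list as the automatic page snapshots.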


If the data to be dumped is not available in the test itself, but rather in the code being tested, the same method may be used by including DrupalWebTestCase as shown below.

require_once drupal_get_path('module', 'simpletest') . '/drupal_web_test_case.php';

By adding these debugging tools to Drupal 7, the developer experience of writing a test has been greatly improved. These methods can still be improved, so please feel free to file issues in the Drupal 7 SimpleTest issue queue. Also note that this work was sponsored by Acquia as part of my summer internship.

Acquia internship

Acquia logo

This summer I will be working part-time as an intern for Acquia. I am very excited to be working with Acquia and to have the chance to spend more time improving things I am interested in. To clarify, I will be working on projects that benefit the entire Drupal community. The items I will be working on are improvements to projects I have either started or am heavily involved with.

During the discussion of the internship I came up with the following goals that were then prioritized by Dries.

Primary goals

  • Finalize testing of contributed modules and Drupal 6.x projects/core.
  • Add executive summary of test results on project page.
  • Extend the SimpleTest framework so we can test the installer and update/upgrade system.
  • Improve and organize SimpleTest documentation
  • Work on general enhancement of Drupal 7 SimpleTest.

Secondary goals

  • Provide on-demand patch testing environment for human review of patches.
  • Finish refactoring of SimpleTest to allow for a clean implementation of "configuration" testing.
  • Analyze current test quality and code coverage, and foster work in areas requiring attention.

I will post updates on some of the more interesting items as they are accomplished. Additionally, I would like to give a special thanks to Kieran Lal for his mentoring and help in finding me sponsorship.

To clarify, I will still be participating in Google Summer of Code 2009, which was made explicit in my agreement with Acquia.
Follow up post by Dries.

SimpleTest 6.x-2.8 - fresh backport

I maintain a backport of Drupal 7 SimpleTest for use with Drupal 6 as the SimpleTest module 2.x branch. A fair amount of work goes into maintaining the code even though it is a backport.

Tonight I finished up a fresh backport of Drupal 7 and made a release along with a number of other changes. Please update to SimpleTest 6.x-2.8 and report any issues you find.

Note to developers

Since this release includes a fresh backport, it also contains an API change you need to be aware of: all getInfo() methods must be declared public static before using this release. If you are a module developer or maintain tests internally, please make sure you update them.


Before:

function getInfo() {
  return array(
    'name' => t('[name]'),
    'description' => t('[description]'),
    'group' => t('[group]'),
  );
}

After:

public static function getInfo() {
  return array(
    'name' => t('[name]'),
    'description' => t('[description]'),
    'group' => t('[group]'),
  );
}

Drupalcon DC - Automated testing - Saving webchick time - the saga

I will be presenting Saving webchick time - the saga along with Kieran Lal at Drupalcon DC 2009. To quote the session abstract:

One of the major enhancements made to the Drupal development cycle has been the addition of a fully automated testing bot, built on the testing framework in Drupal 7.

This session will focus on automated testing as it relates to Drupal 7: its history and direction, the automated testing bot: what has gone into it and where the future leads, and most importantly what is the end gain to the Drupal community.

The framework has gone through a rather long and interesting history with a number of road-blocks and challenges that have been overcome. The session will tell the story of the framework and the benefits it provides to the community through the enhanced Drupal 7 development work-flow. Although the session will go into some technical details it will overall tell the story of the framework and where we plan to take it. The presentation should be interesting to most and provide a great time to throw out any comments or concerns.

After the presentation I will stick around to discuss our recent launch of the Boombatower Testing Service (BTS). If you have any questions, comments, or concerns please drop by.

