The Agile Tribe

Teaching an old dog new tricks – Trick No. 3

on November 25, 2009

Author: Nic Woollett

As part of our ongoing Agile journey in the mainframe/legacy environment, acceptance testing is the third of our three ‘tricks’.

Acceptance testing was another big change to the way we normally operate in a mainframe/legacy environment. It was fully automated using scripts: scripts were run for each day to perform online maintenance, create files for the batch and then validate the data post update. We actually developed a new scripting tool that enabled the testers to create test scenarios in an easy-to-use interface. The scripting tool then built executable scripts from the tester-generated test scenarios.
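To give a flavour of the idea, here is a minimal sketch of how a scenario-to-script builder like this might work. All of the names here (`Step`, `build_script`, the action names) are illustrative assumptions, not the real tool, which generated mainframe jobs rather than command strings:

```python
from dataclasses import dataclass

@dataclass
class Step:
    action: str    # e.g. "online_update", "create_batch_file", "validate"
    target: str    # data set or screen the step operates on
    params: dict   # field values supplied by the tester

def build_script(scenario_name, steps):
    """Turn a tester-authored scenario into an executable script.

    Here the 'script' is just a list of command strings; the real tool
    built executable mainframe scripts from the same kind of input.
    """
    lines = [f"* Scenario: {scenario_name}"]
    for step in steps:
        args = " ".join(f"{k}={v}" for k, v in sorted(step.params.items()))
        lines.append(f"{step.action.upper()} {step.target} {args}".rstrip())
    return "\n".join(lines)

# A tester describes the day's flow as data, never touching script syntax.
scenario = [
    Step("online_update", "CUSTOMER", {"id": "1001", "status": "ACTIVE"}),
    Step("create_batch_file", "DAILY.TXN", {"date": "20091125"}),
    Step("validate", "CUSTOMER", {"id": "1001", "expect_status": "ACTIVE"}),
]
print(build_script("customer-maintenance", scenario))
```

The point of the design is the separation: testers work entirely in the easy-to-use scenario layer, and the generator owns the executable detail.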

When using scripts we required a testing region that could be restored back to a base point for each acceptance test run. Once a test is built it is run in every acceptance cycle thereafter – ensuring constant regression testing. Again, this offers a level of robustness above the standard testing approach, where after a test is performed and accepted it is ticked off and (more often than not) forgotten.


Despite some initial scepticism that Agile could not work on a purely mainframe project – what with moving targets, short iterations and aggressive delivery times – we agreed from the start that we would stick as closely as possible to pure Agile. Whilst we are not LIVE yet, I don’t think we would be as far along as we are now under waterfall, and I believe the product we install will be much more robust under Agile.

What I learnt:

Formal system and user testing have both proven to be ideal subjects for automation.


One response to “Teaching an old dog new tricks – Trick No. 3”

  1. Cara Talbot says:

    From a web app development perspective too, I think the Agile testing approach, with its increased collaboration, usability sessions and automated testing, has taken a heck of a lot of pressure off the old final gateway of ‘User Acceptance Testing’ sessions run on more traditionally managed projects. Stakeholders are less anxious at the completion of the final sprint, as they’ve been reassured along the way that needs have been met, and continually retested.

    Our final ‘UAT’ session for a recent project has become more of a tool to review feedback and trends alongside the iteration usability testing sessions: has user experience improved? are workflows quicker as a result of feedback?

    I think this has proved far more valuable than the older, more traditional and somewhat clunky UAT sessions, where it’s often too late to make any significant changes that users may unearth at the eleventh hour. It’s definitely helped us to avoid costly last-minute pieces of rework, and provides much more confidence in going live that we’re delivering something that users actually want and are willing to use.
