Prompted by Simon Morley's blog post about showing your thinking, I thought I'd show my thinking on something one of his commenters mentioned - a testing playbook.

Let's set the scene:
I currently work in a scripted shop: lots of procedural manual test scripts and a waterfall delivery process. It's long bothered me that we have such a colossal amount of duplication in our manual test scripts. Whenever I find myself having to make exactly the same update in a number of different scripts, I have to ask myself whether this is really good test design. And when creating a new test script is a matter of cutting and pasting from an existing one and changing some of the data, I have to ask myself: if I don't think this is a good thing to do in code, why am I doing it here? Don't the same drawbacks apply?
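
To make that analogy concrete, here's a toy sketch in code terms. The flow, step wording and data below are invented purely for illustration - nothing here comes from our actual scripts - but the point stands: in the copy-paste version, a change to the flow means making the same edit in every script; in the parameterised version, it's a single edit.

    # Toy sketch only: each "script" is just a list of manual steps, and the
    # flow and data are made up for illustration.

    # Copy-paste style: the same steps written out in full in every script, so
    # a change to the checkout flow means the same edit in every one of them.
    def script_standard_order():
        return ["log in as standard_user",
                "add 1 x WIDGET-01 to the basket",
                "check out, paying by card",
                "check the order is confirmed"]

    def script_bulk_order():
        return ["log in as standard_user",
                "add 500 x WIDGET-01 to the basket",
                "check out, paying by invoice",
                "check the order is confirmed"]

    # Parameterised style: the shared steps live in one place and the
    # interesting differences live in data, so a flow change is a single edit.
    def order_script(quantity, payment_method):
        return ["log in as standard_user",
                f"add {quantity} x WIDGET-01 to the basket",
                f"check out, paying by {payment_method}",
                "check the order is confirmed"]

    ORDER_VARIATIONS = [(1, "card"), (500, "invoice")]
    all_scripts = [order_script(qty, pay) for qty, pay in ORDER_VARIATIONS]

The question for our manual scripts is exactly the same: when the flow changes, do we edit one place, or fifty?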

Furthermore, with many tests, the devil really is in the detail. When that detail is buried in some step of a 60-step manual test procedure, and you have hundreds of these test cases for a project, how the heck is anyone going to be able to take a good clear look at them and identify any missing test cases? Essentially, this relies on the test lead for the project having everything in his or her head. While I'm a big fan of relying on the skill of your people, this strikes me as making life a lot harder than it needs to be: it's not showing your thinking so that other people can comment and contribute.

That's the context - what's the specific problem?
So, as this is my last week in my current workplace, I have a meeting on Wednesday to discuss what we've learned from the last big project I worked on: a first attempt at bringing into the test team an area of testing that had previously been done purely by specialists in that area.

There were a number of challenges with this project, which exacerbated the issues I've noted above. For a start, generating our test data was challenging - in some cases a major pain, requiring painstaking setup over a number of days. Including all that detail in the test scripts made them even more unwieldy, and later releases meant changing some of our setup procedures. There were also design changes, as there always are with such a large project, and updating the test scripts to match was very time consuming. And because we weren't knowledgeable in this new area, we relied heavily on our specialists to review and comment on our test scripts. This was an enormous time sink, especially as our key technical expert was already overbooked and ended up working ridiculous hours to get everything done.

And my solution?
What I'm going to propose is a hybrid approach: use James Bach's testing playbook idea to give us a single source for all the myriad things that matter to our test design - data models, mappings, our sketches of business scenarios, SQL queries common to a number of tests, procedural descriptions of how to set up some of our more complex test data, and so on.

Essentially, this allows us to take a large amount of detail OUT of the test scripts, making them easier to read, manage, and maintain. It also gives us one place to update these things, and one document to hand to other teams and other testers for review and comment: instead of our thinking being scattered across a hundred test scripts, it's all visible in one place.

It gives us an excellent resource for doing more exploratory testing, too. I'd love to see that happen - we've had some success with exploratory testing on past projects, but for various reasons it has never filtered through into a change to our general approach.

As a bonus, the playbook lives outside our proprietary test management system, whose licensing agreement unfortunately means that everyone outside the test team only gets to see the defect tracker. That caused extra work whenever we asked people outside the team to comment on our tests, so avoiding it is, in my opinion, a good thing.

And over to you
So, now I've shown my thinking - does it make sense? I won't get to see whether it works, as I'm moving on, but I hope it'll prove useful to my colleagues as something to trial, at any rate (those I've discussed it with so far sound receptive). Has anyone done anything similar?

Tags: manual testing, scripted testing, test design, testing playbook


Comment by Anna Baik on July 26, 2010 at 8:39
I'm in a new job location now :) Still trying to figure out what might be the best strategy here, but I suspect a testing playbook may come into it somewhere...
Comment by Canadian Jill on July 19, 2010 at 0:37
Hi there! I'm new to the site ... are you in a new job location now, or did you take a bit of time off?
Comment by Lisa Crispin on July 8, 2010 at 18:03
That seems like a good thing to try. I like Markus' approach too. Maybe you can try them both?

Are the programmers automating regression tests at the unit level? If not, do you spend a lot of time finding bugs that should have been caught at the unit level? This drags down a lot of teams.
Comment by Peter L on July 6, 2010 at 15:55
@Rosie I kind of agree with Anna - what I have referred to as a 'playbook' is just my notes.
You could potentially have an online whiteboard for crowd projects where people scribble their notes, drawings, doodles :) etc., and aspects of that could spark ideas for people to use on future projects, or they could be 'snapshotted' for later referral by testers... Could we not video the 'Crowd' when they test? Could be fun :o) Actions and a thousand words and all that...
Comment by Anna Baik on July 6, 2010 at 13:52
Hmm - what I was envisaging was very project- and application-specific. It's also a reasonable amount of work to put together - the sort of work we were doing anyway (but failing to record anywhere central in a coherent and concise format) while spending months writing procedural test scripts. I don't know how other people might be using it, or planning to. I'd love to see a blog from James Bach about how he's used it.

How would that fit with The Crowd projects? I get the impression that they're usually short, last minute projects where documentation isn't a priority. So I'm guessing you'd want something more generic, rather than focused on a particular app or project?

Do you mean something like Elizabeth Hendrickson's Test Heuristics cheat sheet? I guess something like that, but targeted at a particular area - say, web testing - might be pretty useful for sparking off ideas for tests.

Comment by Rosie Sherry on July 6, 2010 at 11:50
Would it be possible to create an online playbook? I'm interested in having something that people could add to and refer to. With reference to some of The Crowd testing I think it would be very useful to point a tester to a specific section within a 'play book' to help them come with ideas on how to test. It could turn into a significant and useful resource...thoughts?
Comment by Anna Baik on July 2, 2010 at 20:31
Chad - I really like the idea of breaking out the same steps into a sub-script/function, and can think of a number of projects where it would have been a very effective approach. Absolutely agree with your other comments about how scripts tend to be. At worst, they can end up as lists of instructions where the purpose of the test kinda gets lost somewhere in all the detail. Do this, do that, do the other - and 65 steps later I'm still not entirely sure why I'm doing any of it, and starting to suspect that the author just forgot to put the point in there.

As far as putting the testing playbook into practice is concerned - that'll be for my colleagues to work out, as today was my last day. We had a really good discussion earlier this week about what we could learn from the last project, and everyone seemed to agree that this could be a helpful approach - so I'm hopeful that they'll be able to give it a go. I'm not moving away from the area (or at least not far) so I'm sure I'll get to catch up with folks for a beer at some point and talk shop.
Comment by Chad Patrick on July 1, 2010 at 19:18
Have you considered making your scripts more efficient? If you're repeating the same steps over and over, have you considered breaking them out into a sub-script (think function)? That function takes parameters, and which parameters each test provides can be detailed in a data matrix. I think you're absolutely correct in relating a test script to coding.
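
Something like this rough sketch, if it helps - the "sub-script" is a single function, the data matrix is just an inline table here, and every name and value is made up for illustration:

    import csv
    import io

    # Rough illustration only: all names and data are invented. The sub-script
    # is one function; the data matrix decides which parameters each case uses.
    DATA_MATRIX = """case_id,user_type,currency,amount,expected_result
    TC-01,retail,GBP,10.00,accepted
    TC-02,retail,USD,10.00,accepted
    TC-03,corporate,GBP,100000.00,referred
    """

    def submit_payment(user_type, currency, amount):
        """Stand-in for the shared steps the sub-script would actually perform."""
        return "referred" if float(amount) >= 50000 else "accepted"

    def run_cases():
        for row in csv.DictReader(io.StringIO(DATA_MATRIX)):
            result = submit_payment(row["user_type"], row["currency"], row["amount"])
            verdict = "PASS" if result == row["expected_result"] else "FAIL"
            print(f"{row['case_id']}: {verdict} (got {result})")

    if __name__ == "__main__":
        run_cases()

Adding a new case then means adding a row to the matrix rather than writing another script.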

My biggest complaint in most script-heavy environments is in how the scripts are written. They're simply a long list of instructions that are not easily maintained, do not adapt to newly developed test cases, and do not provide flexibility.

I'm interested to see the fruit of your effort or at least hear how it works out for you.
Comment by Anna Baik on June 29, 2010 at 20:26
Hi Peter,

I'd love to see your notes when you publish them! It does sound like it, as I understood the idea anyway. I'm hoping that colleagues will let me know if they do go ahead with it, but of course it's not the same as being there and in the thick of it.

I'm not sure I want to guess at the moment at how I might use a testing playbook in the future - as my new workplace is going to be so different to my current one I'm trying to keep my mind as open as poss right now.
Comment by Peter L on June 29, 2010 at 13:51
Hi Anna,
Good post. Interesting ideas - I have been in a similar situation. I would be interested in whether you plan to use a tester's playbook in the future, and what it would look like and contain. Perhaps one of your colleagues can tell us how they got on. I am about to publish stuff that has been in one of my notebooks - is it a playbook, or just notes? Not sure... but it was notes, diagrams, and lists that helped me get a grasp of things and helped me work things out.

Good luck with your next role!

Peter
