Tuesday, May 12, 2009

Book Review: Implementing Automated Software Testing

Implementing Automated Software Testing by Elfriede Dustin, Thom Garrett, Bernie Gauf. Addison-Wesley. ISBN 0321580516.

This book isn’t for everyone, but everyone can get some value out of it. What I mean by that rather confusing statement is that folks working in Agile environments will likely want to throw the book across the room, while folks in more process-heavy environments, such as CMMI shops or other waterfall organizations, will likely get a great deal of value from it.

I’m an Agile fanatic, and I had a difficult time dealing with the book’s approach, which emphasizes spending large amounts of time creating documentation such as requirements traceability matrices and detailed test plans. My preferred approach is to have testers working side-by-side as part of the team, creating specifications from user stories/requirements and moving those right into automated test suites via tools like Selenium, Cucumber, or RSpec.
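To make that concrete, here's a minimal, hypothetical sketch of the flow I prefer: a user story written as an executable Cucumber specification, backed by an automated step definition. The feature wording, field labels, and User model are all invented for illustration; a real project would substitute its own.

```gherkin
# features/login.feature -- the user story, expressed as an executable specification
Feature: Member login
  As a registered member
  I want to log in with my email and password
  So that I can reach my account dashboard

  Scenario: Successful login
    Given a registered member "pat@example.com" with password "s3cret"
    When I log in as "pat@example.com" with password "s3cret"
    Then I should see my account dashboard
```

And the step definitions that automate it, assuming Capybara driving Selenium, which is how I'd typically wire this up:

```ruby
# features/step_definitions/login_steps.rb -- the automation behind the spec.
# Assumes Capybara configured with the Selenium WebDriver and a User model;
# both are stand-ins for whatever your project actually uses.
Given(/^a registered member "(.*)" with password "(.*)"$/) do |email, password|
  User.create!(email: email, password: password)
end

When(/^I log in as "(.*)" with password "(.*)"$/) do |email, password|
  visit "/login"
  fill_in "Email", with: email
  fill_in "Password", with: password
  click_button "Log in"
end

Then(/^I should see my account dashboard$/) do
  expect(page).to have_content("Dashboard")
end
```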

That said, I did indeed get some good value from the book. I found the discussions on making hard decisions about what to test well worth reading: teams can easily vaporize large amounts of time building big suites of brittle, unmaintainable automated tests. The book has several really good chapters on using business cases to drive return on investment (ROI) decisions for testing, understanding automated testing pitfalls, and adjusting your testing as you progress through your project.
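To give a flavor of the ROI angle, the business-case chapters boil down to comparisons along these lines. The figures below are mine, invented purely for illustration; the book's own worksheets are far more detailed.

```ruby
# Back-of-the-envelope automation ROI -- all figures invented for illustration.
cost_to_automate = 400.0 # hours to build and maintain the automated suite
manual_run_cost  = 40.0  # hours per full manual regression pass
runs_per_year    = 24    # e.g., one pass per two-week iteration

savings = manual_run_cost * runs_per_year - cost_to_automate
roi     = savings / cost_to_automate
puts format("Savings: %.0f hours/year, ROI: %.0f%%", savings, roi * 100)
# => Savings: 560 hours/year, ROI: 140%
```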

Additionally, one of the book’s high points is the chapter on building the test team: “Put the Right People on the Project – Know the Skill Sets Required.” It’s a great chapter, emphasizing how to interview prospective test team members and how those testers’ skills differ greatly from those of other members of the team.

The book’s very academic, dry tone makes for some difficult reading, and few concrete examples appear until very late in the book. Having spent many years either in the DOD or working for DOD contractors, I quickly recognized that much of the book is targeted at folks working in those environments; too many dry acronyms are scattered throughout, adding to the difficulty of reading it.

The lack of examples using real tools frustrated me. While the appendices compare various tools, the book never shows how a real-world testing environment would actually use them. One appendix, eight or nine pages long, is touted as a “Case Study” but falls short, in my opinion.

Overall it’s a decent book. The dry tone and lack of real-world examples are balanced out by the excellent coverage of team skills and the emphasis on choosing how and what you test.

1 comment:

Mr. Hericus said...

Hi Jim,

Thanks for the overview. Based on your review, I think I'll pass on the book. It's good to know that there are a few details worth looking into, though.

Thanks!
