I was invited by Paul Moore and Paul Hodgetts to give a presentation at the Agile/XPSoCal monthly evening meeting, which happened last night in Irvine at the Capital Group offices. The topic of my presentation was 'How to Get to "Done" - Agile and Automated Testing Techniques and Tools'. I think it went pretty well: there were 30+ people in attendance, and I got a lot of questions at the end, which is always a good sign. Here are my slides in PDF format. I presented many of the tools as live demos outside of the slides, but I hope the points I made in the slides will still be useful to some people.
In particular, I want to present here what I claim to be...
The Second Law of Automated Testing
"If you ship versioned software, you need automated tests."
At the talk last night I was waiting to be asked about the first law of automated testing, but nobody ventured to ask that question ;-) (for the record, my answer would have been 'you need to buy me a beer to find that out').
But I strongly believe that if you have software that SHIPS and that is VERSIONED, then you need automated tests for it. Why? Because how else would you know that version 1.4 didn't break things horribly compared to version 1.3? You either employ an army of testers to manually re-test each and every 1.3 feature that is present in 1.4, or you run a strong suite of automated regression tests that covers all major features of 1.3 and shows you right away if any of them broke in 1.4. Your choice.
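To make that concrete, here's a minimal sketch of what such a regression suite might look like, using Python's unittest module. The functions `parse_price` and `apply_discount` are hypothetical stand-ins for features shipped in 1.3, not from any real project; in practice they would be imported from your codebase. The point is that you run this exact same suite against the 1.4 code, and any failure flags a regression immediately:

```python
import unittest

# Hypothetical application code standing in for features shipped in 1.3;
# in a real project these would be imported from the shipping codebase.
def parse_price(text):
    """Parse a price string like '$1,299.99' into a float."""
    return float(text.replace("$", "").replace(",", ""))

def apply_discount(price, percent):
    """Return the price after a percentage discount, rounded to cents."""
    return round(price * (1 - percent / 100.0), 2)

class TestVersion13Features(unittest.TestCase):
    """Regression tests covering the major features of version 1.3.

    Running this unchanged suite against the 1.4 codebase shows
    right away whether any 1.3 behavior was broken.
    """

    def test_parse_price_handles_thousands_separator(self):
        self.assertEqual(parse_price("$1,299.99"), 1299.99)

    def test_apply_discount_rounds_to_cents(self):
        self.assertEqual(apply_discount(19.99, 15), 16.99)

if __name__ == "__main__":
    unittest.main()
```

A suite like this is cheap to run on every build, which is exactly what makes it viable where an army of manual testers isn't.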
Notice that I also qualify the software as 'software that ships'. This implies that you hopefully use sound software engineering processes and techniques to build that software. I am not referring to toy projects, or 1-page Web sites for temporary events, or even academic projects that are never shipped widely to users. All these can probably survive with no automated tests.
If you think you have some software that ships and is versioned, but you found that you're doing very well with no automated tests, I'd like to hear about it, so please leave a comment.
Thursday, May 21, 2009
Very cogent advice. As most budgets these days are getting thinner and thinner, spending money on repeatable automated testing makes a lot of sense.
Thanks for the reinforcement!
Good Stuff Grig! (as usual)
Since "You should always develop using a software versioning tool", your rule becomes "You should use automated testing".
Nice slides, I'm glad you got a good reception from the talk.
One nit to pick: Your slide on "Performance/load/stress testing" claims that performance tests are about "Eliminating bottlenecks".
That's not quite true; testing isn't about *changing* anything, it's about *discovering* what's true. So that might be better expressed as "Revealing bottlenecks".
This is a good point. I think by "versioned" you mean, "changes over time", or "has more than one release". I make the point only because in web applications, often there are no explicit version numbers. New code gets pushed frequently, often more than once a day. What is flickr's version number, for example?
Ironically, it is in these environments that automated regression testing is both more important (because of the frequency of releases) and more difficult (because of using cutting-edge browser techniques, browser incompatibilities, and so on).
BTW: thanks for the mention in the slides! :)
Bignose -- you're right, "revealing bottlenecks" is a better description of performance testing.
Ned -- I was referring to internal version numbers that I think any respectable Web application would have. As an end user, you don't see them, but internally I'm sure most Web apps deploy based on code from a certain branch, and even have internal code names for their release.
Nice slides, but I'm still wondering what the first law is. Or is it that you have to get a beer in the bar before starting to automate?
Arjan -- OK, I'll spill the beans, but you'll have to buy me a beer at some point ;-)
I think the First Law of Automated Testing is: JUST DO IT! Start small, and add more and more tests. Pretty soon you'll become test-infected and hopefully your enthusiasm will be shared by other people in your organization.
Ha, that's probably true. That's how I did it in at least two jobs, mostly because I didn't want the hassle of convincing managers who wanted a Business Plan first.
I just started with some scripts to reduce repetitive work and make my life easier. Then I glued the individual scripts together and tried to convince others to use the same tools. By that time, my efforts were highly appreciated, and the tools are still in use.
It is a pity, though, that lots of people start like that. So recently I have tried to re-design my tools so that they can be applied more universally. A first step is described at:
You'll get that beer if I run into you ;-)
By the way, one related best practice that I use:
If you have a version control system for your software, use it for your automated test suite as well.
I'm new to the whole idea of automated testing, but my experience with versioned software and the constraints of manual testing (no time, no testers, no money, error-prone) taught me a lesson: I need, want and demand automated tests. My first research got my boss on board, and now I am in the lucky position of getting to write my dissertation on automated testing.
My Google voodoo led me to your blog, where I found some interesting information and links - thanks for that! :)
I really agree with your point about "versioned", and you explained it so clearly that the key point of automating is easy to understand. I got automation software for this process from www.macrotesting.com, and it really does make sense to use automation to find that out. I have to cheer you, as your post made a lot of sense and it's good advice too.
I would like to know what automated tools have worked well in an Agile environment, of course with a limited budget. Any ideas will be helpful.
Dave -- I have a couple of slides at the end pointing to various tools. I've used all of them successfully in agile projects.