The last few days I've interviewed some candidates for an entry-level QA position at the company where I work. All of them were fresh graduates of local universities, some from the University of California system, some from the Cal State system. They all had one thing in common though: they very seriously explained to me how they took classes in "Software Development Lifecycle" and how they worked on toy projects, first obtaining requirements, then designing, then implementing, and at the very end, if they had time, doing some manual testing. Of course, not a word about automated testing, iterations, or other agile concepts. At least they used source control (CVS).
One guy in particular told me kind of proudly that he knows all about the Waterfall methodology. He said they spent a lot of time writing design documents, and since they *only* had one semester for the whole project, they almost didn't get to code at all. I couldn't help laughing at that point, and I told him that maybe that should have been a red flag concerning the validity of the Waterfall methodology. I couldn't have found a better counter-example to Waterfall myself if I tried. Almost no code at all to show at the end of the semester, but they razed half of a forest with their so-called documentation!
Of course, I tried to gently push them on the path to enlightenment, briefly explaining that there are other ways to approach software development and testing. One can always try at least, right?
Another thing that irked me was that, since they knew this was an interview for a QA position, some of the candidates thought it necessary to tell me they're busily learning WinRunner. I told them in a nice and gentle manner that I don't give a *beeeeep* about WinRunner, and that there are many Open Source tools they can leverage. One of them said that yes, that may be true, but still many companies require a knowledge of WinRunner for QA positions, so you just *need* to put it on your resume. Sigh. The battles one has to fight...
If I'm happy about one thing from this whole experience, it's that all those people left with their vocabulary enriched with a few choice words: Open Source, Python, scripting, automated testing, continuous integration, buildbot, etc. You never know what grows out of even tiny seeds you plant...
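For readers new to some of those terms, here is a minimal sketch of what "automated testing" looks like in Python, using the standard library's unittest module. The `word_count` function is a made-up example of code under test, not anything from a real project:

```python
import unittest

def word_count(text):
    """A trivial function to put under test: count whitespace-separated words."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Each test method checks one behavior. Unlike manual testing at the
    # end of a project, this suite can be re-run automatically after every
    # change -- which is what a CI server such as buildbot does on each commit.
    def test_simple_sentence(self):
        self.assertEqual(word_count("Waterfall is not agile"), 4)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

# Run the suite programmatically; a CI tool would do the equivalent of
# `python -m unittest` on every commit.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(WordCountTest))
```

The point isn't this particular framework: it's that tests written as code become a safety net that runs in seconds, instead of a manual phase squeezed in at the end of a waterfall schedule.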
It amazes me though how out of touch many schools are with the realities of software development. If I were in charge of the CS program at a university, I'd make it a requirement for all students to work through Greg Wilson's Software Carpentry lectures at the University of Toronto. I particularly like the lectures on The Development Process and Teamware. Unfortunately, stuff like this seems to be the exception rather than the norm.
Friday, March 17, 2006
They still teach Waterfall in schools
Thank you very much for the few but most effective words. I consider myself a budding tester trying to make myself familiar with QA. If I had missed reading your blog, my answer would probably have been close to what the university students answered. Thank you very much, and any suggestions for me would be deeply appreciated.
Thank you once again,
As I said in the post, I'd start by reading the Software Carpentry lectures, and take it from there. You'll also find a lot of good stuff on agile testing on this blog (all modesty aside :-)
Other recommended sites:
- Brian Marick's testing.com site
Generally speaking, a Google search on agile testing should bring up lots of good stuff.
You might be interested in this 'conference' on Waterfall:
Grig - being in a position where I too have had to interview people for QA positions, I echo your sentiments, and experiences.
It's odd: if you interview "old school" QA people, in many cases you see the same worldview regarding development methodology as in those coming straight out of school.
It's a rare moment when you find someone (either from the old crowd, or the young crowd) that really "breaks the mold" when it comes to QA and Development methodology.
I've had a hard time finding impressive candidates who aren't stuck in the "traditional" (or as I sometimes call it, the "big company") rut. It's even harder to find QA people well versed in Python, and harder still to find QA people who not only know QA, but also how to really program applications/tests.
Does "waterfall" stand for the rushing sound the requirements make as they pass you by, or does it refer to the Chinese water torture that users of the resulting software experience?
It's gotta be one of those, right?
Too bad I'm looking for a python job in Toronto and not California.
Every time I get asked to describe the waterfall model in an interview, I ask: "Come on, you seriously don't use the waterfall model, do you?"
Then I tell them about Royce's paper and how the waterfall model was created as an example to show why big design up front doesn't work.
I don't think it helps though.... When will I learn to keep my mouth shut?
And just for reference, they still teach the waterfall model at the University of Waterloo (or at least they did when I was in first year)
Anonymous -- Toronto is not a bad place to be when looking for a Python job. Get in touch with Greg Wilson :-)
Thanks for the tip!
Excellent post. I've recently been through a spate of interviews and noticed the same things.
On a side note, it appears the rss feed to your site is busted. I'd sure appreciate a fix -- http://agiletesting.blogspot.com/atom.xml
Alex, thanks for the heads-up concerning the atom.xml file. I fixed it.
Teaching Waterfall in schools is still very important. _A LOT_ of companies still use it as their core development methodology, or at least use one of its cousins (spiral, for instance, is just lots of smaller waterfalls). I know we (HP) use a variation of the waterfall, complete with 'phase exit' criteria. It could be that I expect different things from people exiting university. I expect them to have a base of knowledge which can be built on. The waterfall is good for that. It teaches how to plan and code. Entry-level positions don't need to worry about shipping actual projects. That's what the more experienced people need to deal with, dragging the new coder along (kicking and screaming if necessary). I think also that the Agile Kool-Aid goes down better with a base of knowledge.
Of course, I could be jaded as I have yet to find a flavor of the Agile Kool-Aid that I like.
As for WR, if a candidate has it, cool; but I don't care. I care more about how they think. If I've screened appropriately, anything else is 'just a technology' and they should be sharp enough to pick it up. The market perception, however, is that you need WR. I taught an 'intro to QA' course at a local tech college and later proposed a 'test scripting with Python' course, but there was no interest from students. The WR and LR courses are packed all the time though.
(Oh, and re python in Toronto, sign up for DemoCamp4 and schmooze like heck)
Sorry to hear you haven't found an "Agile Kool-Aid" flavor that you like so far... I can't say I've been doing agile development at my job (although I hope that's about to change, as some of us are pushing for the adoption of Scrum), but I have been applying agile techniques and tools to some projects of my own, most notably for the Agile Testing tutorial I presented together with Titus Brown at PyCon06. So I can say from experience that 'agile' really works.
I don't agree with you when you say that waterfall offers a solid base of knowledge to build on. For example, the planning you learn to do in waterfall doesn't help you one bit in meeting deadlines in the real world, where everything keeps slip-sliding away. I'd much rather expose students to agile planning techniques such as user stories (see Mike Cohn's excellent book "User stories applied"). And the same goes for testing. Waterfall teaches developers to code in a void, then throw the code over the wall to QA. You and I know this is not a good approach.
And what's even more important than tools and techniques is the human, social aspect of agile development, where people work as a true team that is much more than the sum of its parts. You can't really capture that with dry methodologies, but you can try to teach it, as Greg Wilson does in his Teamware chapter.
As for WinRunner/LoadRunner, I have a very poor opinion of them. Expensive, bulky, restrictive, locking you into their world. If it sounds like Microsoft, that's because it is. Unfortunately, nobody has been fired (yet) for buying Microsoft, and nobody gets fired (yet) for introducing WR/LR into their organization. But to me, such an organization is a place I wouldn't like to work.
I warmly recommend the agile-testing yahoo group as a place where you can share opinions and see what other people think about all these things.
Some 5-7 years ago, I too was taught a very concrete and Waterfall-like discipline of building enterprise-scale information systems (IS). It was not too bad, because some types of IS do have similarities, and in those cases a rigorously applied waterfall is able to deliver more or less predictable products. What was bad was that there was no notion of the existence of other approaches. Just as you describe, we were taught just a "Software Development Lifecycle". Nobody told us that there are different ways of, and views on, constructing a system.
Alistair Cockburn is working with somebody at a local state college in Utah to create a software engineering course that teaches more agile concepts. I was very jealous when I heard about it, wishing I had taken such a course in school. They are even going to do the projects iteratively, not just one big one due at the end of the semester.
You will be happy to know that down here in Australia there is the same problem. But you may also be happy to know that I have had excellent experience in running student projects using Agile methods (perhaps see my article at http://isedj.org/4/103/index.html).
I have also spent 4 months teaching Agile Development Methods to students at a Thai university (Naresuan University, http://www.nu.ac.th/english/) ... the first time Agile methods have been taught at any Thai university, I believe.
Yes, it is an enormous problem that university academics seem intent on teaching a development approach first published in the late '70s. AND they always seem to confuse the term SDLC ... it was first Structured Development Life Cycle ... NOT System (or Software) Development Life Cycle. SDLC was effectively hijacked to imply that it was the only SYSTEM development method, rather than just being one system development method ... called STRUCTURED Development Life Cycle. It seems a bit like the misinterpretation of Royce's paper, wrongly used to support something else.
I have even been prohibited from teaching Agile Development Methods by a fundamentally ignorant but arrogant academic manager, on the grounds that it was not 'authorised curriculum'. So much for academic freedom in the IS/IT educational environment.
This commentary is flawed. I would think just as poorly of a candidate if they came into my office and based all their answers and understanding on agile. I have found no cookie-cutter approach to QA. Every company has specifics that will not allow you to use a model out of the box.
We had a guy come in one time that did nothing but rave about Agile. That was fine with me, but what I found was that the candidate had no real understanding of QA processes.
I am back on the Deck :)
The last year of job experience has taught me many new methodologies beyond the "WF" model. Had I not worked for a company enforcing agile testing, I wouldn't have bumped into this testing theme called the "V-model"... Is it worth calling it a methodology supporting agile testing?
All inputs appreciated...
I will admit up front that I am biased towards WR (and for those with their heads in the sand, there is QTP) & LR, as I have worked with them and sold them for the past 10 years.
Schools and businesses look for WR/QTP experience for two reasons. It has stood the test of time like C has. When I interview people, I don't send them packing just because they have C and not Python. This is a foundation that can be built on. If you know one language, you can learn another; I did, as Pascal was my first language learned after high school, but I spent 5 years after college doing C programming.
Open Source is good, and there are some nice tools out there that work. The downside is that you need several different open source tools to cover the breadth of what QTP or Rational's tool can do for testing (web, Java, VB, Delphi, terminal emulation, etc.). In my 10 years, I have never encountered an open source tool that could test as much as I could with QTP.
BTW, the open source tools are no better equipped to support agile testing than QTP is. There is a perception that open source tools are better, since most developers are comfortable coding for them, and they look at QTP's record capabilities and figure there is no way it can go to the depths of the open source tools. Keep in mind, there is a whole language behind the QTP UI that will let you code your heart away. Most commonly, I will perform basic simple recordings for smoke tests to keep up with the code the programmers are pumping out quickly, and then script by hand the areas that don't have a UI to record on. Plus, with HP Software's BPT, non-developers and non-QA folk can put together basic tests that can be automated later, which works ideally with agile testing methodologies. I haven't seen any open source tool offer anything close to that.
As I said, I am biased toward these tools, and I'm up for the challenge if anyone really wants to pit an open source tool against QTP/WR/BPT's capabilities.
Great read by the way.
I really got a kick out of your post!
Many universities have been failing to produce good (knowledgeable) developers for some time now.
There are others (well, one other), such as www.neumont.edu, where students become familiar with good and bad (waterfall/agile/XP) development methodologies during the first quarter. Not to mention they begin learning to work in teams right off the bat as well.
I'll proudly admit to being biased, as I am a recent graduate :)
Also, in regard to actually producing software: I recall many times when a classmate would tell me about a friend of theirs at a traditional university working on a similar project, with an entire semester to complete it, who was unable to do so, while the project at www.neumont.edu was completed in 9 weeks (Neumont is quarter-based), and most times with extra features added on!
I apologize for this turning into a shameless plug :]
I hate to necro this post, but it _IS_ so frustrating. We still have a QA 'department' that wants to play 'gatekeeper' on software but refuses to 'participate' until after all the documentation and development is done. Then they cry because they get pointed to as an impediment or bottleneck. It's 'best practice', they say, to have the requirements up front, a test plan up front, and software to test on before writing any test cases. *sigh*
It's very tiring :)
deep down I admire & respect them but hope they 'get it' some day.