On Reddit, diminoten asked:
The company I work for is looking to make changes in how our software development cycle works (we currently use a waterfall-type method), and the higher-ups have decided to start moving towards a more Agile development cycle, with the appropriate "sprints" and whatnot.
I've read a bunch about Agile development, and it's gotten mixed reviews. Has anyone experienced it, from a QA perspective? What should I expect from this? Am I out of a job? :-(
Agile still sparks a lot of concern and questions among QA folks, so I thought I'd share my reply more widely. Here it is:
QA in Scrum
There are a lot of different forms of Agile. The most popular is Scrum, which involves (more or less) dividing work into 2-4 week chunks called Sprints. At the end of every Sprint, the team is supposed to produce software that is ready to ship, which means that it has to be fully tested and fixed.
QA in this environment is like QA in non-Agile software development, except that you ship more often. If you rely on a lot of manual regression testing, that burden grows with program size and eventually becomes unsustainable. Even with automated testing, your maintenance burden tends to grow, which eventually makes it difficult to create software that's been fully tested every Sprint.
QA in Extreme Programming
Less common than Scrum, but probably second-most popular, is Extreme Programming (XP). You could say that XP takes Scrum's organizational practices and adds technical practices. XP teams work in 1-3 week chunks called iterations and are also supposed to produce software that's ready to ship every time. XP also emphasizes close collaboration, cross-functional teams, and working in shared workspaces.
The QA role is a lot different in XP because the nature of testing changes in XP. Let me explain.
The problem with working in short cycles, as Scrum and XP do, is that design and testing often get shortchanged. (Well, that's true in any process, isn't it? But it's exacerbated when working in short cycles.) XP addresses this with specific technical practices that revolve around doing design and testing at the same time as coding. The iconic activity is test-driven development (TDD), and it's done while pair programming: two programmers work together at the same computer, producing a bit of test code, a bit of production code, and a few design improvements, then repeating the cycle every few minutes.
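To make that cycle concrete, here's a minimal sketch of one TDD micro-cycle in Python. The shopping-cart feature, the function name cart_total, and the numbers are all invented for illustration; the point is the order of the steps, not the code itself.

```python
# Step 1 -- write the test first ("red"): before cart_total existed,
# running this test failed with a NameError.
def test_cart_total():
    assert cart_total([1.50, 2.25]) == 3.75
    assert cart_total([10.00], discount=0.1) == 9.00

# Step 2 -- write just enough production code to pass ("green").
def cart_total(prices, discount=0.0):
    """Sum item prices, applying an optional fractional discount."""
    return round(sum(prices) * (1.0 - discount), 2)

# Step 3 -- refactor with the passing test as a safety net, then repeat
# the whole cycle for the next small piece of behavior.
test_cart_total()
```

Each pass through the cycle takes only a few minutes, which is why a team doing this all day, every day, ends up with so many tests.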
The result of TDD is a rich set of automated regression tests, often thousands of them. As a result, QA changes from primarily creating and running test cases to helping the team prevent defects in the first place. Some QA folks prefer to work closely with the business experts on the team (we call them on-site customers) to help them provide clear examples of what they want. Some enjoy writing automated non-functional tests, such as performance and stability tests. Some do exploratory testing, which is a technique for finding surprising defects. In my book I recommend using defects discovered in exploratory testing as an opportunity to apply root-cause analysis and perform process improvement.
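One way that "helping customers provide clear examples" plays out in practice is turning the customer's concrete examples directly into an automated check. This is a hedged sketch only; the shipping-fee rule and all the figures are invented for illustration.

```python
# Examples as an on-site customer might state them:
# (order total, expected shipping fee).
CUSTOMER_EXAMPLES = [
    (25.00, 5.00),    # "orders under $50 pay a $5 fee"
    (50.00, 0.00),    # "orders of $50 or more ship free"
    (120.00, 0.00),
]

def shipping_fee(order_total):
    """Production rule the programmers implement from the examples."""
    return 0.00 if order_total >= 50.00 else 5.00

def check_customer_examples():
    # Each customer example becomes a regression check; a disagreement
    # here means the examples and the code no longer match.
    for total, expected in CUSTOMER_EXAMPLES:
        assert shipping_fee(total) == expected, (total, expected)

check_customer_examples()
```

Collecting the examples in one table keeps them readable by the customer who wrote them, while still running as ordinary automated tests.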
QA in Reality
So there you have it. In Scrum, QA is largely unchanged, but more frequent. In XP, QA can be quite different. You'll work alongside programmers in a shared workspace and the programmers will create most of the test cases. You'll use your testing expertise to help clarify requirements, create non-functional tests, or identify flaws and improvements in the process.
Well, in theory. At the risk of ending on a downer, the vast majority of shops that say they're doing "Agile" do nothing of the sort. Instead, they use the terminology without actually following the underlying ideas. For example, teams will plan in "Sprints" but not actually produce shippable software every month. So don't be surprised if nothing really changes except the terminology. Real change of the sort Agile needs requires willpower and commitment on the part of both managers and team members. That's easier to fake than supply.