Sometimes known as the informal peer group review, the walkthrough is one of a handful of techniques that make a big difference to the chances of success in a software project. (The others are CRC, design patterns, having an effective architect(ure team), money, time, good karma, luck, …). It is also a lightweight technique, if done properly. A large return on a small investment.
A walkthrough is different from a formal review or formal inspection. A project might or might not need formal review in addition to informal review. The idea of the walkthrough is fairly old, but aspects of Kent Beck’s recent extreme programming have similarities.
A walkthrough is requested by the producer of an artefact (e.g. a class interface, a method, an architecture feature, …), it is not imposed. The aim of a walkthrough is to improve the quality of a piece of work by discovering potential problems. A walkthrough, when done properly, is seen as a positive contribution to the producer and his or her work; it is not seen as criticism or a negative activity or a threat.
When they work they are very good. When they don’t work they can be a big waste of time. So how do we make them work?
The producer and the reviewers are the minimum.
A scribe can be added with the specific job of taking good notes to help the producer’s recollection and understanding of the comments.
A moderator can be added. A moderator might be necessary for a group that hasn’t “gelled”, a group with socio-political challenges. In other words a moderator might be necessary for 80% of groups. The moderator should be experienced in walkthroughs but doesn’t necessarily have to be expert in the subject matter or the technology. The moderator should be respected, however – “I don’t need to know the technology, I’m a manager” commands no respect. Essentially the moderator has to gently ensure that the rules, described here, are followed. “I think that might be heading in the direction of recriminations, Fred, why don’t we get back to the possible memory leak”, might be a typical interjection.
DON’T make walkthroughs too big. You might not have enough personnel anyway, but in large groups with time on their hands (yes, it happens, very occasionally), 20 people might drift along to a walkthrough; and that’s too many. Five or six participants should be a typical maximum: two reviewers are enough, and three or four are ideal.
The management doesn’t attend. This is important. With the management in attendance, the walkthrough is inevitably seen, at least in part, as a threat rather than a positive contribution.
The responsibilities of the producer are
- to request the walkthrough in the first place,
- to circulate the minimum necessary documentation in advance – ideally the day before, but an hour before at the absolute minimum,
- to receive the comments as contributions to the quality of a piece of work,
- to ensure that (s)he understands the comments,
- to thank the reviewers.
Ideally the comments should be fully understood during the session. If the producer has to seek later explanation, there is a risk that it becomes “you were wrong”, rather than “I didn’t understand”.
The reviewers have the technically difficult job of locating potential flaws. They also have the socially difficult job of making positive review comments rather than giving actual or implied criticism. “I think you might be copying that object when maybe you didn’t intend to”, rather than “Well that pass-by-value is wrong” or “You idiot, don’t you know that …”.
Reviewers must also be willing to explain, but not justify or defend, their comments after the walkthrough. This could take a few minutes but, rarely one hopes, could take up to an hour.
How does it proceed?
The producer will walk the reviewers through the product pointing out the important features, sketching how it works, what it does and what it consists of. Generally the producer is trying to give the reviewers enough familiarity and background that they can make sensible comments.
Short. Short. Short. Reviewers will become reluctant to review if the walkthroughs drag on. The granularity of the work reviewed is therefore very important. You must be able to get through the walkthrough in one hour.
Walkthroughs find potential problems. They don’t have to prove that the problem exists. They don’t (indeed mustn’t) try to fix the problem; they might just indicate the direction in which the solution could lie.
Don’t ask a reviewer to attend more than two consecutive walkthroughs. And two consecutive walkthroughs should be the exception rather than the rule.
Ways to ruin walkthroughs
- Too many people
- Too many walkthroughs in a day
- Allowing them to drag on for hours
- Allowing them to degenerate into negativity, bitching, arguing, cleverness demonstrations, competitions or axe-grinding
- Using them as a substitute for formal inspections if formal inspections are necessary as well
Apart from the gains to the producer, there are no formal results. At the most formal end of the walkthrough spectrum, a green light/amber light/red light verdict is suggested at the conclusion of the walkthrough, and statistical ratios of green : amber : red might be published outside the walkthroughs, although never attributed. But in my experience this isn’t usually very helpful.