Peer Reviews and Measuring Software Quality

At Zenkara, we use fairly straightforward definitions:

  • Walkthrough: an informal page-by-page or section-by-section presentation of a document, piece of code, test script, etc. by the lead author – usually with lots of discussion
  • Peer Review: a section-by-section review of a document (or code, test script, etc.) looking for defects, but also with plenty of lively discussion about potential solutions
  • Inspection: a formal review looking for defects. Potential solutions are noted but followed up after the meeting

Unfortunately these reviews are often hated with a vengeance by technical staff.  It’s human nature to dislike these activities, but the degree of cooperation you get often comes down to how the reviews are run and the culture and behaviour of the organisation.  If they’re viewed in a positive, product-oriented light, they can be useful for finding defects, generating good discussion, and building a consistent vision of the project.  It’s critical to focus on the product and not the author – otherwise it gets personal, and that achieves nothing except destroying morale.  That’s why it’s important to recruit or train effective facilitators/moderators.

It’s also important to cultivate basic metrics, as they’re a window into the soul of the development team – and, by extension, the product itself.

But which metrics are useful?  That depends, of course, on the organisation, the style of development (more or less structure, process, and so on), and the type of product.

A set you could use is here.  You can collect the data in a spreadsheet or database.  Collecting this data gives everyone a better understanding of what’s happening; among other things, it shows which defects occur regularly, so you can do something about them.
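
As a rough illustration (not Zenkara’s actual tooling), here is a minimal sketch of collecting review records and summarising which defect categories keep recurring.  The field names and categories are assumptions made up for the example.

    from collections import Counter
    from dataclasses import dataclass

    # Hypothetical review record; the fields are illustrative, not a prescribed schema.
    @dataclass
    class ReviewRecord:
        artifact: str         # e.g. "payment-spec.md" or "billing.c"
        review_type: str      # "walkthrough", "peer review" or "inspection"
        defect_category: str  # e.g. "missing requirement", "boundary condition"
        severity: str         # e.g. "major", "minor"
        effort_hours: float   # preparation plus meeting time

    def summarise(records: list[ReviewRecord]) -> None:
        """Show which defect categories occur most often, so recurring
        problems can be targeted with checklists, training or design changes."""
        by_category = Counter(r.defect_category for r in records)
        total_effort = sum(r.effort_hours for r in records)
        print(f"{len(records)} defects found in {total_effort:.1f} review hours")
        for category, count in by_category.most_common():
            print(f"  {category}: {count}")

    # Example with made-up data:
    summarise([
        ReviewRecord("payment-spec.md", "inspection", "missing requirement", "major", 1.5),
        ReviewRecord("billing.c", "peer review", "boundary condition", "minor", 1.0),
        ReviewRecord("billing.c", "peer review", "missing requirement", "major", 1.0),
    ])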

To improve the quality of the reviews, you can use a couple of simple checklists:

  • management plan checklist
  • specification checklist
  • coding checklist
  • testing checklist

And how do we make sure they’re useful and actually get used?  By adding typical defects found in earlier reviews.
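
As a rough sketch of that feedback loop (the checklist names and items below are invented examples, not Zenkara’s actual checklists), a checklist can be kept as simple structured data and grown by feeding back defects found in earlier reviews:

    # Hypothetical checklists keyed by checklist name; the items are invented examples.
    checklists = {
        "specification": [
            "Are all requirements testable?",
            "Are boundary values and error cases specified?",
        ],
        "coding": [
            "Are return codes and exceptions handled?",
            "Are off-by-one risks at loop boundaries checked?",
        ],
    }

    def add_checklist_item(checklists: dict, name: str, item: str) -> None:
        """Feed a defect class found in an earlier review back into the
        checklist, so the next review looks for it explicitly."""
        items = checklists.setdefault(name, [])
        if item not in items:  # avoid duplicate entries
            items.append(item)

    # A recurring defect from past reviews becomes a new checklist question:
    add_checklist_item(checklists, "coding",
                       "Is shared state protected against concurrent access?")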
