Sunday, June 26, 2011

Scrum Metrics

There are lots of metrics that could be collected to assess a software development team's competency, success, inventiveness, quality, and quantity of work. To get an idea of what metrics one could collect, take a look at the 185 practices in CMMI for Development. Agile, unlike CMMI, doesn't require "evidence" that engineering practices are being followed and therefore has few metrics that a Scrum Team may collect to measure the success of each sprint. Below are 9 metrics that a Scrum Team might consider using.
  1. Actual Stories Completed vs. Committed Stories
  2. Technical Debt Management
  3. Team Velocity
  4. Quality Delivered to Customer
  5. Team Enthusiasm
  6. Retrospective Process Improvement
  7. Communication
  8. Scrum Team's Adherence to Scrum Rules & Engineering Practices
  9. Development Team's Understanding of Sprint Scope and Goal
To answer the question of who should be collecting metrics and measuring the Scrum Team's success, consider who in Scrum is responsible for the team's success. In describing the role of ScrumMaster, the Scrum Guide states, "The ScrumMaster teaches the Scrum Team by coaching and by leading it to be more productive and produce higher quality products." Measuring the team's success, then, falls to the ScrumMaster, if only as a means of increasing the team's productivity and product quality. The ScrumMaster could use a spider chart, as shown below, to track the Scrum Team.

Using a spider chart is an easy way for the ScrumMaster to track and compare results from sprint to sprint.

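For readers who want something concrete behind the chart, the tracking reduces to nine 1-5 scores per sprint. A minimal sketch in Python that flags sprint-over-sprint regressions; all metric names and scores below are illustrative, not from the post:

```python
# Hypothetical sprint-over-sprint tracking of the nine metric scores
# that would feed the spider chart. All scores are illustrative.
METRICS = [
    "Stories Completed vs. Committed", "Technical Debt Management",
    "Team Velocity", "Quality Delivered to Customer", "Team Enthusiasm",
    "Retrospective Process Improvement", "Communication",
    "Adherence to Scrum Rules", "Understanding of Scope and Goal",
]

def sprint_delta(prev, curr):
    """Per-metric change between two sprints, for spotting regressions."""
    return {m: c - p for m, p, c in zip(METRICS, prev, curr)}

sprint_4 = [4, 3, 4, 5, 3, 2, 4, 4, 3]  # 1-5 scores, hypothetical
sprint_5 = [5, 3, 4, 4, 4, 3, 4, 4, 3]
for metric, change in sprint_delta(sprint_4, sprint_5).items():
    if change < 0:
        print(f"Regressed: {metric} ({change:+d})")
```

A drop on any axis between two sprints is the cue for the ScrumMaster to dig into that metric's list of possible causes below.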
Below are short descriptions of each metric, how it can be measured, and some of the issues that could lead to a low score. Any additional metrics or comments on these 9 would be most welcomed.

1.  Actual Stories Completed vs. Committed Stories
This metric is used to measure the development team's ability to know and understand their capabilities. The measure is taken by comparing the number of stories committed to in sprint planning and the number of stories identified in the sprint review as completed. A low score may indicate any of the following may need special attention:
  • Team does not have a reference story to make relative estimates (see Team Velocity),
  • Not every Team member understands the reference story (see Team Velocity),
  • Product Owner isn't providing enough information to the development team (see Communication, Development Team's Understanding of Sprint Scope and Goal),
  • Requirements scope creep (see Development Team's Understanding of Sprint Scope and Goal),
  • Team is being disrupted (see Scrum Team's Adherence to Scrum Rules & Engineering Practices, Team Enthusiasm).
Even if the Team completes the stories they committed to, there are a few things the ScrumMaster should be looking for:
  • Team under-commits and works at a slower than 'normal' pace (see Team Velocity),
  • Team has one or more 'heroes' (see Team Enthusiasm),
  • Story commitment is met but the product is 'buggy' (see Technical Debt Management, Quality Delivered to Customer).
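One way to turn the committed-vs-completed comparison into the 1-5 score on the spider chart is to band the commitment error. The post only fixes one threshold (the Nokia Test's 10% band for a top score), so the lower bands in this sketch are assumptions:

```python
# Sketch: scoring "Actual Stories Completed vs. Committed" on a 1-5 scale.
# Only the <=10% band for a 5 comes from the post (via the Nokia Test);
# the remaining bands are illustrative assumptions.
def commitment_score(committed: int, completed: int) -> int:
    if committed == 0:
        return 1  # nothing committed: the planning process itself needs attention
    error = abs(completed - committed) / committed
    if error <= 0.10:
        return 5
    elif error <= 0.20:
        return 4
    elif error <= 0.35:
        return 3
    elif error <= 0.50:
        return 2
    return 1

print(commitment_score(10, 10))  # -> 5
print(commitment_score(10, 7))   # -> 3 (30% miss)
```

Note that over-delivering is penalized the same as under-delivering, matching the idea that the metric measures the team's self-knowledge, not raw output.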

2.  Technical Debt Management
This metric measures the team's overall technical indebtedness: the known problems and issues being delivered at the end of the sprint. This is usually counted in bugs but could also cover deliverables such as training material, user documentation, and delivery media. A low score may indicate any of the following may need special attention:
  • Customer is not considered during sprint (see Quality Delivered to Customer),
  • Weak or missing 'Definition of Done' (a strong one includes zero introduced bugs) (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Management interfering with the Team, forcing delivery before the Team is ready (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Team working multiple stories such that team compromises quality to complete stories as end of sprint nears (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Team is creating bugs that reflect their opinion on how things should work rather than listening to the Customer (see Scrum Team's Adherence to Scrum Rules & Engineering Practices, Quality Delivered to Customer).
Even if technical debt didn't increase, here are a few things the ScrumMaster should be looking for:
  • Team is not documenting problems found or fixed (see Communication, Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Team is not engaging with Customer/Product Owner during user acceptance testing (see Communication, Scrum Team's Adherence to Scrum Rules & Engineering Practices).
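As a rough sketch, the debt count at sprint end is just a running tally of known problems; the numbers below are hypothetical:

```python
# Sketch: technical-debt delta across a sprint. "Items" covers anything the
# post counts as debt: bugs, missing docs, training material, delivery media.
def debt_delta(open_items_start: int, found: int, fixed: int) -> int:
    """Known problems carried out of the sprint."""
    return open_items_start + found - fixed

start = 12  # hypothetical open items entering the sprint
print(debt_delta(start, found=5, fixed=7))  # -> 10: debt reduced
print(debt_delta(start, found=9, fixed=2))  # -> 19: debt grew, investigate
```

Per the author's later comment in the thread, ending a sprint with no more debt than it started with would merit the top score; backlog items aimed specifically at debt reduction should be factored in before judging.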

3.  Team Velocity
The velocity metric measures the consistency of the team's estimates from sprint to sprint. Feature story estimates are made relative to estimates of other feature stories, usually using story points. The measure is made by comparing story points completed in this sprint with points completed in the previous sprint, within +/- 10% (the tolerance the Nokia Test uses). A low score may indicate any of the following may need special attention:
  • Product Owner isn't providing enough information to the development team (see Communication, Development Team's Understanding of Sprint Scope and Goal),
  • Team size is changing between sprints (generally, the core team must be consistent; allowances should be made for absences e.g. vacation, sick),
  • Team doesn't understand the scope of work at the start of the sprint (see Development Team's Understanding of Sprint Scope and Goal, Communication, Actual Stories Completed vs. Committed Stories),
  • Team is being disrupted (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Team's reference feature stories are not applicable to the current release  (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Team is doing very short release cycles (< 3 sprints) or doing maintenance work (the Team might consider Kanban or XP over Scrum under these circumstances).
Even if the Team seems to have a consistent velocity, there are a few things the ScrumMaster should be looking for:
  • Team under-commits and works at a slower than 'normal' pace (see Actual Stories Completed vs. Committed Stories),
  • Team has one or more 'heroes' (see Team Enthusiasm),
  • Velocity is consistent but the product is 'buggy' (see Technical Debt Management, Quality Delivered to Customer).
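The +/- 10% consistency check is easy to make mechanical; a sketch, with hypothetical point totals:

```python
# Sketch: flagging velocity consistency sprint over sprint using the
# +/-10% band from the Nokia Test. Point totals are hypothetical.
def velocity_consistent(prev_points: int, curr_points: int,
                        tolerance: float = 0.10) -> bool:
    """True if this sprint's completed points stayed within tolerance of last."""
    return abs(curr_points - prev_points) <= tolerance * prev_points

print(velocity_consistent(40, 43))  # True: a 7.5% change
print(velocity_consistent(40, 30))  # False: a 25% drop, worth investigating
```

A False result is not a verdict, just a prompt to check the causes listed above (team-size changes, disruptions, stale reference stories, and so on).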

4.  Quality Delivered to Customer
In most companies, delivering a quality product and keeping the customer happy are the only reasons for being. Scrum attempts to have the outcome of every sprint provide value to the customer, i.e. a 'potentially releasable piece of the product'. This is not necessarily a product that is released, but one that can be shown to customers as a work in progress to solicit their comments, opinions, and suggestions, i.e. are we building the product the customer needs? This is best measured by surveying the customers and stakeholders. Below is a spider chart of a customer survey taken after the Sprint Review.

The ScrumMaster can document all customer or stakeholder opinions using an easy to read spider chart.


A low score may indicate any of the following may need special attention:
  • Product Owner doesn't really understand what the customer wants and needs (see Communication),
  • The Product Owner is not adequately communicating the customer needs to the development team (see Communication, Development Team's Understanding of Sprint Scope and Goal),
  • Customers are not involved in the development of stories (see Communication),
  • Customer is not involved with defining story acceptance criteria (see Communication)
  • Bugs are delivered with the product (see Technical Debt Management).
Even if the customer is satisfied with the sprint product, there are a couple things the ScrumMaster should be looking for:
  • The product is 'buggy' (see Technical Debt Management),
  • Not all customers and stakeholders are invited, show up, or participate in the sprint review (see Scrum Team Adherence to Scrum Rules & Engineering Practices, Communication).
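The survey itself can be aggregated simply; a sketch with hypothetical questions and responses:

```python
# Sketch: aggregating the post-Sprint-Review customer survey into one
# 1-5 average per question. Questions and responses are hypothetical.
from statistics import mean

responses = {
    "Meets your needs so far": [4, 5, 3],
    "Right features prioritized": [4, 4, 4],
    "Quality of what was demoed": [2, 3, 3],
}

averages = {q: round(mean(scores), 1) for q, scores in responses.items()}
for question, avg in averages.items():
    print(f"{question}: {avg}")
```

Each question's average becomes one axis of the customer-survey spider chart, so low axes point directly at the conversation to have with the Product Owner.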

5.  Team Enthusiasm

Enthusiasm is a major component of a successful Scrum Team. If the team doesn't have it, no process or methodology is going to help. There are many early warning signs that, left unchecked, can lead to a loss of enthusiasm. It's up to the ScrumMaster to recognize these symptoms and take appropriate action for each particular circumstance if the ScrumMaster is to keep the Team productive and happy. Measuring team enthusiasm can be very subjective and is best accomplished through observations during the various sprint meetings: planning, review, and daily stand-up. However, the simplest approach is to ask the Scrum Team directly: "Do you feel happy?" and "How motivated do you feel?" A low score may indicate any of the following may need special attention:
  • ScrumMaster is taking too long to remove impediments,
  • The number of impediments during the sprint was high,
  • The team is being interrupted by managers or the Product Owner (see Actual Stories Completed vs. Committed Stories),
  • Some team members can't contribute in certain product areas i.e. lack of cross-functional training,
  • Team members are working long hours i.e. not working at a sustainable pace (see Actual Stories Completed vs. Committed Stories),
  • Internal conflicts between Product Owner and Team,
  • Internal conflicts between Team members,
  • Team is repeating same mistakes (see Retrospective Process Improvement),
  • Team is continually grumbling, complaining, or bickering,
  • Team member(s) don't have passion for their work,
  • The Team is not being creative or innovative.

6.  Retrospective Process Improvement

The Retrospective Process Improvement metric measures the Scrum Team's ability to revise, within the Scrum process framework and practices, its development process to make it more effective and enjoyable for the next sprint. From the Scrum Guide, "The purpose of the [Sprint] Retrospective is to inspect how the last Sprint went in regards to people, relationships, process and tools. The inspection should identify and prioritize the major items that went well and those items that-if done differently-could make things even better. These include Scrum Team composition, meeting arrangements, tools, definition of 'done,' methods of communication, and processes for turning Product Backlog items into something 'done.'" This can be measured using the count of retrospective items identified, the number of retrospective items the Team had committed to addressing in the sprint, and the number of items worked/resolved by the end of the sprint. A low score may indicate any of the following:
  • Team doesn't identify any items for improvement due to feeling there's nothing they can do or it's out of their control i.e. Team not self-organizing and managing,
  • During sprint, Product Owner, ScrumMaster, or management discourages work on self-improvement at the expense of feature stories,
  • During the sprint, the Team discourages work on self-improvement at the expense of feature stories,
  • Team doesn't look inward at their own performance and environment during the retrospective,
  • Team is not acknowledging or addressing repeated mistakes (see Team Enthusiasm).
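The three counts named above (items identified, committed to, and resolved) can be folded into a single 1-5 score; the mapping in this sketch is an assumption, not from the post:

```python
# Sketch: scoring Retrospective Process Improvement from the three counts
# the post names. Mapping the completion ratio onto 1-5 is an assumption.
def retro_score(identified: int, committed: int, resolved: int) -> int:
    if identified == 0 or committed == 0:
        return 1  # no improvement items at all is itself a warning sign
    done_ratio = resolved / committed
    # Map the completion ratio onto the 1-5 scale used in the spider chart.
    return max(1, min(5, 1 + round(done_ratio * 4)))

print(retro_score(identified=6, committed=3, resolved=3))  # -> 5: all done
print(retro_score(identified=6, committed=3, resolved=1))  # -> 2
```

Zeroing the score when no items are identified reflects the first bullet above: a team that finds nothing to improve is usually not self-organizing, not a team that has finished improving.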

7.  Communication

The Communication metric is a subjective measure of how well the Team, Product Owner, ScrumMaster, Customers, and stakeholders are conducting open and honest communications. The ScrumMaster, while observing and listening to the Team, Product Owner, Customers, and other stakeholders throughout the sprint, will get indications and clues as to how well everyone is communicating. A low score may indicate any of the following:
  • The Customer is not actively involved with feature story development (see Quality Delivered to Customer),
  • The Customer is not providing acceptance criteria for stories (see Quality Delivered to Customer, Technical Debt Management),
  • The Team is not providing the acceptance tests to the Customer for review and comment (see Quality Delivered to Customer, Technical Debt Management),
  • The Team and Customer are not running acceptance tests together (see Quality Delivered to Customer, Technical Debt Management),
  • The Customer(s) and other stakeholders are not invited/present at the Sprint Review (see Quality Delivered to Customer),
  • The ScrumMaster isn't ensuring Customers are surveyed after each Sprint Review (see Quality Delivered to Customer),
  • The Product Owner isn't available to the Scrum Team for some portion of each day for collaboration (see Scrum Team Adherence to Scrum Rules & Engineering Practices),
  • The Product Owner's stories do not address features from the Customers' perspective, e.g. stories are implementation specific (see Scrum Team Adherence to Scrum Rules & Engineering Practices, Quality Delivered to Customer),
  • Product Owner isn't providing information on the Customer 'pain' or needs to the Team (see Actual Stories Completed vs. Committed Stories),
  • Team is not documenting problems found or fixed (see Technical Debt Management),
  • Product Owner doesn't really understand what the customer wants and needs (see Quality Delivered to Customer),
  • The Product Owner is not adequately communicating the customer needs to the development team (see Quality Delivered to Customer, Actual Stories Completed vs. Committed Stories),
  • Team is not conducting the daily meeting (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting release planning (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting sprint planning (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting a sprint review (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not conducting a retrospective (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying the release burndown chart (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying the sprint burndown chart (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying stories and acceptance criteria (see Scrum Team's Adherence to Scrum Rules & Engineering Practices),
  • Scrum Team is not visibly displaying non-functional requirements that apply to the entire sprint or release (see Scrum Team's Adherence to Scrum Rules & Engineering Practices).

8.  Scrum Team's Adherence to Scrum Rules & Engineering Practices

The rules for scrum are defined in the Scrum Guide by Ken Schwaber and Jeff Sutherland. Although scrum doesn't prescribe engineering practices as does XP, most companies will have several of these defined for software engineering projects. The ScrumMaster is responsible for holding the Scrum Team accountable to these rules and any engineering practices defined. This metric can be measured by counting the infractions that occur during each sprint. A low score may indicate any of the following:
  • Management or Product Owner interfering with the Team, forcing delivery before the Team is ready (see Technical Debt Management),
  • Weak or missing 'Definition of Done' (a strong one includes zero introduced bugs) (see Technical Debt Management),
  • Customer is not considered during sprint (see Technical Debt Management),
  • Team is creating bugs that reflect their opinion on how things should work rather than listening to the Customer (see Technical Debt Management),
  • Team is not documenting problems found or fixed (see Technical Debt Management)
  • Team is not engaging with Customer/Product Owner during user acceptance testing (see Technical Debt Management),
  • Team is being disrupted (see Actual Stories Completed vs. Committed Stories, Team Velocity),
  • ScrumMaster is not protecting Team from disruptions,
  • Team's reference feature stories are not applicable to the current release (see Team Velocity)
  • Not all customers and stakeholders are invited, show up, or participate in the sprint review (see Quality Delivered to Customer),
  • ScrumMaster is not ensuring that the Scrum Team adheres to Scrum values, practices, and rules,
  • ScrumMaster is not ensuring that the Scrum Team adheres to company, departmental, and Scrum Team engineering rules and practices,
  • ScrumMaster is not teaching the Scrum Team by coaching and by leading it to be more productive and produce higher quality products,
  • ScrumMaster is not helping the Scrum Team understand and use self-organization and cross-functionality,
  • ScrumMaster is not ensuring the Scrum Team has a workable 'Definition of Done' for stories,
  • ScrumMaster is not ensuring the Team has a daily meeting,
  • ScrumMaster is not ensuring the release burndown chart is prominently displayed,
  • ScrumMaster is not ensuring the sprint burndown chart is prominently displayed,
  • ScrumMaster is not ensuring the sprint stories and acceptance criteria are prominently displayed,
  • ScrumMaster is not ensuring sprint planning meeting is held,
  • ScrumMaster is not ensuring the sprint review meeting is held,
  • ScrumMaster is not ensuring the sprint retrospective meeting is held.

9.  Development Team's Understanding of Sprint Scope and Goal

The Team Understanding of Sprint Scope and Goal metric is a subjective measure of how well the Customer, Product Owner, and Team interact, understand, and focus on the sprint stories and goal. The sprint goal is broad and states the general intention of the sprint. The goal is usually an abstraction and is not testable as such, but is aligned with the intended value to be delivered to the Customer at the end of the sprint. The sprint scope, or objective, is defined in the acceptance criteria of the stories. The stories and acceptance criteria are 'placeholders for a conversation' and will not be incredibly detailed. The Product Owner and Customer define the acceptance criteria, but the Product Owner alone defines the sprint goal. The ScrumMaster can use the scoring from "Actual Stories Completed vs. Committed Stories" and "Team Velocity" as indications of problems understanding the stories and goal, but this is best determined through day-to-day contact and interaction with the Scrum Team. A low score may indicate any of the following:
  • Product Owner isn't providing enough information to the development team i.e. needs of the Customer (see Actual Stories Completed vs. Committed Stories, Team Velocity, Communication),
  • Product Owner is not writing a sprint goal,
  • Product Owner doesn't understand what an incremental development approach means,
  • Requirements scope creep: Product Owner is adding to the scope of the stories or adding additional acceptance criteria that were not agreed to during planning (see Actual Stories Completed vs. Committed Stories, Team Velocity, Communication),
  • Team doesn't understand the scope of work at the start of the sprint (see Team Velocity, Communication).

12 comments:

  1. Just one question: how could you measure Actual Stories Completed vs. Committed Stories on a five-unit scale? What does a 3 mean in this case?
    And how does it work for Team Velocity? What is a 5? Etc.

  2. Anonymous asked: How could you measure Actual Stories Completed vs. Committed Stories in a five unit scale?

    The Actual Stories Completed vs. Committed Stories KPI can be used to measure the team’s understanding of their own capabilities. I assume that the measure is taken as a running average over some number of sprints; the number of sprints should be agreed upon by the entire Scrum Team. The scale used is arbitrary and is most likely selected based upon the scale used throughout your company or your personal preference. I chose a scale of 5 on the spider chart example because that’s what I’m used to seeing but any range can be used. Of course, if the team falls short or exceeds the committed number of stories at the end of a sprint, the team should discuss the reasons and recommend solutions at their retrospective.

    Let me know if this helps - Bob

  3. Well, the aim of the Actual Stories Completed vs. Committed Stories KPI is clear to me. But how is the metric formulated? Does every team member give an answer, formally a point between 1 and 5, where 1 is the worst performance (actual stories completed <<< committed stories) and 5 is the maximum performance (actual stories completed > committed stories)? Am I right?

    Mark

  4. I would probably not automatically give the team a 5 if they completed more stories than they took on. The KPI is meant to indicate the team’s ability to know what they can do. It also indirectly tracks the team’s ability to do what’s necessary in order to understand the user story’s scope and magnitude. The Nokia Test gives the higher score to a team with an estimation error of less than 10%. Over committing or under committing by greater than 10% would most likely indicate the team has room for improvement. - Bob

  5. Hi,

    This is a really useful set of metrics, but I would like to know how exactly you have measured the score for these, especially Retrospective Process Improvement, Technical Debt, and Team Enthusiasm. I am interested in how you have quantified these KPIs. Please do share, as I would like to use them within my teams.
    Regards
    Sonali

  6. Hi Sonali,

    The measures I would recommend for the Retrospective Process Improvement, Technical Debt Management, and Team Enthusiasm KPI's are listed below. These measures may or may not suit your specific circumstances but are general enough to give your teams some insights on how they're doing.

    Retrospective Process Improvement

    • Count of improvement items identified in the Retrospective - 1 or more is expected.
    • Ratio of improvement items the Team committed to addressing in the next sprint to the number of items resolved by the end of that sprint - all items committed to should be completed by the end of the sprint,
    • Scrum Master provides their opinion of team's enthusiasm to improve itself (1-5).

    For example, the Development Team has decided to increase their automated acceptance test coverage of stories from 25% to 40% in the next sprint. If, by the end of the next sprint, the team has reached 35%, they would probably score a 4 on the 1-5 scale.

    Technical Debt Management

    • Count of defects at the beginning and end of the sprint - there should not be more defects at the end of the sprint than there were at the start. You would need to take into account any product backlog items in the sprint that specifically reduce technical debt.

    For example, if the number of outstanding defects for the Development Team remains unchanged after the sprint, the team would score 5 on a 1-5 scale.

    Team Enthusiasm

    • Each Development Team member scores themselves on how much they're enjoying their work (1-5). Scrum Master surveys the Team in the retrospective.
    • Scrum Master provides his or her opinion of team enthusiasm through observations of the Development Team during the various sprint meetings: planning, review, retrospective, and daily stand-up (1-5).

  7. Interesting blog. It would be great if you could provide more details about it. Thank you



    Scrum Proces

  8. Congratulations! This is a great post. Thanks for taking the time to share such nice information.
    Agile Training

  9. Thank you for all this info! It's really helpful to me.

    What tool(s) do you recommend to make metrics process gathering easier to get as well as to manage the whole scrum project?

  10. Personally I think tracking metrics is essential as it enables you to foresee possible issues so that you can take action proactively, rather than do firefighting later.

    We're currently in the process of coming up with a set of metrics/visualisations that can support agile development process. The idea is to have them on rotation on a large display. If interested, the current screenshots can be seen at http://screenful.me/tour/

  11. Hello,
    The Information which you provided is very much useful for Agile Training Learners. Thank You for Sharing Valuable Information.

  12. Thanks Bob.
    Your post was very helpful to me.

