How long should our sprints be? This is a question I am frequently asked by new scrum masters and scrum teams. Here is how it showed up in my inbox recently.
Since participating in Agile Learning Labs’ Certified Scrum Master (CSM) workshop, my colleagues and I have been practicing scrum very seriously. We chose one week as our sprint length. Some developers feel one-week sprints are too short, since we have a very strong definition of done. Delivering visible work in one week, along with all of the time spent in scrum meetings, is too stressful. One team member suggested increasing our sprint length to two weeks. What are your thoughts?
Thanks for the question! The short answer is: keep your sprints short, and find and fix the sources of the stress you are feeling. All too frequently, when scrum uncovers a problem, we seek to change the way we are doing scrum in order to cover the problem back up. Have a look at this post about story point accounting for another example of this tendency. A better response is to address the underlying root causes of the problem.
For your team, it is unlikely that the underlying problem is a lack of time in a one-week sprint to get user stories done. More likely, the team is dealing with one or more of the following problems: Read the full article…
I was inspired to create a retrospective game for agile teams, based on the game Dixit. Dixit is played with picture cards, each bearing an unusual drawing. The Agile Learning Labs team used this retrospective recently in one of our sprint retrospectives, and it worked well. Give it a try with your team and leave a comment to let me know how it works for you.
The Team Estimation Game feels like play, but it accomplishes valuable work: assigning story point estimates to user stories.
Teams using this technique are typically able to estimate 20 to 60 stories in an hour. The game was invented by our friend and colleague, Steve Bockman. Here is how one team plays the game: Read the full article…
Seriously, this video (via David Chilcott, via Mitchell Levy) makes me think: I want to do this with people some day. It may be in software, or it may be in publishing, it may be in basket weaving (it certainly won’t be in guitar playing or singing), but I want to be one of these guys!
Our friend and colleague David Parker is leaving Agile Learning Labs’ staff. He has received a much better offer, and one we can’t possibly counter: that of stay-at-home dad to Chase Kamran Parker-Katiraee, who assumed his post of infant-in-chief earlier this week.
We predict a fair bit of wrangling over just who is the customer and who the product owner on this particular project, but anticipate that development will flourish nonetheless. If we’re lucky, David and his wife Layla will supply us with lots and lots of adorable sprint demos along the way. Our compliments to the team!
We’ve had a particularly busy month here at Agile Learning Labs: the phone is ringing off the hook, so to speak, and our sales and biz dev team has been very, um, agile. As in light on their feet. We thought they might need a good laugh, so this is a thank you to Steve and Laura! Party on, peeps!
Thank you for the certified scrum master training last week in Beijing. Your training was very impressive, and I appreciate it a lot. I asked you a lot of questions; may I ask one more? In our company, automation for regression tests hasn’t been set up yet. Without automated regression tests, unit tests, and pair programming, how can our scrum team improve the quality of the product?
First, let me encourage you to keep up the work to automate your regression tests. Few things have as big a return on investment. Test automation enables the team to move much faster and make improvements fearlessly. The other practices you mention, unit testing and pair programming, are also great, and I encourage your team to try them too.
Having said that, your question was what else your team could do. Additional practices I would recommend your team consider are: code reviews, frequent testing by real users, testing bashes, and whole-team ownership of quality and testing.
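As a concrete starting point for the test automation mentioned above, an automated regression test can be as simple as a unit test that pins down behavior the team has already shipped. Here is a minimal sketch using Python’s built-in unittest module; the `apply_discount` function and its rules are hypothetical stand-ins for whatever behavior your product already has.

```python
import unittest

# Hypothetical production code: a pricing rule the team has already shipped.
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    """Regression tests: pin down current behavior so a future change
    can't silently break it."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.00, 20), 80.00)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(49.99, 0), 49.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 150)

if __name__ == "__main__":
    unittest.main()
```

Run with `python -m unittest` on every commit; as the suite grows, it becomes the safety net that lets the team refactor fearlessly.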