User Stories and Such
I have mostly been attending sessions and tutorials from the Agile track at SD West. I am interested in learning how the experts recommend we write user stories and acceptance tests, and how they approach AMDD and Agile estimation and planning. This post is about the ‘From Stories to Automated Acceptance Tests’ session by Brett Schuchert. Here are some recommendations I have gathered:
- We need acceptance tests in the stories to let the devs know when they are ‘done’. We already do this, and our devs are usually good about pointing out when acceptance tests are incomplete or vague.
- Acceptance tests should be run on a staging machine before we mark the story as complete.
- Acceptance tests should not be technology specific. “Click the OK button” implies there is an OK button, which is a web design call, not a BA call. Instead, the BA should state the acceptance criterion as “The user indicates he is done.”
- An acceptance test should have only a few assertions. Many assertions in a single acceptance test are an indication that the story needs to be split up.
- Too much mocking can lead to integration failures.
- User stories should be specific and concrete rather than abstract; examples can be provided to illustrate the point. A good story should not be vague or too broad, and should not be time bound (e.g., hinging on some event that might occur in the future, which is not testable).
- Include stories of conflict or error that we would like to handle.
- Remember INVEST (Independent, Negotiable, Valuable, Estimable, Small, Testable) for stories and SMART (Specific, Measurable, Achievable, Relevant, Time-boxed) for tasks.
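To make the “technology agnostic, few assertions” advice concrete, here is a minimal sketch of an acceptance test for a hypothetical “complete order” story. The `Order` class and its methods are illustrative stand-ins I made up, not anything from the session; the point is that the test asserts on the domain outcome (the user indicates he is done) rather than on UI details like an OK button.

```java
// Hypothetical acceptance test for a made-up "complete order" story.
// It checks domain outcomes, not UI mechanics such as clicking a button.
public class CompleteOrderAcceptanceTest {

    // Minimal stand-in for the system under test (an assumption for
    // illustration, not a real API).
    static class Order {
        private boolean completed;

        void complete() { completed = true; }

        boolean isCompleted() { return completed; }
    }

    public static void main(String[] args) {
        Order order = new Order();

        // Given an open order, when the user indicates they are done...
        order.complete();

        // ...then the order is marked complete. One focused assertion;
        // a long list of assertions here would suggest splitting the story.
        if (!order.isCompleted()) {
            throw new AssertionError("order should be completed");
        }
        System.out.println("acceptance test passed");
    }
}
```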
Acceptance test automation: There is a lot of talk at the conference about FitNesse and Slim. I saw some examples of these being used for acceptance tests. Since FitNesse is wiki based, we can add text or images to describe the purpose of the tests; this additional information is ignored by FitNesse when the tests run. Also, the FitNesse configs should not be machine specific (they should not include references to local paths, etc.).
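For a sense of how Slim connects a wiki table to code, here is a sketch of the classic FitNesse decision-table example: the table's input columns map to setters on a plain Java fixture, and columns ending in “?” map to query methods. The `Division` fixture below follows the FitNesse documentation's example; the surrounding wiki text (shown as a comment) is exactly the kind of descriptive prose FitNesse ignores.

```java
// Slim decision-table fixture. In the FitNesse wiki page, a table like
// this would drive it (any prose around the table is ignored by FitNesse):
//
//   |Division|
//   |numerator|denominator|quotient?|
//   |10       |2          |5.0      |
//
// Slim calls setNumerator/setDenominator for the input columns,
// then quotient() for the column ending in "?".
public class Division {
    private double numerator;
    private double denominator;

    public void setNumerator(double numerator) {
        this.numerator = numerator;
    }

    public void setDenominator(double denominator) {
        this.denominator = denominator;
    }

    public double quotient() {
        return numerator / denominator;
    }
}
```

Because the fixture is plain Java with no machine-specific paths baked in, the same test page can run unchanged on a developer box or the staging machine.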
By: Veena Mankidy