Author Archive

Agile Model Driven Development (AMDD)

March 17, 2009

One of the sessions I attended at SD West was ‘Agile Model Driven Development (AMDD)’ by Scott Ambler. Scott started the session by having us form groups of 4-5 people, and then gave us three assignments to work on – one each using the Traditional, Mini-Waterfall, and Agile methodologies. At the end of the three assignments we compared the team ‘scores’, and most teams did best on the Agile assignment. We then held a mini retrospective on what went well and not so well in all three approaches. One important revelation was that each team interpreted the same assignment a little differently.

Agile Modeling (AM)

  • AM is a chaordic, practices-based process for modeling and documentation
  • AM is a collection of practices based on several values and proven software engineering principles
  • AM is a light-weight approach for enhancing modeling and documentation efforts for other software processes
  • Types of Agile Models: The type of model is not important; we can use any tool that works for us. A model can be represented as a UI sketch or sticky notes on a whiteboard, as acceptance tests, a domain model, or a UML sequence diagram.

    Agile models:
    • Fulfill their purpose
    • Are understandable
    • Are sufficiently accurate
    • Are sufficiently consistent
    • Are sufficiently detailed
    • Provide positive value
    • Are as simple as possible

    Agile models are just barely enough!

Scott presented some comparisons between the Traditional, Waterfall, Iterative and Agile approaches. The data, slide decks, and original questions can be downloaded from www.ambysoft.com/surveys/. One interesting slide shows that Agile teams do quite a bit of planning, which dispels the misconception that Agile teams don’t plan! Most Agile teams plan using sketches on a whiteboard or similar; some teams capture this information (usually digitally).

The session also included some information on Agile Documentation:
How CRUFTy Are Your Documents?
Calculating the effectiveness of a document: Effectiveness = C * R * U * F * T
Where:
C = % of the document that is Correct
R = Chance that the document will be Read
U = Chance that the material will be Understood
F = Chance that the material will be Followed
T = Chance that the document will be Trusted
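To see how quickly these factors compound, here is a small worked example in Python; the percentages are hypothetical, not figures from the talk:

```python
# Hypothetical CRUFT example: each factor is a probability between 0 and 1.
factors = {
    "C (correct)":    0.9,  # 90% of the document is correct
    "R (read)":       0.6,  # 60% chance the document will be read
    "U (understood)": 0.7,  # 70% chance the material will be understood
    "F (followed)":   0.5,  # 50% chance the material will be followed
    "T (trusted)":    0.8,  # 80% chance the document will be trusted
}

effectiveness = 1.0
for value in factors.values():
    effectiveness *= value

print(f"Effectiveness: {effectiveness:.0%}")  # about 15% with these numbers
```

Even with fairly generous individual numbers, the overall effectiveness drops to roughly 15%, which is the point of the CRUFT acronym.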

What are Agile Documents?

• Focus on stable, not speculative concepts
• Are executable first, static only if you have to
• Maximize stakeholder ROI
• Are concise
• Fulfill a purpose
• Describe information that is less likely to change
• Describe “good things to know”
• Have a specific customer and facilitate the work efforts of that customer
• Are sufficiently accurate, consistent, and detailed

Some other useful resources on this topic are:

By: Veena Mankidy


Keynotes at SD West

March 12, 2009

Robert C. Martin gave a keynote address on Monday at SD West titled ‘Extreme Programming: After 10 years, why are we still talking about it?’, and the theme song for the talk was ‘I used to rule the world’ by Coldplay!! Extreme Programming has been adopted so widely in the industry that it is not discussed much any more, and hence may appear to be ‘dead’. This is similar to object-oriented design – nobody talks about it any more because everyone uses those principles. He also mentioned the Manifesto for Software Craftsmanship, which is worth a look.

Manifesto for Software Craftsmanship

The Wednesday keynote was ‘Software Development Strategies, Philosophies, and Techniques: Traditional vs. Agile’ by Scott Ambler and Terry Quatrani. It was a very entertaining talk in the form of a parody, with Scott playing the Windows guy (and the Traditional approach) and Terry the Mac (and Agile) person! They presented data from a Dr. Dobb’s survey from late 2008 that shows the difference between the Traditional, Ad-hoc, Agile and Iterative approaches, with the Agile and Iterative approaches doing much better (as expected).

Coding monkey vs the jack of all trades

Success rates of the different approaches

By: Veena Mankidy

User Stories and Such

March 10, 2009

I have been attending sessions and tutorials from the Agile track at SD West for the most part. I am interested in learning how the experts recommend we approach user stories, acceptance tests, AMDD, and Agile estimation and planning. This post is about the ‘From Stories to Automated Acceptance Tests’ session by Brett Schuchert. Here are some recommendations I have gathered:

• We need acceptance tests in the stories to let the devs know when they are ‘done’. We already do this, and our devs are usually good about pointing out when acceptance tests are incomplete or vague.
• Acceptance tests should be run on a staging machine before we mark the story as complete.
• Acceptance tests should not be technology specific. “Click the OK button” implies there is an OK button, which is a web design call, not a BA call. Instead, the BA should state the acceptance criterion as “The user indicates he is done”.
• An acceptance test should have only a few assertions; too many assertions are a sign that the story needs to be split up (see the sketch after this list).
• Too much mocking can lead to integration failures.
• User stories should be specific and concrete rather than abstract; examples may be provided to illustrate the point. A good story should not be vague, should not be time bound (e.g. depending on some event that might occur in the future – that is not testable), and should not be too broad.
• Include stories for the conflict and error cases we would like to handle.
• Remember INVEST and SMART.
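As a rough illustration of those points, here is a minimal, hypothetical acceptance test written with Python’s unittest. The story, the OrderCheckout class, and the method names are invented for this sketch; the intent is to show a test that avoids UI-specific wording (“the user indicates he is done” rather than “click the OK button”) and keeps to a few assertions.

```python
import unittest


# Hypothetical domain-level fixture; in a real project this would drive the
# application through a thin, technology-agnostic interface.
class OrderCheckout:
    def __init__(self):
        self.items = []
        self.completed = False

    def add_item(self, name, price):
        self.items.append((name, price))

    def indicate_done(self):
        # "The user indicates he is done" - no mention of buttons or screens.
        self.completed = True

    @property
    def total(self):
        return sum(price for _, price in self.items)


class CheckoutAcceptanceTest(unittest.TestCase):
    def test_user_completes_checkout(self):
        checkout = OrderCheckout()
        checkout.add_item("book", 25.00)
        checkout.add_item("pen", 2.50)
        checkout.indicate_done()

        # Only a few assertions; more than this would suggest splitting the story.
        self.assertTrue(checkout.completed)
        self.assertAlmostEqual(checkout.total, 27.50)


if __name__ == "__main__":
    unittest.main()
```

In a real project the fixture would drive the application through whatever interface the team exposes for testing, not an in-memory stand-in like this one.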

Acceptance test automation: There is a lot of talk at the conference about FitNesse and Slim, and I saw some examples of these being used for acceptance tests. Since FitNesse is wiki based, we can add text or images to describe the purpose of the tests; this additional information is ignored by FitNesse when it runs the tests. Also, the FitNesse configuration should not be machine specific (it should not include references to local paths, etc.).
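For readers who have not seen it, a FitNesse page mixes descriptive wiki text (which the test runner ignores) with tables that drive the tests. The sketch below is hypothetical – the fixture name and columns are invented – but it shows the general shape of a decision table, where the plain columns map to inputs on a fixture class and the columns ending in ? are checked against the fixture’s answers.

```
Any wiki text or images here describe the intent of the test and are ignored when the tests run.

|discount calculator|
|order total|member|discount?|
|100.00|yes|10.00|
|100.00|no|0.00|
```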

By: Veena Mankidy