Agile Requirements Modeling

Many traditional project teams run into trouble when they try to define all of the requirements up front, often the result of a misguided belief that developers will actually read and follow what the requirements document contains.  The reality is that the requirements document is usually insufficient regardless of how much effort goes into it, the requirements change anyway, and the developers eventually end up going directly to their stakeholders for information (or they simply guess at what their stakeholders meant).  Agilists know that if they are able to elicit detailed requirements up front, then they can just as easily do so later, when they actually need the information.  They also know that any investment in detailed documentation early in the project will be wasted when the requirements inevitably change, so they choose not to spend time early in the project writing detailed requirements documents.

 

Table of Contents

  1. Agile requirements modeling in a nutshell
    • Initial requirements envisioning
    • Iteration modeling
    • JIT model storming
    • Acceptance test-driven development (ATDD)
  2. Where do requirements come from?  
  3. Best practices 
  4. Types of requirements
  5. Potential requirements artifacts
  6. Techniques for eliciting requirements
  7. Common requirements modeling challenges  
  8. Agile requirements change management

 

1. Agile Requirements Modeling in a Nutshell

Figure 1 depicts the Agile Model Driven Development (AMDD) lifecycle, which shows how Agile Modeling (AM) is applied by agile software development teams.  The critical aspects we're concerned with right now are initial requirements modeling, iteration modeling, model storming, and acceptance test-driven development (ATDD).  The fundamental idea is that you do just barely enough modeling at the beginning of the project to understand the requirements for your system at a high level, then you gather the details as you need them on a just-in-time (JIT) basis.

 

Figure 1. The AMDD lifecycle.


 

1.1 Initial Requirements Modeling

At the beginning of a project you need to take several days to envision the high-level requirements and to understand the scope of the release (what you think the system should do).  Your goal is to get a gut feel for what the project is all about, not to document in detail what you think the system should do: the documentation can come later, if you actually need it.  For your initial requirements model my experience is that you need some form of:

  1. Usage model.  As the name implies, usage models enable you to explore how users will work with your system.  This may be a collection of essential use cases on a Rational Unified Process (RUP) project, a collection of features for a Feature Driven Development (FDD) project, or a collection of user stories for an Extreme Programming (XP) project. 
  2. Initial domain model.  A domain model identifies fundamental business entity types and the relationships between them.  Domain models may be depicted as a collection of Class Responsibility Collaborator (CRC) cards, a slim UML class diagram, or even a slim data model.  This domain model will contain just enough information: the main domain entities, their major attributes, and the relationships between these entities.  Your model doesn't need to be complete, it just needs to cover enough information to make you comfortable with the primary domain concepts.  
  3. User interface model.  For user interface intensive projects you should consider developing some screen sketches or even a user interface prototype.
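To make the second item concrete, here is what a deliberately slim domain model might look like if it were sketched in code rather than on CRC cards or a whiteboard. This is a hypothetical sketch for the seminar-enrollment example used later in this article; the entity and attribute names are illustrative assumptions, not the output of any real modeling session.

```python
# A deliberately slim domain model for a seminar-enrollment system:
# just the main entities, their major attributes, and the
# relationships between them.  All names here are illustrative.
from dataclasses import dataclass, field


@dataclass
class Student:
    name: str
    student_number: str


@dataclass
class Seminar:
    title: str
    fee: float
    # Relationship: a seminar has many enrolled students.
    enrolled: list = field(default_factory=list)

    def enroll(self, student: Student) -> None:
        self.enrolled.append(student)


seminar = Seminar(title="Intro to Agile Modeling", fee=500.0)
seminar.enroll(Student(name="Jane Doe", student_number="12345"))
```

In practice a model this slim would more likely live on index cards or a whiteboard; the point is how little detail you need at this stage.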

What level of detail do you actually need?  My experience is that you need requirements artifacts which are just barely good enough to give you this understanding and no more.  For example, Figure 2 depicts a simple point-form use case.  This use case could very well have been written on an index card, a piece of flip chart paper, or on a whiteboard.  It contains just enough information for you to understand what the use case does, and in fact it may contain far too much information for this point in the lifecycle because just the name of the use case might be enough for your stakeholders to understand the fundamentals of what you mean.  Figure 3, on the other hand, depicts a fully documented formal use case.  This is a wonderful example of a well documented use case, but it goes into far more detail than you possibly need right now. If you actually need this level of detail, and in practice you rarely do, you can capture it when you actually need to by model storming it at the time.  The longer your project team goes without the concrete feedback of working software, the greater the danger that you're modeling things that don't reflect what your stakeholders truly need.

 

The urge to write requirements documentation should be transformed into an urge to instead collaborate closely with your stakeholders and then create working software based on what they tell you.

 

Figure 2. A point-form use case.

Name: Enroll in Seminar

Basic Course of Action:

  • Student inputs her name and student number

  • System verifies the student is eligible to enroll in seminars. If not eligible then the student is informed and use case ends.

  • System displays list of available seminars.

  • Student chooses a seminar or decides not to enroll at all.

  • System validates the student is eligible to enroll in the chosen seminar.  If not eligible, the student is asked to choose another.

  • System validates the seminar fits into the student's schedule.

  • System calculates and displays fees.

  • Student verifies the cost and either indicates she wants to enroll or not.

  • System enrolls the student in the seminar and bills her for it.

  • The system prints enrollment receipt.

 

Figure 3. A detailed use case.

Name: Enroll in Seminar

Identifier: UC 17

Description:

Enroll an existing student in a seminar for which she is eligible.

 

Preconditions:

The Student is registered at the University.

 

Postconditions:

The Student will be enrolled in the course she wants if she is eligible and room is available.

 

Basic Course of Action:

1. The use case begins when a student wants to enroll in a seminar.

2. The student inputs her name and student number into the system via UI23 Security Login Screen.

3. The system verifies the student is eligible to enroll in seminars at the university according to business rule BR129 Determine Eligibility to Enroll. [Alt Course A]

4. The system displays UI32 Seminar Selection Screen, which indicates the list of available seminars.

5. The student indicates the seminar in which she wants to enroll. [Alt Course B: The Student Decides Not to Enroll]

6. The system validates the student is eligible to enroll in the seminar according to the business rule BR130 Determine Student Eligibility to Enroll in a Seminar. [Alt Course C]

7. The system validates the seminar fits into the existing schedule of the student according to the business rule BR143 Validate Student Seminar Schedule.

8. The system calculates the fees for the seminar based on the fee published in the course catalog, applicable student fees, and applicable taxes. Apply business rules BR180 Calculate Student Fees and BR45 Calculate Taxes for Seminar.

9. The system displays the fees via UI33 Display Seminar Fees Screen.

10. The system asks the student if she still wants to enroll in the seminar.

11. The student indicates she wants to enroll in the seminar.

12. The system enrolls the student in the seminar.

13. The system informs the student the enrollment was successful via UI88 Seminar Enrollment Summary Screen.

14. The system bills the student for the seminar, according to business rule BR100 Bill Student for Seminar.

15. The system asks the student if she wants a printed statement of the enrollment.

16. The student indicates she wants a printed statement.

17. The system prints the enrollment statement UI89 Enrollment Summary Report.

18. The use case ends when the student takes the printed statement.

 

Alternate Course A: The Student is Not Eligible to Enroll in Seminars.

A.3. The registrar determines the student is not eligible to enroll in seminars.

A.4. The registrar informs the student he is not eligible to enroll.

A.5. The use case ends.

 

Alternate Course B: The Student Decides Not to Enroll in an Available Seminar

B.5. The student views the list of seminars and does not see one in which he wants to enroll.

B.6. The use case ends.

 

Alternate Course C: The Student Does Not Have the Prerequisites

C.6. The registrar determines the student is not eligible to enroll in the seminar he chose.

C.7. The registrar informs the student he does not have the prerequisites.

C.8. The registrar informs the student of the prerequisites he needs.

C.9. The use case continues at Step 4 in the basic course of action.

 

 

1.2 Iteration Modeling

Iteration modeling is part of your overall iteration planning efforts performed at the beginning of an iteration.  You often have to explore requirements to a slightly more detailed level than you did initially, modeling just enough so as to plan the work required to fulfill a given requirement.

 

1.3 Model Storming

Detailed requirements are elicited, or perhaps a better way to think of it is that the high-level requirements are analyzed, on a just-in-time basis.  When a developer has a new requirement to implement, perhaps the "Enroll in Seminar" use case of Figure 2, they ask themselves if they understand what is being asked for.  In this case it's not so clear exactly what the stakeholders want; for example, we don't have any indication of what the screens should look like.  They also ask themselves if the requirement is small enough to implement in less than a day or two, and if not then they reorganize it into a collection of smaller parts which they tackle one at a time.  Smaller things are easier to implement than larger things.

To discover the details behind the requirement, the developer (or developers, on project teams which take a pair programming approach) asks their stakeholder(s) to explain what they mean.  This is often done by sketching on paper or a whiteboard with the stakeholders.  These model storming sessions are typically impromptu events and typically last for five to ten minutes (it's rare to model storm for more than thirty minutes because the requirements chunks are so small).  The people get together, gather around a shared modeling tool (e.g. the whiteboard), explore the issue until they're satisfied that they understand it, and then they continue on (often coding).  Extreme programmers (XPers) would call these requirements model storming sessions "customer Q&A sessions".

In the example of identifying what a screen would look like, together with your stakeholder(s) you sketch what they want the screen to look like, drawing several examples until you come to a common understanding of what needs to be built.  Sketches such as this are inclusive models because you're using simple tools and modeling techniques, thus enabling the Agile Modeling (AM) practice of Active Stakeholder Participation.  The best people to model requirements are stakeholders because they're the ones who are the domain experts, not you.

For a detailed example of how to go about requirements modeling, read the article Agile Requirements Modeling Example.

 

1.4 Acceptance Test Driven Development (ATDD)

Test-driven development (TDD) (Beck 2003; Astels 2003) is an evolutionary approach to development which requires significant discipline and skill (and good tooling).  The first step is to quickly add a test, basically just enough code to fail.  Next you run your tests, often the complete test suite although for the sake of speed you may decide to run only a subset, to ensure that the new test does in fact fail.  You then update your functional code to make it pass the new test.  The fourth step is to run your tests again.  If they fail you need to update your functional code and retest.  Once the tests pass the next step is to start over (you may first need to refactor any duplication out of your design as needed).  As Figure 4 depicts, there are two levels of TDD:

  1. Acceptance TDD (ATDD).  With ATDD you write a single acceptance test, or behavioral specification depending on your preferred terminology, and then just enough production functionality/code to fulfill that test.  The goal of ATDD is to specify detailed, executable requirements for your solution on a just in time (JIT) basis. ATDD is also called Behavior Driven Development (BDD).

  2. Developer TDD. With developer TDD you write a single developer test, sometimes inaccurately referred to as a unit test, and then just enough production code to fulfill that test.  The goal of developer TDD is to specify a detailed, executable design for your solution on a JIT basis.  Developer TDD is often simply called TDD.
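A single red-green cycle at the developer TDD level can be sketched as follows, using Python's unittest module (a member of the xUnit family). The fee calculation and its 7% tax rate are hypothetical stand-ins for business rules, not details taken from the example use case.

```python
import unittest


# Step 1: write just enough test code to fail.  When this test was
# first written, calculate_fees() did not yet exist, so running the
# suite failed -- confirming the test actually tests something.
class CalculateFeesTest(unittest.TestCase):
    def test_fees_include_tax(self):
        # Hypothetical business rule: a 7% tax on the published fee.
        self.assertAlmostEqual(calculate_fees(100.0, tax_rate=0.07), 107.0)


# Step 3: write just enough production code to make the test pass.
def calculate_fees(base_fee, tax_rate):
    return base_fee * (1 + tax_rate)

# Steps 2 and 4: run the suite before and after the production change,
# e.g.  python -m unittest this_module
```

The point is the rhythm, not the arithmetic: each new behavior enters the codebase through a failing test.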

Figure 4. How ATDD and developer TDD fit together.


With ATDD you are not required to also take a developer TDD approach to implementing the production code, although the vast majority of teams doing ATDD also do developer TDD. As you see in Figure 4, when you combine ATDD and developer TDD the creation of a single acceptance test in turn requires you to iterate several times through the write a test, write production code, get it working cycle at the developer TDD level. Clearly, to make TDD work you need to have one or more testing frameworks available to you.  For acceptance TDD people will use tools such as Fitnesse or RSpec, and for developer TDD agile software developers often use the xUnit family of open source tools, such as JUnit or VBUnit.  Commercial testing tools are also viable options.  Without such tools TDD is virtually impossible. 

There are several important benefits to ATDD.  First, the tests not only validate your work at a confirmatory level, they are also in effect executable specifications that are written in a just-in-time (JIT) manner.  By changing the order in which you work, your tests in effect do double duty.  Second, traceability from detailed requirements to tests is automatic because the acceptance tests are your detailed requirements (thereby reducing your traceability maintenance efforts, if you need to do such a thing).  Third, this works for all types of requirements -- although much of the discussion about ATDD in the agile community focuses on writing tests for user stories, the fact is that this works for use cases, usage scenarios, business rules, and many other types of requirements modeling artifacts.
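As a sketch of what "acceptance tests as executable specifications" can look like, here is a plain-Python version for the "Enroll in Seminar" behavior. The enroll_student() function is a hypothetical stand-in for the real system under test, which a tool such as Fitnesse or RSpec would normally drive; only the test functions are the point.

```python
# Hypothetical stand-in for the real system under test.
def enroll_student(eligible, seminar_has_room):
    if not eligible:
        return "rejected: not eligible"
    if not seminar_has_room:
        return "rejected: seminar full"
    return "enrolled"


# Each test below is a detailed requirement captured as an
# executable specification, written just in time for the developer
# who implements the behavior.
def test_eligible_student_is_enrolled():
    assert enroll_student(eligible=True, seminar_has_room=True) == "enrolled"


def test_ineligible_student_is_rejected():
    assert enroll_student(eligible=False, seminar_has_room=True) == "rejected: not eligible"


def test_full_seminar_rejects_enrollment():
    assert enroll_student(eligible=True, seminar_has_room=False) == "rejected: seminar full"


test_eligible_student_is_enrolled()
test_ineligible_student_is_rejected()
test_full_seminar_rejects_enrollment()
```

Because the tests are the detailed requirements, there is nothing separate to trace back to: a failing test is a requirement the system does not yet meet.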

The greatest challenge with adopting ATDD is lack of skills amongst existing requirements practitioners, yet another reason to promote generalizing specialists within your organization over narrowly focused specialists.

2. Where Do Requirements Come From?

Your project stakeholders (direct or indirect users, managers, senior managers, operations staff members, support/help desk staff members, testers, developers working on other systems that integrate or interact with yours, and maintenance professionals) are the only OFFICIAL source of requirements (yes, developers can SUGGEST requirements, but stakeholders need to adopt the suggestions).  In fact, it is the responsibility of project stakeholders to provide, clarify, specify, and prioritize requirements.  Furthermore, it is the right of project stakeholders that developers invest the time to identify and understand those requirements.  This concept is critical to your success as an agile modeler: it is the role of project stakeholders to provide requirements, and it is the role of developers to understand and implement them.

Does that imply that you sit there in a stupor waiting for your project stakeholders to
