Oct 31, 2014
 

Every historical era has its lessons, such as “Don’t trust totalitarian dictators to respect diplomatic niceties,” “Avoid land wars in Asia,” and “You know what’s going to happen to Sean Bean in this movie.” One of the lessons of the last decade is certainly “Information is not intelligence.” Unfortunately, many people who do software requirements, or depend on them to build and test software, have not seen the relevance of that maxim in their own work.

Requirements in software development serve much the same purpose as intelligence in national security: they are supposed to provide actionable, reliable insights. “Actionable” is largely a question of format, which software professionals can control directly. Older questions like, “What is the Jesuitical distinction between a requirement and a specification?” and newer questions like, “What kind of words do we need to supplement the pictures that we’ve created in this wireframe?” have the same purpose: make sure that the developer, tester, or some other species of technologist understands what action to take, based on the information provided. In a similar fashion, the US President’s daily intelligence briefing follows a format that its intended audience finds useful.

The reliability of the information is not under the complete control of software professionals. In fact, we should always assume that the information we have is, to some degree, unreliable. We can reduce the unreliability, but we will never reach 100% certainty. People in the intelligence profession deal with this problem in a variety of ways. Here are a few examples:

  • Assume that first reports are wrong.
  • Don’t trust any single source of information.
  • Continue challenging the reliability of information, instead of treating it as established fact.

How do software professionals treat the problem of information unreliability? All too frequently, by acting as though it doesn’t exist. If you doubt that conclusion, ask yourself how often the people responsible for requirements in your organization ask the following questions. (For each question, I’ve provided a reason why it’s important to ask.)

  • Am I talking to the right person? The poor schlub telling you what the sales team needs in the new CRM system might be a new employee without the background to really know what all the line items in the wish-list spreadsheet mean.
  • Am I asking the right question? Leading questions like, “Do you like the new user experience?” elicit misleading answers.
  • Does anyone know the answer to this question? “Will the sales and marketing teams like the new CRM system’s UX?” is about as useful as asking, “If Superman and the Hulk were to fight, who would win?” Until people have an opportunity to use the software, there’s no definitive answer that anyone can provide.
  • Is this person being honest with me? Until you’ve established genuine trust between customers and technologists, you have to treat requirements as a negotiation.
  • How out of date is this information? Requirements are a snapshot of how technology can provide business value. Both the business and the technology will change.
  • Are we understanding this information in context? Prioritization is the devil’s playground of misinterpretation. Is this set of functionality “critical” to the business because of an urgent need, such as responding to a competitor? Or is it truly important in a more strategic sense, such as a way to increase customer retention? The development team’s response may look very different depending on the answer.

Just as important as the frequency with which you ask these questions is the person who asks them. Crafting good requirements takes skill, experience, maturity, and cleverness. Not everyone is equally well-suited to create them, either at the individual level (usually the person with the job title of product manager or business analyst) or at the team level (for instance, a newly Agile team that has never created a backlog before). An important rite of passage is the transition from gullibility (“The VP of sales said what she wanted, so what else do I need to know?”) to skepticism (“I know from experience that, a month from now, if I were to ask the VP of sales what she wants, I’d get a very different list”).

It should also be clear how ineffective some attempts to circumvent these challenges can be. For example, when faced with a slow, unreliable requirements process, simply putting the customers and technologists in the same room won’t necessarily improve matters.

Although I’ve downplayed the question of format, it does matter, to the extent that it increases or decreases reliability. As Alistair Cockburn pointed out, the reliability of communication depends, to a great degree, on the channel. The weight of the format also plays an important role: the heavier the format, the less likely you are to treat requirements as a set of hypotheses that demand regular testing.

At this point in the presentation, I often see people nodding. However, I worry that they don’t go back to their organizations and demand the necessary changes. You may not have a staff of people capable of doing this work, just as not everyone can be a good spy. You may not give people the opportunity to ask the right questions, and you may penalize people for providing uncomfortable answers. You may rely too much on single-sourced intelligence when you should be cultivating new sources. (Analytics from working software is a personal favorite.) We know what happens when civilian and military decision-makers misuse intelligence. Why would you be immune to the same results?

Tom Grant

Tom Grant is the former Practice Director of Cutter Consortium's Agile Product Management & Software Engineering Excellence practice. His expertise lies in software development and delivery, with a particular focus on Agile, Lean, application lifecycle management (ALM), product management, serious games, collaboration, innovation, and requirements.
