12 Comments
Robert Brown:

“First, enumerate decisions…”

I’d suggest this should be the second step. The first step should be to identify what the problem or opportunity is and why it matters. Over the course of my consulting career, I’ve noticed that practically no one takes this initial step to define the problem clearly and identify the values and objectives the organization wants to satisfy by addressing it (there may be multiple hierarchically related objectives, and some of them might be in opposition to others). A really good visual tool for mapping this is the objectives hierarchy (see Ralph Keeney’s book Value-focused Thinking). The end result of this step provides clarity about purpose and scope, surfaces potentially conflicting goals and objectives from siloed groups whose initial framing of the problem is too narrow or too privileged, and generates potential decision alternatives that can be combined into decision strategies that lead into the next step.

MC:

Yes! I shouldn't assume that teams are actually solving for the correct problem, so it makes total sense to have that as the first step.

I often talk about the necessity of spending time upfront to clarify the real problem to fix!

Robert Brown:

The old adage is that a problem well defined is a problem half solved. I think it’s closer to 80%. ☺️

Beth:

Right on, but maybe one additional nuance: GOOD consultants can not only figure out what the data means and how it helps their client, they should also be incredibly adept at (quickly) communicating it. We’ve all been in the room when an amazingly important strategy is lost in the delivery.

Robert Brown:

“The pursuit of perfect information in corporate life is the most expensive hobby.”

This is the thing that is often more expensive than solving an irrelevant problem.

The cool thing is that the value of information (VoI) can actually be calculated, so decision teams know when to stop allocating resources to the pursuit of better information. Put another way: VoI tells you the upper bound on how much to spend to obtain just enough information about a critical uncertainty to distinguish one decision pathway from another.
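
To make that concrete, here is a minimal sketch of the simplest version of the calculation, the expected value of perfect information (EVPI). It is written in Python with invented payoffs and probabilities, purely to illustrate the mechanics rather than to reproduce any published example: compare the best you can do acting on current information with the best you could do if the critical uncertainty were resolved, and the difference is the most any study of that uncertainty can be worth.

```python
# Minimal EVPI sketch. The payoffs and state probabilities below are
# invented for illustration only.
import numpy as np

# Rows: decision alternatives; columns: states of the critical uncertainty.
payoffs = np.array([
    [120.0, -40.0],   # e.g., a hypothetical "launch" alternative
    [ 10.0,  10.0],   # e.g., a hypothetical "hold off" alternative
])
p_states = np.array([0.6, 0.4])   # prior probabilities of the states

# Best expected payoff acting on current information.
ev_without_info = (payoffs @ p_states).max()

# Expected payoff if you could learn the state before deciding:
# pick the best alternative in each state, then average over states.
ev_with_perfect_info = (payoffs.max(axis=0) * p_states).sum()

# EVPI = upper bound on what any study of this uncertainty is worth.
evpi = ev_with_perfect_info - ev_without_info
print(f"EV acting now: {ev_without_info:.1f}")                      # 56.0
print(f"EV with perfect information: {ev_with_perfect_info:.1f}")   # 76.0
print(f"Upper bound on information spend: {evpi:.1f}")              # 20.0
```

With these made-up numbers, acting on the prior is worth 56 and perfect information would be worth 76, so no study aimed at this uncertainty can rationally cost more than 20.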

MC:

Any reference on possible ways to quantify VoI? The approach is sensible; I haven't seen it applied myself, but I'm keen to learn more about it.

Robert Brown:

The best and deepest presentation and discussion of this is in “Foundations of Decision Analysis” by Ronald Howard and Ali Abbas.

McNamee and Celona discuss it in “Decision Analysis for the Professional” (pp. 58–62) as a decision-tree exercise. (You can get this book for free at https://smartorg.com/wp-content/uploads/2021/01/Decision-Analysis-for-the-Professional.pdf) I’m pretty sure they studied under Howard at Stanford.

In “Business Case Analysis with R,” I show how to convert the VoI decision tree problem into a matrix calculation. This article provides the setup (https://medium.com/@robertdbrowniii/business-case-analysis-with-r-9d9ba3d00c15). The actual numerical approach is described in section IV of the book.
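
For readers who want a feel for that matrix formulation before opening the book, here is a rough sketch in Python/NumPy rather than R, with an invented study reliability and the same made-up payoffs as in the sketch above (so this is not the book's or article's worked example). "Flipping" the decision tree for imperfect information is just Bayes' rule expressed as matrix algebra, and the value of the study falls out as the difference between acting with and without its signal.

```python
# Sketch of an imperfect-information VoI calculation as matrix algebra.
# All numbers are invented; this is not the example from the book or article.
import numpy as np

payoffs = np.array([
    [120.0, -40.0],   # alternative A payoffs by state
    [ 10.0,  10.0],   # alternative B payoffs by state
])
prior = np.array([0.6, 0.4])       # P(state)

# Reliability of a hypothetical study: rows = signals
# ("favorable", "unfavorable"), columns = states; entries are P(signal | state).
likelihood = np.array([
    [0.8, 0.3],
    [0.2, 0.7],
])

# "Flipping the tree" is Bayes' rule done with matrices.
p_signal = likelihood @ prior                           # P(signal)
posterior = (likelihood * prior) / p_signal[:, None]    # P(state | signal)

# With the study: best alternative under each signal's posterior,
# weighted by how likely that signal is.
ev_with_study = ((payoffs @ posterior.T).max(axis=0) * p_signal).sum()

# Without the study: best alternative under the prior.
ev_without = (payoffs @ prior).max()

print(f"Value of the imperfect study: {ev_with_study - ev_without:.1f}")  # 0.8
```

Because the hypothetical study is imperfect, its value (about 0.8 here) comes in well below the perfect-information bound of 20 from the earlier sketch, which is exactly the comparison that tells a team whether a proposed study is worth its price.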

Robert Brown:

“That framing feels pragmatic but can quickly become expensive. You end up bending the question to the data rather than shaping the data to the question.”

Bending the question to the data is exactly why ~85% of data science projects fail or remain science-fair projects with little business relevance. There are few things more expensive in the business world than solving an irrelevant problem.

MC:

Fully agree. Is the ~85% based on your experience or some research you found? It "feels" a bit high, but of course directionally correct!

Robert Brown:

Numerous studies reach similar conclusions, such as the Chaos Report and Gartner's research. Gartner originally put the failure rate at ~60%, but later updated it to ~85%.

Kenny Fraser:

The trouble is that the question always changes, and you need to be able to reconfigure or re-sort the data to answer each new question. That's the skill, and for me the big question is how well AI can extract patterns from data that are relevant in specific business contexts.

As for sources, I always go back to TS Eliot: "Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?"
