Marrying McClure and McKinsey

On arriving at an online measurement framework for capturing and supporting consumers in their decision journey

The classic marketing funnel is dead. McKinsey & Company, among others, has written on its demise in the new reality of fragmented channels and empowered consumers, and the consultancy’s alternative Consumer Decision Journey (CDJ) framework describes a reality in which a consumer’s consideration set and information sources broaden rather than narrow in the leadup to purchase. It also describes post-purchase behavior and opportunity. It’s stuff we intuit, know, and live in our digital, social reality. To the degree that brand and ecommerce sites contribute authentically and appropriately along the consumer journey, they remain relevant resources. And to the extent that product meets expectation, loyalty and recommendation may follow.

The CDJ breaks out into the following phases, which play out asynchronously over time and across channels:

Figure 1 Consumer Decision Journey (CDJ) illustration, courtesy of Mutual Mind

The CDJ framework is typically positioned as an input to content development and placement. Inversely, however, it also describes a lifecycle whose stages we should listen and optimize for with our online properties. Consumers may come to us at any point in the journey, so we need a way to baseline and measure our effectiveness in supporting them at each stage. Tweaking Dave McClure’s famous pirate (AARRR) framework for terminology and, more significantly, to position purchase in the middle of the lifecycle gives us the following depiction of the journey stages at which users may arrive on our site, or to which they may progress during their visits.

Figure 2 Pirated CDJ: Marrying McClure and McKinsey. An AEPLE (“apple”) framework for online capture and support of consumers in the decision journey.

Yes, if we’re ecom, our ultimate success metric is revenue. But core to the premise of the journey is the acknowledgement that purchase happens over time, and that there are waypoints we can check off along the way. Taking the stages above as these waypoints, the zoomed-in view below provides hypothetical sets of micro conversions that can be tracked and used to record our efficacy in supporting each stage.

Figure 3 2-Dimensional scoring for site support of CDJ stages (renamed/consolidated per AEPLE)
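To make the per-stage scoring concrete, here is a minimal sketch of how tracked events might map to stage-level conversion rates. The stage names follow AEPLE, but the specific micro-conversion event names and their stage assignments are illustrative assumptions, not the scorecard from the figure.

```python
# Hypothetical micro-conversion events per AEPLE stage.
# Event names and stage assignments are illustrative assumptions.
MICRO_CONVERSIONS = {
    "awareness":   {"landing_view", "campaign_click"},
    "exploration": {"product_view", "site_search", "compare"},
    "purchase":    {"add_to_cart", "checkout_complete"},
    "loyalty":     {"account_login", "reorder"},
    "evangelism":  {"review_posted", "share_click"},
}

def stage_conversion_rates(sessions):
    """Fraction of sessions with at least one micro conversion per stage."""
    rates = {}
    for stage, events in MICRO_CONVERSIONS.items():
        hits = sum(1 for session in sessions if events & set(session))
        rates[stage] = hits / len(sessions)
    return rates

# Example period: three sessions, each a list of tracked events.
sessions = [
    ["landing_view", "product_view", "add_to_cart"],
    ["campaign_click", "site_search"],
    ["landing_view"],
]
rates = stage_conversion_rates(sessions)
# awareness converts in 3 of 3 sessions, exploration in 2 of 3,
# purchase in 1 of 3, loyalty and evangelism in none.
```

The point of the structure is that each stage scores independently: a period can look healthy on Exploration while flat on Evangelism, which is exactly the two-dimensional visibility the scorecard is after.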

An alternative, pivoted view provides visibility into the sessions comprising each category’s micro conversions, and rolls up into an indicator of average session depth. In other words, the story we’re telling ourselves is that users come and go, and don’t convert through the entire lifecycle in a single site visit. The rollup describes the extent to which this is true. If no visit in a given period converts within the Exploration, Purchase, Loyalty, or Evangelism categories, the rolled-up score is 0.2 (one of five stages credited); if every visit converts within each, it’s 1. In the wild, we’ll land between the two scores, and all but certainly closer to the first.

Figure 4 Pivoted view: average session 'depth'
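The rollup arithmetic can be sketched in a few lines. Assuming, as the 0.2 floor implies, that the first stage is always credited by the visit itself, a session’s depth is the fraction of the five AEPLE stages it converts in, and the period score is the average across sessions. Stage names and the example sessions are illustrative.

```python
# Five AEPLE stages; the first is credited by arrival itself.
STAGES = ["awareness", "exploration", "purchase", "loyalty", "evangelism"]

def session_depth(converted_stages):
    """Score one session as the fraction of stages credited.

    A bounce with no further micro conversions still credits the
    first stage, so its depth is 1/5 = 0.2 -- the floor described
    in the rollup.
    """
    credited = {STAGES[0]} | (set(converted_stages) & set(STAGES))
    return len(credited) / len(STAGES)

def average_depth(sessions):
    """Roll up session depths into the period's average."""
    return sum(session_depth(s) for s in sessions) / len(sessions)

# Example period: three sessions.
sessions = [
    [],                           # bounce: arrival only -> 0.2
    ["exploration"],              # browsed products     -> 0.4
    ["exploration", "purchase"],  # bought something     -> 0.6
]
print(round(average_depth(sessions), 2))  # -> 0.4
```

A period where every session converts in all five stages averages to 1, and one made entirely of bounces averages to 0.2, matching the bounds described above.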

George Miller's seminal 1956 article, "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information," posited a very modest threshold for our common cognitive ability. In the context of dashboards and scorecards, this means we’re good at processing informational overviews as long as they remain at the overview level. Borrowing the phased approach of McClure’s pirate framework and assigning distinct scores to the stages of the consumer journey lets us, in effect, play along two dimensions of metrics. It leaves us cognitively intact, and it describes our success in terms of the consumer’s place in the lifecycle rather than in system-oriented metrics. Perhaps in a sign that we're moving in the right direction, it represents distillation amid the fragmentation.
