Overplanned Analytics Initiatives Are Doomed to Fail

Benn Stancil, Co-founder & Chief Analytics Officer

January 14, 2019 · Updated on July 8, 2022

How do you go from being a company that has data to a company that truly uses its data?

Ship value with incremental analysis. Scrap the 9-month plan and commit to shorter delivery cycles.

As much as we hope for everyone's analytics initiatives to succeed, some fail. And when they do, they fail for all kinds of reasons. The company might change its strategic direction, the analytics initiative might be underfunded, or the initiative's owner might leave the organization. But these challenges aren't unique to companies that struggle to get their programs off the ground—the road to changing the fundamental decision-making culture of an organization always has twists and turns. So what separates the ones that navigate them successfully from those that don't?

From all the conversations we've had, one signal has emerged as the clearest indicator of likely success: the analytics team is agile and constantly delivers incremental progress.

A Tale of Two Teams

This idea is best explained through an example. We were recently in conversations with two separate major media companies, both of which wanted to rebuild their analytics departments from the ground up.

Company A—The "grand plan" approach

One company set aside six months for strategic planning, ETL infrastructure buildout, and software procurement. At the end of that cycle, they'd begin building carefully spec'ed dashboards. They hoped to unveil a whole new analytics platform in about nine months.

This company started from the same position as many others seeking to build an analytics program from scratch: full of ideas and not sure which would be most valuable for their business. To control for this uncertainty, they tried to build an end-to-end plan for the entire project.

They weren't sure exactly what questions they'd need to answer nine months down the road, so they tried to anticipate the right ones. They weren't sure what data they'd need, so they built out a precise plan for tracking everything they thought they might want. They didn't have any company-wide metrics, so they held a whole series of meetings to figure out exactly what their ideal dashboards would show.

During the planning process, the company got more and more excited about the bold vision. Decisions were being made. Stuff was getting done. The analytics team was in control.

Unfortunately, grand plans like this rarely survive contact with reality. Technical challenges force teams to scrap their timelines or even entire projects. New products, new marketing campaigns, and new business goals require unforeseen metrics. Strategies have to be rewritten, setting projects back. Other disruptions, like departures of key stakeholders or executives turning their focus to other initiatives, can be fatal.

Predictably, this first company's initiative failed. Much to the disappointment of the people who were anticipating the promised benefits, the project was abandoned after months of wasted effort.

Company B—An agile, adaptable approach

Around the same time, I was having a nearly identical conversation with another large media organization. They too were working on launching an analytics program. They too felt a great deal of uncertainty around how best to create something valuable. But this company approached that uncertainty with a very different attitude.

This second team convinced company leadership to give them a few weeks of runway. In that timeframe, they promised to deliver something small. From there, they'd assess the impact of their work, and re-evaluate their priorities. They'd find the next most important problem they could address with analytics, and tackle that.

They believed that ambiguity could serve as a guide rather than a roadblock:

  • Don't have metrics yet? Think about what today's biggest challenges are, and build metrics that help address them.

  • Don't know what technology to use? Start with data infrastructure that can provide value quickly, and with low setup costs. Make better tooling decisions in the future with the experience earned from using that technology.

They had no master plan for rallying the company, but created something far more valuable: consistent, concrete progress. Each new dashboard shipped, each new question answered, created demand for more. If they shipped something that turned out not to be useful, they were able to regroup and determine why.

This agile, incremental approach proved far more effective and durable than the first team's heavily planned approach. The second company knew that change is inevitable. A rigid system will break under that stress. Instead, they designed a system that responds to turbulence the same way the wings on a plane respond: by safely flexing and bouncing through it.

How to keep analytics agile (or airborne)

To keep your analytics projects moving forward, analysts can take a cue from software developers. Modern software teams have overwhelmingly rejected monolithic waterfall models of development in favor of more flexible agile methods. These principles guided the successful team's approach and inspired several concrete tips for avoiding the fate of the failed team.

1. Never go two weeks without shipping

When working on big analytics projects, especially when building out an analytics infrastructure for the first time, it's easy to get lost in heads-down work. To avoid this trap, make sure you're shipping something to your customers (i.e., people who aren't on the analytics team) at least every two weeks. This not only forces you to deliver value in small iterations, but it also shows constant progress to leaders who (like it or not) are likely wondering if this analytics investment is worth it.

2. Build prototypes

We constantly hear stories about analysts who worked hard to build a dashboard or a reporting tool only to find that it never got used once they shipped it. By building lightweight prototypes that your customers can use before building the final tool, you can get feedback on what works and what doesn't. You'll be surprised by how often your initial ideas change—and how often your customers' requests change—as soon as they see the real data in a dashboard.

3. Mentally stress test your work

Periodically think back over the last six months or so: What's changed about your company between then and now? How have priorities changed? What unexpected things happened? Now, think about what you're building. If the same amount of change and disruption happens over the next six months, will your work survive? Will it be easy to adapt and evolve? If not, consider approaches and tools that are more flexible, or make sure you aren't over-investing in a foundation that won't be useful six months later.

When starting an analytics program, a master plan is comforting. But the road to success often follows a less-predictable path.
