Building Product: Avoiding the Data-Driven Trap

I spent part of the weekend with some friends who are in the early stages of a new startup. While we don’t always talk shop, it’s inevitable that someone will open up about their latest challenges. The conversation we had ended up being interesting enough that I wanted to cover the topic with my readers.

The discussion started with a simple statement from a founder and chief marketing officer who is looking to disrupt the healthcare payment space. She made the statement "our approach to product is 100% data-driven". She's a fantastic marketer and has good instincts, but I couldn't help myself; I had to ask what that means for their development process. You see, the term 'data-driven' has almost become a religion for many people in the tech industry, with websites promoting data-driven practices that often bemoan 'mere gut instinct'. These days it seems like questioning data-driven processes can almost lead to a Galilean tribunal. To my friend, data-driven meant collecting market data about needs, delving into that data through analytics and statistical processes, and then building product based on the data at hand. I pushed back on her approach. I've always preferred an 'innovation-driven, data-validated' approach to building new products, and I think it's paid off.

Of course, any decision (including product design) should be based on the facts at hand. Here at Mersive, we've evolved to be very good at taking in facts and determining actions based on those facts (here's a cool read to get you started if you're interested in decision making with evidence). It's not an accident that AI is making great progress in this area, because decision making with data is low-hanging fruit compared to other forms of cognition. Humans are also very good at other things: generalization, inference, and creatively combining facts to create completely new constructs. Losing sight of these approaches to product design in favor of a completely data-driven process can lead to incrementalism and ultimately a mundane product that might be well supported by data but ends up leaving your customers unimpressed or even disappointed.

Here are three things that can happen if you fall into the data-driven trap:

  1. Overfitting. This term comes from mathematical regression and curve fitting. Imagine that you have a set of noisy data points sampled from a 2D curve, x^2, and you're asked to infer the underlying curve that generated that set of points. No matter how noisy the points are, I could always draw a curve that passes through all of them – a perfect fit. The problem is that in doing so I can no longer recognize that the data was generated by x^2. I've lost sight of the underlying truth by over-analyzing the data. Now imagine designing a product based on user feedback data. I've seen data-driven discussions that fail to see the underlying problem users want solved and instead focus on solving each of the data points independently – ouch! This usually leads to lots of corner cases, feature dials, and complexity in the product that absolutely hits the market data, but ultimately fails.
  2. Following and Not Leading a Market. Starting from a data perspective (what 'data-driven' means) often encourages product development to first find what customers want (collect data) and then build a feature based on that data. This approach views product development as an engineering extension of market requests. I view product development (and a product road map) as a continuous conversation around customer problems, not features. Mining data produced by market analysts and even customer interviews can often lead to a focus on following and not leading. If a huge potential customer says you should build 'x', it's hard to ignore that data, especially for startups who are hungry for revenue. I've seen this approach before – starting from data, typically focused on user requests and not underlying patterns – and it leads to products that are always behind. For new products in exciting markets, this can mean slow failure (Zune player, anyone?).
  3. Looking in the Wrong Place. Data collection can be expensive – tagging, A/B tests, click rates, and analytics all take time to craft and collect. So it's natural that data-driven decision makers focus their efforts somewhere. The challenge is that in deciding what data to collect and then analyze, an important constraint around your product's future has just been established. I've also seen data-driven exercises focus on areas where data collection is feasible and rich, rather than where the important questions actually live.
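The overfitting trap in point 1 is easy to see in a few lines of code. This is a minimal, hypothetical sketch (the data and degrees are made up for illustration): we take noisy samples of y = x^2, fit both a simple quadratic and a high-degree polynomial that passes almost exactly through every point, and then check each fit against the true curve.

```python
# Sketch of the overfitting trap: noisy samples of the "underlying truth"
# y = x^2, fit two ways. Illustrative values only.
import numpy as np

rng = np.random.default_rng(0)

# 15 noisy observations of y = x^2
x = np.linspace(-3, 3, 15)
y = x**2 + rng.normal(scale=1.0, size=x.size)

# Degree-2 fit recovers something close to the true curve;
# degree-14 threads (nearly) through every point -- a "perfect" fit.
simple = np.polynomial.polynomial.Polynomial.fit(x, y, deg=2)
overfit = np.polynomial.polynomial.Polynomial.fit(x, y, deg=14)

# Judge both fits against the underlying truth on a dense grid
x_new = np.linspace(-3, 3, 200)
y_true = x_new**2
err_simple = np.mean((simple(x_new) - y_true) ** 2)
err_overfit = np.mean((overfit(x_new) - y_true) ** 2)

print(f"degree-2 error vs. truth:  {err_simple:.2f}")
print(f"degree-14 error vs. truth: {err_overfit:.2f}")
```

The high-degree fit scores perfectly on the points it was given and badly everywhere else – exactly the product failure mode above: solving each data point independently while losing the underlying problem.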

—–

This reminds me of a joke I heard in grad school…

A student walks out of a well-known Cambridge pub and is surprised to see her mathematics professor searching the ground around a lamppost in Inman Square. The student asks, "What are you looking for?"

“My keys, I’m sure I dropped them as I left the pub,” the professor responds.

“But the bar is back that way,” she says, pointing into the distance.

“You’re right, but the light is best here.”

—–

An innovation-led process is actually similar to some of the data-driven approaches I've seen, but with a key difference. It focuses on a few simple principles:

  • Direct and ongoing customer dialogues that are focused on framing problems to be solved, and not on specific feature requests. Ultimately this should create a deep understanding of the underlying issues that lead to customers asking for features.
  • A creative phase that challenges the product designers to then provide an ideal solution. This step is really important, and demands a creative team whose explicit goal is to generate insights around the problem. Don’t be afraid of the word ‘creative’ here – after all, building new products is about creating new things.
  • Data validation then takes place around the innovative solution. This principle relies on asking the right questions. For example, what are the assumptions that must be tested given the proposed approach? Are there barriers to adoption? If the data validates the approach – congratulations! You've created something new that broadly solves a deep need and has a chance to surprise and delight your customers. If not, use what you learned from this process to continue the dialogue with your customers.

This process is how we developed a solution to digital annotation for meetings. Annotation systems have been well established in the market since the late '90s, but they just haven't met the daily needs of users. I hate to think about what this feature would have ended up being if we had followed a data-driven rather than innovation-led process.

This blog was reprinted with permission from Mersive and originally appeared here.

About Christopher Jaynes

Christopher Jaynes is Chief Technology Officer for Mersive, a company he founded in 2004. Mersive’s visual computing software enables large enterprises, display manufacturers and resellers to create large-scale, beyond-HD displays that deliver unprecedented performance, simplicity and affordability. Prior to Mersive, Jaynes founded the Metaverse Lab at the University of Kentucky, recognized as one of the leading laboratories for computer vision and interactive media and dedicated to research related to video surveillance, human-computer interaction and display technologies. Jaynes received his doctoral degree at the University of Massachusetts, Amherst where he worked on camera calibration and aerial image interpretation technologies that were then used by the federal government. He received his BS degree with honors from the School of Computer Science at the University of Utah.