Over the past few months we’ve worked with several companies, helping them incorporate customer feedback – to improve existing products, to help products reach new markets, or to develop new products. Some of you may have been contacted by us for interviews, and we thank you ALL for your help and support.
As we move forward, we would like to invite you to join our larger research community. From time to time, we might reach out and ask you to participate in 1-on-1 interviews, online surveys and panels, or in-person focus groups. The topics could range from dating to magazine reading to buying cars or investing in the stock market.
If you’re interested – and we hope you are! – please sign up here http://goo.gl/c7ctyz and provide us with some basic information. If you fit the profile of someone who is right for an upcoming project, we’ll be in touch.
Farrah & Laila
TDE is working on a project to help create a new app specifically geared toward couples. We want to talk to couples of all stripes – young, old, dating, married, with kids, without kids, gay, straight – to get insight on how to make the experience better, more useful, and more fun.
Do you live in or near a major US metropolitan area?
Does one of you have a Gmail address?
Do both of you have smartphones (iPhone, Android, etc.)?
Are you both available to chat with us for 45 minutes at some point in the next two weeks?
If so, please send an email to email@example.com with the following information:
Names of both members of the couple
Your respective ages
Your zip code
We’ll then contact you to confirm some basic information, all of which will be kept in the strictest of confidence by The Difference Engine and will not be shared with our client.
If you are selected for an interview, each of you will receive a $50 gift card to thank you for your time once the interview is complete.
Thank you for your help!
This week I’ll be speaking at O’Reilly’s Strataconf in Santa Clara, both as part of the Data Driven Business Day on Tuesday, February 11 and as a keynote speaker on Wednesday, February 12. I’m excited to be part of the event, and thrilled to be working with Alistair Croll and his team as they put on what looks to be an event full of both big and useful ideas, and of some heavy-hitters when it comes to data and analytics.
So what’s a person like me, decidedly not a data scientist, doing there? What can I possibly have to tell a convening of many of the most data-oriented and data-savvy minds in technology and business?
Two things, really: numbers won’t count themselves, and people are data. Two lovely titles for talks, but what do they mean?
Those numbers won’t count themselves
I’m particularly excited about the Data Driven Business Day because I want to see how people define data-driven businesses. We are increasingly helping clients to ‘design for data’ - to anticipate and plan for better KPIs. We think whatever you choose to measure, it should help you make decisions.
I doubt very much that anyone really disagrees with that notion. But how we actually measure frequently tells another story.
Before starting The Difference Engine, we frequently encountered marketing clients who dealt in what the Lean Startup movement (specifically Eric Ries and Dave McClure) would call “vanity metrics” – metrics that make you look good, but don’t help you make decisions. These metrics might include a million likes on Facebook at one end of the spectrum, or “top 2 box on brand preference scores with competitor x” at the other.
We also frequently encountered brand managers who had very little data about their end users, their actual customers. Large consumer electronics brands often lack CRM systems that reach below the sales channel and so rely upon segmentation studies to guess about customers and prospects; networks and television shows mainly depend on Nielsen for viewer data but often don’t know who is really watching and why; B2B brands sometimes hold onto outdated beliefs about the motivations of small business owners or IT decision makers.
We used to believe this was simply because marketing is cut off from sales and operations, or because managers are tradition-bound to use tracking studies, segmentation, or attitudes & usage studies. There’s a lot that our clients do “because tradition”.
But as we work more with product managers, CEOs and even boards of directors, whether in established businesses or in startups, we find that the real tradition behind this dependency on 3rd party data sources is poor data design.
They don’t have direct data because they didn’t set themselves up to measure it. The consumer electronics company that does not sell direct to consumers can be (mostly) forgiven for not having a robust CRM system for analyzing end users, because it doesn’t directly transact with them (except over warranties or after-sales service).
But the digital marketing or product team rarely gets the same pass. Just the word “digital” seems to imply measurement – it’s numbers, after all. Everything can be counted now, so too often, digital marketers and product developers assume everything is being counted. But what’s worse is that they frequently aren’t sure what they should measure, how they should test, which customers they should care about, or what “conversion” even means for their business.
When you’re not sure what to count, you count likes. You count click-through rates. You count page views or time-on-site. You look at the numbers that are available. What gets counted is what counts, instead of the other way around.
Despite all the time marketers have spent measuring ad recall, brand likeability, brand preference, and intent to purchase, and despite all the effort spent haggling over reach, frequency, and impressions, these numbers may not stand for anything meaningful to customers or their behavior. And what’s irrelevant to customer behavior is – most likely – irrelevant to business results.
People are data, too.
Businesses take comfort in numbers. Numbers seem dispassionate; numbers seem steady. The fickleness and fecklessness of people are scrubbed out of data. You can trust the data. The data will tell you what to do.
But that’s not really true. For most people, binders full of tabulated data will never translate, Neo-peering-into-the-Matrix-like, into action or intent. A survey will not “tell you what to do”. We need people to analyze the data, to interpret it, and to make recommendations based upon it.
Even more importantly, we need people who know how to measure the right things, who know how to design for better data. And to do that, we need to understand the underlying system of a market, a product, or a customer segment. We need to not only know who people are by correlating their media habits with their brand preferences (a probably useful method of buying media); we need to know how people actually purchase products in the category, why they buy them, how they use them, and what else they use that we might see as ‘competition’. We need to know what people tend to do right before they cancel a service; we need to know what they tend to do right after they buy or sign up or register or like or mention.
In short, quant research should never be undertaken in isolation. It should depend on a sound qualitative understanding of the world. Much like the natural philosophers-cum-scientists of the 19th century and before, we need great hypotheses to do great experiments.
When it comes to product strategy, we like to think of this as empathy building. We’re used to businesses taking an “if we build it they will come” approach to product and marketing development. When customers don’t, in fact, flock to the brand or the product, those businesses turn to optimizing what they already do. This explains, at least in part, the rise of programmatic ad buying and analytics software, and the popularity of dashboards, user surveys, and endless A/B testing. But where we think these tools fall short is in developing deep understanding and insight, in fostering empathy with customers, and in turn, helping managers develop better ‘instincts’ and ‘intuition’.
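As a concrete aside: most of the A/B testing mentioned above boils down to a simple comparison of conversion rates. Here is a minimal sketch of that arithmetic – a plain-Python two-proportion z-test with made-up visitor numbers, not any particular tool’s API. It shows how little the math itself says about *why* customers behave as they do:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the two samples to estimate the standard error
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: 500 of 10,000 visitors converted on A,
# 560 of 10,000 on B
z = ab_test_z(500, 10_000, 560, 10_000)
print(round(z, 2))  # 1.89 -- below ~1.96, so not significant at the 5% level
```

The test can tell you *that* B nudged conversion; it cannot tell you what to build next, which is exactly where the qualitative work comes in.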
We’re all - marketers and product developers alike - trying to find the signal in the noise. Too often, we’re mistaking one for the other. But we don’t have to. We can get smarter about the underlying systems, measure better indicators, understand correlation and causation better. We can have more empathy for our customers. We can show better judgment, commit to our ideas, and make better decisions.
So how do we do it?
We believe in talking to people, but also in understanding the gaps between what people say they do and what they do. We believe in understanding the real life experience of a product or service or brand. So we watch, we listen, we investigate, we discover.
But these conversations are not enough. We help clients translate what we learn from real people into hypotheses we can test, and we collaborate with clients to tie these hypotheses to measurable actions. We set up clients with qualitative research that is part of the design process, not separate from it. We get the whole team involved in getting continuous customer feedback. We design research that helps us really understand and identify a problem, and we work with product and brand design teams to solve those problems. We help our clients develop a set of KPIs that are meant to drive action.
Based on understanding the business, the competitive environment, and the customer experience, we helped one of our clients make two key decisions: pay staff a little better to attract talent, and give raises or bonuses to staff who get positive customer and management feedback. That’s it. It sounds simple, because it should be. And in our work with this client, we’d built enough trust in the process and the data that they made those decisions during our presentation to their board of directors.
We’re a product-strategy company. The strategy has to come from somewhere. We think it comes from empathy with customers and prospects, deep understanding of the underlying system that a business or brand lives in, great data design, and an eye for opportunity.
We say, let's be data pragmatists. Let's use data - in all its forms - to learn what to do next.