November 2016. In This Issue:


Editor's Cut: Polls and polling data took center stage in the aftermath of the US elections.

Regardless of whom you “wanted to win,” the polls seemed to tell a vastly different tale pre-election than the actual election results. Prior to Tuesday evening, nearly every poll—save one run by the Los Angeles Times—pointed fairly definitively to a Hillary Clinton presidency, so as millions awoke to a President-elect Trump, many were left to wonder what in the heck had just happened.

How could the polls get it so wrong? How could the forecasts for a win have been incorrect? Did the poll data lull supporters into not turning out to vote? Did the data empower supporters to vote for their underdog? Who was wrong…who could we blame? The American people? The reporters? The pollsters? Who???

At some point, my marketer hat ended up on my head, and suddenly, this question of “who made the data say what” took on a completely new perspective. At the CMO Council, on any given day, it is very likely that I am writing up survey questions or, alternatively, analyzing survey data or poring over campaign data, trying to connect behavior to engagement scores. And on any given day, you are likely sitting in your office poring over customer campaign, transaction and browsing behavior data, sifting through systems, spreadsheets and reports to formulate a picture of what the customer wants and what they intend to do. Not one of us has a perfect track record. In fact, there are times when we get it really wrong.

But there is one simple truth: It is as easy to read the data incorrectly as it is to read it correctly.

Let’s take click data as an example. For years, we looked at clicks as an affirmation of a campaign. If the creative and messaging were on point, a click was proof of a dollar well spent…that is, until marketers started to look one step beyond to see if that click resulted in a next action. We matured our view of the data, realizing that sometimes, customers click by accident, out of curiosity or even out of habit. A click is just a click; it is the next action that is meaningful. This evolution of measurement was immortalized in the baby-click ad from Adobe a couple of years back (if you haven’t seen it, click here).

Data can be seen as a hard and fast truth, but how we process and interpret that data and turn it into insights and intelligence opens the door for all sorts of mistakes. This is why, as marketers, we have learned to ask the next question or gain additional perspectives in an effort to take a single action—one moment in time—and put that data into context. Consumers may say that they prefer the gold color of a new product, but if we don’t ask what price they are willing to pay or whether they are actually in the market for the product at all, regardless of color, we can’t really be shocked when the gold, overpriced, out-of-demand product fails to fly off the shelves. We have all learned the hard way that consumers don’t always tell us what they will actually do—rather, they answer questions with what they aspire to do in a perfect world with infinite resources and possibilities. It is the action and reaction to something that can actually reveal context and let us in on the truth.

While the media focused on answers to questions about trustworthiness and temperament, nobody actually asked whether these issues would really matter when ballots were cast. It is a bit like only asking consumers if they like a product, assuming that if we make it, they will buy it. That question doesn’t provide the context of customer needs, their current browsing patterns or the desires they state in customer-generated content like social media posts or reviews. It just asks, in a perfect world where money is no object and you can have everything you desire, whether you like what I am showing you right now.

This election served as a stark reminder that single points of data are single moments in a customer’s mindset, but they can hardly be called definitive proof of customer intention and action. The survey can say one thing, but when bound to actions and context, we may find a roadmap to a completely different outcome. We can’t afford to analyze a single point of data in a vacuum. We will need to leave that to the pundits and news outlets for now. Instead, we have customers to understand and journeys to build.