There's already been plenty of pixel-inches written on the role that so-called 'big data' played in the Obama victory. The most fascinating of all of them was Michael Scherer's behind-the-scenes access to the back-office data crunchers and analysts, which revealed (amongst other things):
- The sophisticated modelling used to make the application of limited resources far more efficient, for example using many different data points from multiple sources to rank voter names in order of persuadability (a rough sketch of this kind of scoring follows the list)
- The constant test-and-learn approach used to build predictive profiles of which types of people would be persuaded by which kinds of appeals.
- The intricate A/B testing behind a huge, metric-driven fundraising email campaign that helped Obama raise over $1 billion.
- The smart use of Facebook to drive mass scale advocacy. An example from the final weeks saw supporters who had downloaded an app being sent messages with pictures of their friends in swing states. By clicking a button they could urge those voters to vote, and about 1 in 5 people contacted in this way acted on the request.
- The use of voter profiling data to inform media buying, leading to TV spots being bought around non-traditional content in order to reach specific demographics, and to Obama's appearance on Reddit where a lot of their 'turnout targets' were.
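To make that persuadability ranking a little more concrete, here's a minimal, hedged sketch in Python. The features, weights and the logistic squashing are my own assumptions purely for illustration; the campaign's actual models were far more sophisticated and drew on many more data sources.

```python
# Illustrative sketch only: combine several data points per voter into a single
# persuadability score, then rank the contact list so limited canvassing
# resources go to the most persuadable people first. Features and weights are invented.
import math

def persuadability(voter):
    # Weighted sum of hypothetical features, squashed to a 0-1 score
    z = (0.8 * voter["undecided_in_survey"]
         + 0.5 * voter["swing_state"]
         + 0.3 * voter["responded_to_prior_contact"]
         - 0.4 * voter["strong_party_registration"])
    return 1 / (1 + math.exp(-z))

voters = [
    {"name": "A", "undecided_in_survey": 1, "swing_state": 1,
     "responded_to_prior_contact": 0, "strong_party_registration": 0},
    {"name": "B", "undecided_in_survey": 0, "swing_state": 1,
     "responded_to_prior_contact": 1, "strong_party_registration": 1},
]

for v in sorted(voters, key=persuadability, reverse=True):
    print(v["name"], round(persuadability(v), 2))
```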
Something not mentioned in Scherer's piece but which undoubtedly also played its part was the use of tracking and ad targeting technologies on the candidates' websites. A report by Evidon (who produce the Ghostery app, and provide analysis of the deployment of third-party tracking technologies and the so-called 'invisible web') identified no fewer than 97 unique tracking technologies deployed across the two candidates' sites (far more than the average site employs). Tags included analytics software, social plug-ins, and ad-network tags that might be used for re-targeting (serving advertising to identified users elsewhere at a future point) or for tracking users' conversions elsewhere and attributing value back to the website visit.
But Scherer highlighted something else that was fundamentally critical: joined-up data. In 2008, the campaign was hampered by multiple databases, which meant different lists, unreconciled activity and poor interaction between centralised and localised resources ("It was like the FBI and the CIA before 9/11: the two camps never shared data"). So for the 2012 campaign, data was unified into a huge system that could merge information from all touchpoints and data sources – website, CRM, field-workers, consumer databases, phone calls, social media, voting history data, mobile.
Much of the effectiveness of the campaign's use of data would not have been possible without a unified data source. I've just completed some research work for a client in the area of multichannel marketing, and joining up disparate data sources to create a single customer view is something of a holy grail for many clients right now. The reason is that it improves the effectiveness not just of one channel, or a couple, but of everything you do. If you can profile your customers and prospects better, and have a more rounded and accessible history of all of their interactions with your company, your customer service team have more context with which to be helpful, and your email campaigns can be made more relevant by understanding whether the person you're sending them to has clicked on one of your ads, visited your website, or prefers to interact through mobile.
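To make the 'single customer view' idea concrete, here's a minimal sketch of what joining disparate sources into one profile might look like. The sources, field names and the email join key are assumptions for illustration; in practice, identity resolution across legacy systems is far messier than a simple dictionary merge.

```python
# Illustrative sketch: merge records from separate systems into one profile per
# customer, keyed on a shared identifier (here, an assumed email address).
from collections import defaultdict

crm = [{"email": "jo@example.com", "name": "Jo", "segment": "donor"}]
web_analytics = [{"email": "jo@example.com", "last_visit": "2012-11-01", "pages": 12}]
email_campaign = [{"email": "jo@example.com", "opened": True, "clicked": False}]

profiles = defaultdict(dict)
for source in (crm, web_analytics, email_campaign):
    for record in source:
        # later sources add fields to the same profile rather than creating duplicates
        profiles[record["email"]].update(record)

print(profiles["jo@example.com"])
# {'email': 'jo@example.com', 'name': 'Jo', 'segment': 'donor',
#  'last_visit': '2012-11-01', 'pages': 12, 'opened': True, 'clicked': False}
```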
There are many challenges here, of course, not least unifying data collected in siloes, across multiple legacy software systems used by departments that don't talk to each other. Then there's the difficulty of attracting smart data analysts who can derive actionable insight from raw numbers, something that will only get harder as more companies upskill in this area and demand outstrips supply. Then there's the challenge of developing sophisticated attribution models that can attribute value back to individual touchpoints within customer journeys, data which in aggregate might be used to shift media investment on the fly to optimise conversion, action or interaction. And then there's the challenge of unifying this and other internal client data with that collected and used by media agencies, so that a true end-to-end picture can be developed. But make no mistake, this will happen. The advantage is too great for it not to.
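As an illustration of what a basic attribution model might look like, here's a toy sketch that splits each conversion's value equally across the touchpoints in that customer's journey. The channels and values are invented, and equal-credit attribution is only one of many possible models (last-touch, time-decay and data-driven approaches all weight touchpoints differently).

```python
# Toy attribution sketch: give every touchpoint in a converting journey an
# equal share of that conversion's value, then aggregate credit per channel.
from collections import Counter

journeys = [
    {"touchpoints": ["display_ad", "email", "website"], "conversion_value": 60.0},
    {"touchpoints": ["search_ad", "website"], "conversion_value": 40.0},
]

credit = Counter()
for j in journeys:
    share = j["conversion_value"] / len(j["touchpoints"])
    for tp in j["touchpoints"]:
        credit[tp] += share

# Aggregated credit per channel could then inform where media spend goes
for channel, value in credit.most_common():
    print(channel, round(value, 2))
```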
Felix Salmon drew a comparison between the data-intelligence described in Scherer's piece and that shown by Nate Silver, whose data modelling enabled him to predict the outcome of the election with unprecedented accuracy, and to garner huge traffic and talkability for the New York Times. The reason his work became such a phenomenon (as Noah pointed out) was that he could explain complicated maths in simple terms, and he gave people fodder for conversation. The success of both Nate Silver's prediction and the Obama campaign, said Salmon, is about marrying quantitative skills with storytelling, to unbeatable effect:
"The thing that Silver and the Obama campaign have in common, then, is that they used their databases to tell stories. Or, more to the point, their databases and models were used so that Americans could tell stories to each other."
And that's the point. Joined-up data enables you to understand which stories to tell to whom, and how and when to tell them. That's the future of marketing, right there.