Everyone is looking for the next super stat or mind-boggling insight, and you can’t blame them. In every sector, disruptors are appearing with more customer-centric approaches. But what sets the best advertisers and agencies apart? As Ben Royce, Google's Head of Performance Data, explains, it's not the quality of insight, but how fast they get there.
Thanks to leaps in computing power, cloud-based analytics, and data science techniques, advertisers and agencies are building systems that generate marketing insights to supplement their research programs. In fact, 60% of enterprises1 are in the process of adopting AI. Call it the “disruption of insight” if you like. In this article, we'll share several developments we are seeing at the intersection of AI and marketing that enable this disruption.
While machines cannot feel emotion, they can detect it, provided they have properly trained models. With techniques like embedding, machines can interpret the emotional signals in text based on previous training. This isn’t new, but in these models emotion labels are positioned in relation to one another. For example, a machine will not know how sadness feels, but it does know that sadness sits close to bittersweet and far from happy. There are countless applications for this in marketing, from sentiment analysis on social media or in the press to customer service. One practical use case is the prioritisation of incoming customer emails based on emotion: routing an emotional customer to the agent who handles that emotion best means faster help for the customers who need it most.
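The "close to bittersweet, far from happy" idea can be sketched with vector similarity. The three-dimensional vectors below are invented stand-ins for illustration; real embedding models learn hundreds of dimensions from large text corpora.

```python
import math

# Toy "emotion embeddings". These vectors are illustrative stand-ins,
# not the output of any actual trained model -- real embeddings are
# learned, high-dimensional, and not hand-written like this.
EMBEDDINGS = {
    "sadness":     [0.9, 0.1, 0.2],
    "bittersweet": [0.7, 0.3, 0.3],
    "happy":       [0.1, 0.9, 0.8],
}

def cosine_similarity(a, b):
    """Similarity of two vectors: close to 1.0 = similar direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# In this toy space, "sadness" is nearer "bittersweet" than "happy".
sad_bitter = cosine_similarity(EMBEDDINGS["sadness"], EMBEDDINGS["bittersweet"])
sad_happy = cosine_similarity(EMBEDDINGS["sadness"], EMBEDDINGS["happy"])
```

The machine never "feels" anything; it only measures distances in the learned space, which is all that sentiment routing needs.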
The science of software reading our writing is called natural language processing, and thanks to machine learning it is bearing reliable fruit. This kind of text mining has been used for years, but the accuracy and speed at which we can do it have reached new heights: it can now be measured in milliseconds, not days.
One area where this new ability is paying dividends is in video comments. A single YouTube video can get thousands of comments, making it hard for humans to interpret sentiment. The BrandUnit, Google’s creative think tank in the U.S., uses the Google Cloud Natural Language API on YouTube comments at mass scale to determine how users are reacting to a brand’s content.
For example, Dunkin' Donuts recently worked with a YouTube Creator called Miranda Sings to create a custom video to promote its Donut Fries launch, earning them more than 6,000 comments. By working with Google's BrandUnit, they were able to establish a sentiment score for the video and compare it to all other videos analysed, discovering it had received the most positive score the team had ever seen. The time to that insight: 28 minutes.
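The rollup step behind a video-level sentiment score can be sketched as follows. In the pipeline described above, per-comment scores would come from the Cloud Natural Language API (roughly -1.0 for negative to +1.0 for positive); the `score_comment` function here is a crude keyword stand-in so the example runs with no credentials or network access, and the sample comments are invented.

```python
# Tiny illustrative sentiment lexicon -- a stand-in for a real API call.
POSITIVE = {"love", "amazing", "great", "best", "hilarious"}
NEGATIVE = {"hate", "boring", "worst", "awful", "skip"}

def score_comment(text):
    """Crude stand-in for a per-comment sentiment score in [-1.0, 1.0]."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def video_sentiment(comments):
    """Average comment sentiment, giving one score per video."""
    scores = [score_comment(c) for c in comments]
    return sum(scores) / len(scores) if scores else 0.0

comments = ["Love this, the best collab ever!", "So boring, had to skip.", "Amazing!"]
score = video_sentiment(comments)  # positive overall for this sample
```

Computing one score per video makes videos directly comparable, which is what allowed the Donut Fries video to be ranked against everything previously analysed.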
If 28 minutes is too long to wait, you can do it live using that API and Google Firebase: paired with a live Twitter feed, it enables real-time Twitter sentiment analysis2. Imagine that: no post-event report, no delay, just live insight through machine learning.
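The "live" part of such a setup usually amounts to maintaining a running score as new items arrive. Setting aside the Twitter and Firebase plumbing, the core idea can be sketched as a rolling window over per-item sentiment scores (the scores streamed in below are invented):

```python
from collections import deque

class RollingSentiment:
    """Live average over the most recent N sentiment scores, the way a
    streaming pipeline might track tweets as they are scored one by one."""

    def __init__(self, window=100):
        # deque with maxlen automatically drops the oldest score
        self.scores = deque(maxlen=window)

    def add(self, score):
        self.scores.append(score)
        return self.current()

    def current(self):
        return sum(self.scores) / len(self.scores) if self.scores else 0.0

tracker = RollingSentiment(window=3)
for s in (0.8, -0.2, 0.6, 0.9):  # stream of per-tweet scores
    live = tracker.add(s)
# With window=3, only the last three scores count: (-0.2 + 0.6 + 0.9) / 3
```

A bounded window is what turns a pile of scores into "live" insight: the reading reflects the audience right now, not the whole history of the event.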
Machines that can watch video
“What if you could see what parts of the videos led to better retention? This is all possible at enormous scale through advancements in computer vision.”
- Adam Fahrenkopf, Technical Program Manager, Google Assistant
The cutting edge of machine learning and insight generation is video content analysis. Here, machines are trained to “watch” videos and acquire a deeper understanding of what they're actually seeing. Right now the technology is in its early stages, able to point out the obvious objects on screen. But as research progresses, machines will be able to detect story arcs, narratives, and other more complex concepts.
This doesn't necessarily lead to insight though. It is only once you mix it with something else that it becomes novel and actionable. In recent tests, Google researchers found correlations between skip rates on YouTube TrueView ads and what appeared on-screen. This means that at mass scale, we can begin to identify the patterns behind great ads and advise on how to improve or test new ideas in the future. With this technology, the analysis of 10,000 YouTube ads can take mere minutes, dramatically shortening the feedback loop.
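The "mix it with something else" step can be sketched as joining vision labels with behavioural data. The records below are invented for illustration (the real analysis runs over thousands of ads): each impression carries the labels a vision model detected on screen, plus whether the viewer skipped.

```python
from collections import defaultdict

# Hypothetical per-impression records: (labels detected on screen, skipped?)
# All values are made up for illustration.
impressions = [
    ({"lip brush", "model"}, False),
    ({"lip brush"}, False),
    ({"lip pencil"}, True),
    ({"lip pencil", "model"}, True),
    ({"lip brush"}, True),
]

def skip_rate_by_label(records):
    """Fraction of impressions skipped, per detected on-screen label."""
    shown = defaultdict(int)
    skipped = defaultdict(int)
    for labels, was_skipped in records:
        for label in labels:
            shown[label] += 1
            skipped[label] += was_skipped
    return {label: skipped[label] / shown[label] for label in shown}

rates = skip_rate_by_label(impressions)
# In this toy sample, "lip pencil" impressions were skipped more often
# than "lip brush" impressions.
```

Neither dataset is interesting alone; only the join surfaces which on-screen elements correlate with retention, which is the pattern-finding the paragraph above describes.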
For example, in lipstick ads we found that lip brushes retain viewers better than lip pencils. In beauty ads, brown- and black-haired women held viewers better on average than blonde. In the obvious category, we found that models and supermodels are effective. This raised a question: to a machine, what is the difference between a model and a supermodel? When we asked the best beauty brands in the world, they speculated about facial symmetry, or perhaps lighting. The answer is simpler than most people think: wind in her hair. That’s right, just turn on a fan and you can go from model to supermodel. That insight took less than 10 minutes to find.
It is still early days, but in the coming months and years, the best marketers and advertisers won't just be generating insight, they'll be generating insight faster with AI, which means they can test and iterate faster as well. To do that, advertisers and agencies need to plan for multiple versions of creative early in the campaign and set aside budget and time for side-by-side testing3; that way, brands can benefit from the insights AI can produce even faster.