Have you ever watched a sad movie on TV that was suddenly interrupted by an upbeat, loud ad? How did it make you feel? Did you suddenly find yourself switching gears emotionally? Did the ad seem jarring and inappropriate? Did you wind up resenting the advertiser?
There’s ample evidence to back up the belief that video advertising performs better when it aligns with the consumer’s mood. A 2015 report from Oxford University, for instance, showed that upbeat, cheerful ads that ran during a moment of tension in a movie made far less of an impact on consumers, leading to diminished brand recall and shorter viewing times. The swing in emotions causes viewers to enter a state of “deactivation,” marked by lower physical and cognitive activity.
Unfortunately, few advertisers are taking context into account. But they have the power to, using contextual video advertising enhanced by cognitive capabilities. AI tools and precision targeting are allowing advertisers to better sync their ads with the surrounding content—and the viewer’s mood. Read on to get a sense of where this market is and where it could be headed. And be sure to check out the Outsmart your Video Competition with Watson white paper for an idea of how IBM’s Watson will start to change this landscape.
Understand What Makes Context Effective
Ads generally serve two functions: they try to create need and awareness, or they are “directive,” helping people find what they’re looking for.
“Context can turn a video into a directive form of marketing instead of a creative one,” says Jeff Quipp, CEO of the digital agency Search Engine People. For instance, if you’re in the market for a boat and you see a video about things you need for your boat, “you’re not going to view it as an ad and distrust it, but as something that’s going to help you complete your task.”
That was the aim of Netflix’s Friends YouTube campaign. When the streaming service began airing Friends re-runs on its platform, the company pulled clips from the show to match whatever YouTubers were searching for. Type “cute cats,” for instance, and you’d be served a clip of the character Rachel with a hairless cat. Advertisements are effective when they fit into the “task” consumers are trying to complete, thanks to the Zeigarnik effect, which posits that the brain fixates on uncompleted tasks, whether that’s learning something new or waiting to see how a plot will unfold.
“Contextual fluency,” meaning a message is consistent with the content surrounding it, also helps viewers process ads more easily. During this year’s Super Bowl, Hyundai developed a real-time ad to connect with viewers. The ad featured servicemen on-base in Poland watching the game. The footage was filmed, cut and packaged during the game itself, and aired immediately afterwards.
“Everything you have to do in three months for a normal ad we did in about 48 minutes,” Eric Springer, chief creative officer at Innocean (the agency working with Hyundai on the ad), told AdWeek. The ad maintained the Super Bowl context while also creating a connection with fellow viewers around the emotional highs and lows of the game—with Hyundai’s brand front and center. Video ad tech company Unruly judged the Hyundai ad the “most effective” of all the ads this year.
Leverage Ad Targeting Tools
Creating ads in real time isn’t exactly a scalable way to reach viewers in their context. But as ad-targeting technology improves, so does the ability of advertisers to match context. Finding the right audience, for example, has improved from programmatic TV (where marketers buy ads based on content—e.g. for Game of Thrones viewers) to addressable TV. Addressable TV uses the same set-top-box data as programmatic, but allows brands to buy ads based specifically on the audience they want to reach, avoiding wasteful ad spend on viewers who watch a specific program but fall outside the ad’s target audience.
Online, brands can get even more specific with their advertising based on what’s taking place within each video. Netflix is known for creating about 77,000 “micro-genres” to describe its content, including “emotional fight-the-system documentaries” and “violent thrillers about cats” that offer some clue about the movies’ likely emotional impact. But artificial intelligence tools are making video content, known as “unstructured data,” easier to parse. Natural language processing, for example, applies machine learning to comprehend “human” language (as opposed to programming languages), whether it’s text in a tweet or dialogue in a video. Marketers can also use facial recognition technology to judge emotional state based on images within a video.
With cognitive tools like these, ads can be catered more closely to match not only the topic and mood of a video—but also the social profile of the viewer.
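To make the mood-matching idea concrete, here is a deliberately simplified sketch. The mood labels, keyword lists and ad inventory are all invented for illustration; a production system would use trained NLP and computer-vision models (such as those described above) rather than keyword counting.

```python
# Toy sketch: pick an ad that suits a video's mood, inferred from its
# transcript via simple keyword scoring. All categories and ad names
# below are hypothetical examples, not any vendor's real API or data.

MOOD_KEYWORDS = {
    "upbeat": {"win", "party", "celebrate", "laugh", "fun"},
    "tense": {"chase", "danger", "scream", "threat", "dark"},
    "sad": {"loss", "goodbye", "funeral", "tears", "alone"},
}

# Hypothetical ad inventory, keyed by the mood each creative suits.
ADS_BY_MOOD = {
    "upbeat": "energetic-soda-spot",
    "tense": "calm-insurance-spot",
    "sad": "gentle-charity-spot",
}

def classify_mood(transcript: str) -> str:
    """Score each mood by keyword hits in the transcript; return the top scorer."""
    words = set(transcript.lower().split())
    scores = {mood: len(words & kws) for mood, kws in MOOD_KEYWORDS.items()}
    return max(scores, key=scores.get)

def pick_ad(transcript: str) -> str:
    """Choose the inventory item whose mood matches the video's mood."""
    return ADS_BY_MOOD[classify_mood(transcript)]

print(pick_ad("the team will win and celebrate with a party"))  # energetic-soda-spot
```

The point of the sketch is the pipeline shape—classify the content, then select a creative that matches—rather than the classifier itself, which in practice would be a far richer model combining dialogue, imagery and viewer data.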
“The holy grail for marketing is really to be able to target a market of one,” former IBM Watson Vice President Stephen Gold told the Wall Street Journal in an interview.
And no doubt consumers will find these targeted ads far less intrusive, because they fit seamlessly into their viewing activities, whether that’s searching for information or watching a TV series. It’s not only a win-win; it’s a new way of looking at advertising.
Do your online video ads strike the right tone? Learn more about how IBM Cloud Video’s next-gen tools can ensure your video ads complement what your viewers are watching on the screen—and keep your next holiday sales ad from running during a horror movie.