Powering US Open 2017 Highlights, IBM Watson Impacts Live Broadcasting

When Juan Martin del Potro faced Dominic Thiem on Day 8 of the US Open, die-hard tennis lovers might have been excited, but the match didn’t have the hallmarks of a “must-see” event for casual fans. Few expected del Potro, ranked 24th, to advance.

But when he staged one of the best comebacks in US Open history, everyone wanted to see how it was done. And within minutes, they were able to, thanks to IBM Watson powering US Open 2017 highlights.

Watson assembled a clip reel within five minutes of the end of every match at this year’s Open, making highlights and key moments available to fans two to ten hours sooner than in previous years. The event marked the official launch of IBM Watson Media, a new business unit that leverages Watson’s leading AI capabilities to meet the future needs of broadcasters and their audiences.

With artificial intelligence (AI) highlight clipping, Watson watches and analyzes video, using a variety of APIs to identify key moments and quickly assemble full highlight reels of live events. The functionality first appeared, in beta, during the 2017 Masters golf tournament in April, showcasing dramatic moments from the four-day event.

Equally dramatic were the advances Watson’s machine learning made in the five months between the Masters and the US Open.

“We pulled in a live satellite feed and then used things like the noise from the crowd, whether positive or negative, to determine what was an exciting moment or a less exciting moment,” says David Kulczar, senior offering manager at Watson Media. “We could look at when someone does a fist pump and is excited or when they’re happy and have a smile on their face. From there, we rank how exciting a clip was.”
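In rough terms, the kind of multimodal scoring Kulczar describes can be pictured as a weighted blend of audio and visual cues. The sketch below is a conceptual illustration only; the signal names, 0-to-1 scales and weights are assumptions, not IBM’s implementation.

```python
# Conceptual sketch (not IBM's code): blend crowd-noise and player-reaction
# cues into a single excitement score, then rank candidate clips by it.
from dataclasses import dataclass

@dataclass
class ClipSignals:
    crowd_noise_level: float  # 0.0-1.0, loudness of the crowd reaction
    crowd_sentiment: float    # -1.0 (negative) to 1.0 (positive)
    gesture_score: float      # 0.0-1.0, e.g. confidence a fist pump was detected
    smile_score: float        # 0.0-1.0, facial-expression confidence

def excitement_score(s: ClipSignals) -> float:
    """Weighted blend of cues; the weights are made up for illustration."""
    return (0.4 * s.crowd_noise_level * max(s.crowd_sentiment, 0.0)
            + 0.35 * s.gesture_score
            + 0.25 * s.smile_score)

clips = {
    "match_point": ClipSignals(0.9, 0.8, 1.0, 0.9),
    "routine_hold": ClipSignals(0.3, 0.2, 0.1, 0.4),
}
ranked = sorted(clips, key=lambda name: excitement_score(clips[name]), reverse=True)
print(ranked)  # ['match_point', 'routine_hold']
```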

A multitalented Watson improves the fan experience

Cognitive highlight clipping is a different sort of machine learning for Watson, which last year provided instant closed captioning, a task that required it to understand the nuances of tennis lingo, including ambiguous terms like “love” and “ace.”

To pull off its closed captioning and highlight clip-generating feats, Watson relies on sophisticated deep learning algorithms. Programmers essentially taught the system to learn the meaning behind expressions and body language. For example, if an athlete is smiling and raises their hands over their head, they’re likely excited. But if they grimace and cover their head, chances are they’re not.
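Conceptually, that cue-to-emotion mapping reduces to rules like the ones below. This hypothetical sketch stands in for the trained models Watson actually uses, with the cue names invented for illustration.

```python
# Hypothetical, heavily simplified stand-in for the learned models described
# above: map detected expression and pose cues to a coarse reaction label.
def classify_reaction(smiling: bool, arms_raised: bool,
                      grimacing: bool, covering_head: bool) -> str:
    if smiling and arms_raised:
        return "excited"        # e.g. celebrating a winning point
    if grimacing or covering_head:
        return "disappointed"   # e.g. reacting to a costly error
    return "neutral"

print(classify_reaction(smiling=True, arms_raised=True,
                        grimacing=False, covering_head=False))  # excited
```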

In addition to analyzing 320 hours of video collected during the 2017 US Open to find the most exciting moments, Watson also ingested statistical information from courtside devices used to measure serve speed and ball position. Arming Watson with this data meant that moments the audience might not have noticed, such as a particularly powerful serve, could still be flagged as highlights.
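One way to picture how the statistical feed complements the video analysis is a simple either-or flag: keep a moment if the video looked exciting or if the stats say something notable happened. The 120 mph cutoff, field names and scores below are assumptions for illustration.

```python
# Sketch: flag a moment if the video excitement is high OR the courtside
# stats show something notable (here, a very fast serve), so quiet-but-big
# moments still make the cut. Thresholds and fields are illustrative.
def flag_moments(moments):
    flagged = []
    for m in moments:
        exciting_on_video = m["excitement"] >= 0.7
        notable_serve = m.get("serve_speed_mph", 0) >= 120
        if exciting_on_video or notable_serve:
            flagged.append(m)
    return flagged

moments = [
    {"id": 1, "excitement": 0.85, "serve_speed_mph": 105},  # loud crowd reaction
    {"id": 2, "excitement": 0.30, "serve_speed_mph": 128},  # quiet but huge serve
    {"id": 3, "excitement": 0.20, "serve_speed_mph": 98},   # routine point
]
print([m["id"] for m in flag_moments(moments)])  # [1, 2]
```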

Once the match was finished, Watson examined the individual moments it had flagged, chose the best, and assembled them into a video recap of the match.

“It’s the next generation of the [sports] highlight reel,” says Pete Mastin, who directs Watson Media product marketing and market strategy. “Instead of taking a 20-second clip of a shot, what [Watson] did was create a 90-second summary of the match.”
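One simple way to picture that assembly step is a greedy fill of a 90-second budget with the highest-scoring flagged moments, played back in match order. IBM hasn’t published its selection logic, so the sketch below is purely conceptual.

```python
# Conceptual sketch of post-match assembly: take the highest-scoring flagged
# moments that fit a ~90-second summary, then play them in match order.
def build_summary(moments, budget_seconds=90):
    """moments: list of (score, duration_seconds, clip_id); returns chosen clip ids."""
    chosen, used = [], 0
    for score, duration, clip_id in sorted(moments, reverse=True):
        if used + duration <= budget_seconds:
            chosen.append(clip_id)
            used += duration
    return sorted(chosen)  # clip ids assumed to follow match order

moments = [(0.95, 25, 3), (0.90, 30, 7), (0.80, 20, 1), (0.60, 40, 5)]
print(build_summary(moments))  # [1, 3, 7] (about 75 seconds of the best moments)
```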

From the court to the compliance office

But quickly creating highlight reels and captions for sporting events is just the start of how IBM Watson Media could transform broadcasting. Ultimately, Kulczar says, Watson Media could help broadcasters caption and clip their non-sports content, and even ensure that content meets local and national laws.

Countries have different rules when it comes to language and sexual content, which creates a challenge for networks that syndicate their shows internationally. Ensuring that each episode meets local standards consumes significant time. Watson Media aims to automate the process, saving broadcasters time and money (and appeasing regulators).

Compliance might have the bigger impact, Kulczar concedes, but sports is more challenging, and ultimately more exciting, because timeouts aren’t an option. Any capability Watson may inject into a game, whether it’s highlight clipping, instant closed captioning or some as-yet-unimagined duty, will have to be nearly instantaneous.

“Live sports are a tougher challenge, because you have to worry about inserting delay into the process,” says Kulczar. “We have to be sure our system performs well enough that it doesn’t introduce a delay for the broadcast partners or the event.”

To learn more about projected ROI outcomes from today’s most common AI applications in the video industry, download the latest IBM Watson Media paper, From AI to ROI: When Playback Means Payback.