AI-Powered Digital Storytelling Begins To Take Hold

FKT Magazin 10/2019
Technologies & Solutions

A lot of frightful information has been published about artificial intelligence (AI) taking over every aspect of our lives, but when it starts to replace equipment operators in production, it gets genuinely scary for everyone in the media & entertainment industry. That is, unless you begin to understand how AI can help digital production teams tell more comprehensive stories.

At this year’s U.S. Open tennis tournament in New York in early September, IBM’s Watson computer and its AI tools were used to produce fast-turnaround highlight reels as good (and as fast) as any human EVS operator ever has. Watson also served up real-time stats and match analysis. And IBM is continuing to work with the United States Tennis Association (USTA, the body that hosts the annual tournament) to further develop its cloud-based AI offerings so that they are highly scalable and highly flexible.

IBM engineers have been hard at work on the IBM Watson OpenScale app, which identifies the most emotion-packed moments on court for highlight reels. This means that fans watching the men’s final between Daniil Medvedev and Rafael Nadal, or watching highlights from the women’s final between Serena Williams and Bianca Andreescu, could quickly view replays of the most exciting action from many angles, thanks to AI.

This AI-powered process is also helping coaches interact better with their players by helping them assess an athlete’s mechanics and endurance. In the past, coaches have relied on player feedback and instinct during a tournament match that could last several hours and require an athlete to run anywhere from one to six miles back and forth across the court. To better understand and quantify fatigue and energy, IBM has developed a new data set that incorporates a player’s physiological load and mechanical intensity. By pairing match video with this data, Coach Advisor has the potential to completely change how coaches train, condition, and develop American professional and junior tennis players.
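
A rough sense of how such a data set might be used can be sketched in a few lines of Python. The field names, weights and threshold below are purely illustrative assumptions, not Coach Advisor’s actual model; the point is simply that physiological and mechanical signals can be blended into a single load score and tied back to video timestamps for a coach to review.

from dataclasses import dataclass

@dataclass
class PointSample:
    video_timestamp: float       # seconds into the match video
    physiological_load: float    # hypothetical effort signal, pre-scaled to 0..1
    mechanical_intensity: float  # hypothetical movement/stroke workload, pre-scaled to 0..1

def load_index(sample: PointSample, w_physio: float = 0.6, w_mech: float = 0.4) -> float:
    # Blend the two signals into one 0..1 load score; the weights are assumptions.
    return w_physio * sample.physiological_load + w_mech * sample.mechanical_intensity

def fatigue_timestamps(points: list[PointSample], threshold: float = 0.8) -> list[float]:
    # Return the video timestamps a coach might want to review for signs of fatigue.
    return [p.video_timestamp for p in points if load_index(p) >= threshold]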

John Kent, program manager of sports and entertainment partnerships at IBM, said that IBM has worked with the USTA for nearly 30 years. The company has supported all of the USTA’s digital platforms, including usopen.org, as well as all of the applications and all of the data that feeds those apps.

For this year’s event, AI Highlights has been enhanced with Acoustic Insights to improve sound analysis, and it has now been taught to recognize when the ball has been struck, allowing tighter cropping of highlight clips. Additionally, using IBM Watson OpenScale, Watson can now distinguish crowd noise from the excitement levels of the players themselves, allowing it to remove bias when searching for highlights involving players with a particularly popular following or those who are particularly animated on court.
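
One common way to counter that kind of popularity bias (not necessarily the method IBM uses) is to rescale each player’s raw excitement signal against that player’s own baseline before ranking points. A minimal Python sketch, assuming per-point scores have already been computed upstream:

from statistics import mean, stdev

def normalize_per_player(raw_scores: dict[str, list[float]]) -> dict[str, list[float]]:
    # Rescale each player's excitement scores against that player's own baseline,
    # so a habitually loud crowd or a very animated player no longer dominates the ranking.
    normalized = {}
    for player, scores in raw_scores.items():
        mu = mean(scores)
        sigma = stdev(scores) if len(scores) > 1 else 1.0
        normalized[player] = [(s - mu) / (sigma or 1.0) for s in scores]
    return normalized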

Using IBM’s SlamTracker feature, fans could see scores, statistics and match analysis come to life in real time, and, for the first time this year, with an enhanced experience. By combining the momentum and live tabs into a single experience, fans got an improved and streamlined way to follow match analysis alongside tournament highlights, real-time updates and point-by-point commentary.

The way the AI highlights work is that Watson’s machine learning technology closely analyzes a match to understand player gestures, those moments of excitement, the emotion on their faces, and the reaction of the crowd, including the decibel level and how it rises and falls. It can also understand the context of the match: Is this set point? Is this match point? What did that particular point mean? During the entire two-week tournament, more than 30,000 shots were analyzed.
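
A toy version of that kind of scoring might look like the Python snippet below. The features, weights and context boosts are assumptions made purely for illustration; Watson’s actual model is far richer.

def excitement_score(gesture: float, facial_emotion: float, crowd_decibels: float,
                     is_set_point: bool = False, is_match_point: bool = False) -> float:
    # Inputs are assumed to be pre-scaled to 0..1 by upstream video and audio analysis.
    base = 0.35 * gesture + 0.25 * facial_emotion + 0.40 * crowd_decibels
    if is_match_point:
        base *= 1.5    # match points carry the most contextual weight
    elif is_set_point:
        base *= 1.25   # set points get a smaller boost
    return min(base, 1.0)

# Example: an animated celebration on match point in front of a loud crowd
print(excitement_score(gesture=0.9, facial_emotion=0.8, crowd_decibels=0.95, is_match_point=True))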

After a match (or a segment of a match) is completed, a highlight reel is assembled that shows not only the most exciting moments but also their context, such as how a set point or a match point unfolded. And this all happens mere minutes after the match ends. In the AI Highlights Dashboard, also new for the 2019 U.S. Open, Watson automatically curated highlights of all men’s and women’s singles matches on the key courts, based on the excitement level of each point. The IBM team enriched the video content with metadata associated with the players, the score, and other factors, and offered an extended playlist of shots to be included in the highlight package for review by the USTA editorial team. All of this aggregation typically takes less than five minutes, rather than the hours required to create a package manually.

IBM’s Kent said that AI is changing the way the USTA’s production team works because its editorial team oversees this process. It’s man plus machine, he said, but now the USTA team is able to do things at scale. They already had the capability to do editorial video production, but could they do it for every player or nearly every match? With AI they can. It’s clear that, given this appetite for complex sports analysis, computer-aided production is the way of the future.
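
The curation step itself can be pictured as a simple ranking exercise. The following Python sketch, with assumed field names, ranks analyzed points by excitement, attaches player and score metadata, and returns an extended playlist for editorial review; it only stands in for, and is not, IBM’s production pipeline.

from operator import itemgetter

def build_playlist(points: list[dict], max_clips: int = 20) -> list[dict]:
    # points: [{"clip_url": ..., "player": ..., "score": ..., "excitement": ...}, ...]
    ranked = sorted(points, key=itemgetter("excitement"), reverse=True)
    return [
        {
            "clip_url": p["clip_url"],
            "player": p["player"],
            "score": p["score"],                     # match-score context for the editor
            "excitement": round(p["excitement"], 3),
        }
        for p in ranked[:max_clips]
    ]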

Central to any sporting event is the fan experience. At the U.S. Open it’s digital and it’s driven by data – an experience created from the scores and sights and sounds that are the raw materials of how fans follow the action, both in person and across the world. The Open, like any big sporting event, produces massive amounts of that data, and it comes from many sources. Some are more obvious, like the chair umpire, who uses a device called a chum to record the scores, or the courtside statisticians who record match events like unforced errors, aces, and winners, or the radar guns that send speed readings to the U.S. Open control room.

But in the age of AI, the matches are also producing unstructured data – the visual and audio action of the U.S. Open captured through the live streams and devices. Some 370 hours of video footage is produced over the course of the tournament, featuring over 31,000 data points, based on the 2018 scoring data. So the question is, what do you do with all this data? How do you make sense of it? How do you transform it into a meaningful experience for fans?

The answer is: with AI. And there’s nothing to fear.