The Seattle Seahawks are using Amazon Bedrock, generative AI and other AWS services to distribute video and content faster with a focus on quick returns as well as long-tail opportunities. Here's a look at the project and lessons learned so far.
The generative AI project, which started this season, is part of a multi-year extension between the Seahawks and AWS, the team's official cloud, machine learning, AI and generative AI provider. Under the deal, the Seahawks are automating content distribution as well as transcribing, summarizing and distributing press conferences across multiple channels and languages.
AWS and the Seahawks will also integrate generative AI throughout business operations. The Seahawks and AWS first partnered in 2019 on NFL Next Gen Stats, which provides insights on player health, performance and scouting. Lumen Field, home of the Seahawks, is also a showcase for Amazon's Just Walk Out technology.
I caught up with Kenton Olson, Seattle Seahawks Vice President of Digital & Emerging Media, to walk through the generative AI content project and what's next.
The project. Olson said the Seahawks will publish more than 1,000 videos throughout the year. The goal was to shorten the time it takes to get videos from the creation team and editors to production. After the 2023 season, the Seahawks looked to accelerate the process with Olson's content team of 11, which focuses on digital content and platforms.
"We use multiple AWS products for everything from encoding and transcribing video to hosting," said Olson. "We were excited to use Amazon Bedrock to provide some automation to the videos we're shooting to save time and get stuff out faster."
For now, the Seahawks are focused on media availability videos and press conferences. The Seahawks will do about 300 press conferences throughout the year with players and coaches.
In the future, Olson said generative AI will provide an assist for podcasting and the entire video workflow. "As we move forward, we'll train the AI and make sure we tune it because every video is a little bit different," said Olson. "We started with press conferences and are learning."
The process before and after. Olson said the previous process took about 45 minutes to an hour to take a video from creation to publishing and streaming across various channels. The video processing and publishing process had 60 steps. "We're now in a situation where once the video is submitted it's published in about 10 minutes in a worst-case scenario," said Olson. "That includes things like translating and providing a summary that would have taken us hours before."
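Olson doesn't detail the exact calls, but a minimal sketch of the summarize-and-translate portion of that pipeline, assuming a transcript string is already in hand, could pair the Amazon Bedrock Converse API with Amazon Translate. The model ID, prompt wording and target language below are placeholders, not the team's actual setup.

```python
import boto3

bedrock = boto3.client("bedrock-runtime")
translate = boto3.client("translate")

def summarize_and_translate(transcript: str, target_lang: str = "es") -> dict:
    """Summarize a press-conference transcript with Bedrock, then translate it."""
    # The model ID is illustrative; Bedrock exposes many interchangeable models.
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        messages=[{
            "role": "user",
            "content": [{"text": "Summarize this press conference transcript "
                                  "in three short paragraphs:\n\n" + transcript}],
        }],
    )
    summary = response["output"]["message"]["content"][0]["text"]

    translated = translate.translate_text(
        Text=summary,
        SourceLanguageCode="en",
        TargetLanguageCode=target_lang,
    )["TranslatedText"]

    return {"summary": summary, "translated_summary": translated}
```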
Returns on investment. Olson said the initial return is time saved that frees his team up to think of new types of content to create. "We'd like our people thinking of new types of content, not necessarily pushing buttons to publish something," said Olson.
By the end of the season, Olson expects to save hours "so that our content creators can focus on creating other things for our fans and exposing that content."
Another early return is that the Seahawks can provide more in-depth information with generative AI summaries that can give fans more opportunities to discover content in unique ways. "We're also excited to see how our search engine referrals and various components are improved by providing more rich metadata," said Olson.
Longer term, Olson said generative AI can boost the returns of the Seahawks video archive, which will be critical since the franchise will soon enter its 50th season. Olson said:
"We have done a good amount of work over the past couple of years to take old Betamax tapes off the shelf and digitize those. We don't have a lot of real good data on all those, and so we're working with AWS right now to figure out how to process them and get a lot more data about who's in the video and what did they talked about. In the future, we could say here's a Jim Zorn video of him talking about something and do it within seconds. Today that would be a lot of manual scrubbing. As we move forward, there will be opportunities to talk about our history."
More from the genAI field:
- Enterprises leading with AI plan next genAI, agentic AI phases
- How GE Healthcare is approaching generative AI, LLMs, and transformation
- Intuit embraces LLM choice for multiple use cases
- 13 artificial intelligence takeaways from Constellation Research’s AI Forum
- BioNtech, InstaDeep bet on genAI models to advance R&D, drug discovery, cancer treatment
- Enterprises start to harvest AI-driven exponential efficiency efforts
Model choices. Olson said the plan from the beginning was to test multiple models and analyze them based on the quality of output without human intervention. The Seahawks have already swapped a few models based on use cases as the team moved from pilot to production. "It definitely took us some tinkering to understand what model makes sense and which doesn't. The tremendous thing about Bedrock is that we can use many different models," he said. "When we built this process, we knew these models are all changing. The model we're using now is really great, but for all we know there's some model in six or seven months that we'll want to move to."
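Olson's point about swapping models maps to how Bedrock works in practice: because the Converse API is uniform across model providers, switching models is largely a matter of changing the modelId string. A hedged sketch, with illustrative model IDs and task names that are not drawn from the article:

```python
import boto3

bedrock = boto3.client("bedrock-runtime")

# Illustrative model IDs; the article does not say which models the Seahawks use.
MODEL_IDS = {
    "summaries": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "metadata": "amazon.titan-text-express-v1",
}

def run_prompt(task: str, prompt: str) -> str:
    # Swapping the model for a task means editing the mapping above;
    # the calling code stays the same because Converse is model-agnostic.
    response = bedrock.converse(
        modelId=MODEL_IDS[task],
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```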
Humans in the loop. Olson said the primary goal of the genAI project was to focus his team on more content and new ideas. The process for the video team is to bookend video production with human oversight. At the end of the process, humans make the quality checks and decide to publish, but the models have gotten to the point where "we're hitting publish more than having to make edits," said Olson.
Olson's team also had to give the models unique spellings of names as well as the names of new players, since roster changes happen daily and weekly. "We really work on ingesting our roster before every video to make sure the latest players are there," he said. Today, the models get an update every time there's a roster change.
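The article doesn't specify how the roster is fed to the models. One plausible approach, sketched below as an assumption, is to prepend the current roster to the summarization prompt so player names are spelled correctly; the function name and roster source are hypothetical.

```python
# Hypothetical sketch: inject the latest roster into the prompt so the model
# uses correct player name spellings. Where the roster comes from (a team API,
# a database, a flat file) is not specified in the article.
def build_summary_prompt(transcript: str, roster: list[str]) -> str:
    roster_block = "\n".join(sorted(roster))
    return (
        "Use exactly these player name spellings when they appear:\n"
        f"{roster_block}\n\n"
        "Summarize the following press conference transcript:\n\n"
        f"{transcript}"
    )
```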
What's more intriguing to Olson is the human input at the front end of the process. He said that generative AI speeds up ideation and allows creators to try new things in seconds and iterate from there.
The project timeline. Olson said the Seahawks started the genAI project in the late spring by building components. By the time the season started, the Seahawks were ready to go. "It took us about two months of adding pieces and iterating to make sure we could move forward," said Olson. "It was more about adjusting the model to fit our needs and making sure we use it in the correct way."