A few years ago, XAPP partnered with Cumulus Media and Westwood One Sports to create an experience for Amazon Alexa users. We started with a simple concept: ‘Let fans listen to live game feeds for NCAA Women’s and Men’s basketball games during March Madness from a smart speaker’.
That original idea has evolved each season, growing more complex behind the scenes while keeping the experience simple and natural for the user.
In this episode, we discuss some of the advanced AI interactions we’re using to drive the March Madness skill on Amazon Alexa. Beyond the great content users can access from one of the nation’s largest broadcasters, there are sophisticated features that are invisible to the user, yet essential. We dig into some of those topics.
Handling multiple streams during the tournament
With so many games playing simultaneously, there can be as many as six to eight live streams running at once. XAPP spent a great deal of time making it flexible for users to request the stream they want. There can be many ways to ask for a game: for example, ‘Wolverines Basketball’, ‘Michigan’, or ‘University of Michigan’ should all lead to the same destination. Thought and care have gone into the AI training data to enable this.
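To illustrate the general idea (this is a simplified sketch, not the skill’s production code), the example below maps several spoken variations of a team name onto one canonical stream; the synonym list, stream IDs, and function names are all hypothetical.

```typescript
// Hypothetical sketch: resolving many spoken team names to one canonical live stream.
// Team names and stream IDs are illustrative, not the production data.

interface Stream {
  id: string;
  description: string;
}

// Canonical stream catalog keyed by a single team identifier.
const streams: Record<string, Stream> = {
  michigan: { id: "stream-042", description: "Michigan live game feed" },
  gonzaga: { id: "stream-087", description: "Gonzaga live game feed" },
};

// Synonyms of the kind gathered in training data: nicknames, short names, full names.
const synonyms: Record<string, string> = {
  "wolverines basketball": "michigan",
  "michigan": "michigan",
  "university of michigan": "michigan",
  "zags": "gonzaga",
  "gonzaga bulldogs": "gonzaga",
};

// Normalize the raw utterance, look up the canonical team, then return its stream.
function resolveStream(utterance: string): Stream | undefined {
  const normalized = utterance.trim().toLowerCase();
  const canonical = synonyms[normalized];
  return canonical ? streams[canonical] : undefined;
}

// 'Wolverines Basketball', 'Michigan', and 'University of Michigan'
// all lead to the same destination.
console.log(resolveStream("Wolverines Basketball")); // stream-042
console.log(resolveStream("University of Michigan")); // stream-042
```

In the real skill, this kind of mapping lives in the AI training data and interaction model rather than a hard-coded table, but the resolution step is conceptually the same.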
Dealing with disambiguation
When building an AI-driven experience, it is important to understand the concept of a ‘signal’: the input from the user, which is inherently ambiguous. Why? Because what a user says can mean many things depending on factors like time, location, and context.
Watch the video to see how we deal with conflicts like a request for a school whose Men’s and Women’s teams are both playing in the tournament at the same time.
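As a rough sketch of that kind of conflict handling (not the skill’s actual implementation), the example below checks whether a requested school has more than one live game and, if so, asks a clarifying follow-up; the team data, stream IDs, and prompt wording are invented for illustration.

```typescript
// Hypothetical sketch: detecting an ambiguous request and prompting for clarification.
// Schools, divisions, and stream IDs are illustrative only.

interface LiveGame {
  school: string;
  division: "Men" | "Women";
  streamId: string;
}

const liveGames: LiveGame[] = [
  { school: "michigan", division: "Men", streamId: "stream-042" },
  { school: "michigan", division: "Women", streamId: "stream-043" },
  { school: "gonzaga", division: "Men", streamId: "stream-087" },
];

type Resolution =
  | { kind: "play"; streamId: string }
  | { kind: "clarify"; prompt: string };

// If exactly one live game matches the requested school, play it.
// If both the Men's and Women's teams are live, ask a follow-up question.
function resolveRequest(school: string, division?: "Men" | "Women"): Resolution {
  const matches = liveGames.filter(
    (g) => g.school === school && (!division || g.division === division)
  );

  if (matches.length === 1) {
    return { kind: "play", streamId: matches[0].streamId };
  }
  if (matches.length > 1) {
    return {
      kind: "clarify",
      prompt: `Would you like the Men's or the Women's ${school} game?`,
    };
  }
  return { kind: "clarify", prompt: `I couldn't find a live game for ${school}.` };
}

// 'Play Michigan' is ambiguous while both teams are live...
console.log(resolveRequest("michigan")); // clarify: "Would you like the Men's or..."
// ...but 'Play Michigan Women' is not.
console.log(resolveRequest("michigan", "Women")); // play: stream-043
```

Context such as a user’s earlier requests could resolve some of these conflicts without a question; the follow-up prompt here is just the simplest fallback.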