If you’ve ever watched Big Hero 6 — and if you haven’t, you’re missing out — you’ve probably uttered the words, “I wish Baymax was real.” I know I have.
From the moment Tadashi introduced Hiro to the lovable nurse robot, fans of the 2014 movie were smitten. Even so, Big Hero 6 is arguably underrepresented in Disney Parks around the world.
Only Tokyo Disneyland currently has an attraction dedicated to the movie; the film was particularly popular in Japan, undoubtedly due in part to its Asian-American main character and San Fransokyo setting. The Happy Ride With Baymax just opened in September 2020 and is officially described as follows:
Experience this wild musical ride developed by the young inventor Hiro Hamada. As up-tempo music plays, this ride is sure to make everyone happy.
As you can see in the photo of the attraction below, each party of Guests that rides gets its own personal Baymax figure attached to the ride vehicle. These versions of Baymax, however, are not animatronic.
But could Disney possibly be developing its very own Baymax robot? An exciting Disney research and development project indicates it might be a possibility!
In October, the Disney Research website published a new study entitled "Dynamic Emotional Language Adaptation in Multiparty Interactions with Agents." The paper was presented at the International Conference on Intelligent Virtual Agents (IVA) 2020 and was written by Bahar Irfan (Disney Research/University of Plymouth, UK), Anika Narayanan (Disney Research), and James Kennedy (Disney Research).
In order to achieve more believable interactions with artificial agents, there is a need to produce dialogue that is not only relevant, but also emotionally appropriate and consistent. This paper presents a comprehensive system that models the emotional state of users and an agent to dynamically adapt dialogue utterance selection. A Partially Observable Markov Decision Process (POMDP) with an online solver is used to model user reactions in real-time. The model decides the emotional content of the next utterance based on the rewards from the users and the agent. The previous approaches are extended through jointly modeling the user and agent emotions, maintaining this model over time with a memory, and enabling interactions with multiple users. A proof of concept user study is used to demonstrate that the system can deliver and maintain distinct agent personalities during multiparty interactions.
The most intriguing thing about this artificial intelligence research is that it is focused on allowing the model to determine “the emotional content of the next utterance based on the rewards from the users and the agent.” This is almost identical to what Tadashi did when he developed Baymax.
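For the technically curious, here is a rough idea of what "choosing the emotional content of the next utterance based on rewards" can look like in code. To be clear, this is a toy sketch of the general technique, not Disney's actual system: the paper uses a Partially Observable Markov Decision Process with an online solver, while the snippet below only keeps a simple belief over the user's emotion and makes a greedy one-step choice. Every label, probability, and reward value in it is invented purely for illustration.

```python
# Toy sketch only: invented labels, probabilities, and rewards.
# The real paper uses a POMDP with an online solver; this greedy
# one-step version just illustrates "pick the emotional tone of the
# next utterance that maximizes expected reward under a belief."

USER_STATES = ["happy", "neutral", "sad"]   # hidden user emotion
TONES = ["cheerful", "calm", "soothing"]    # emotional tone of the agent's next line

# Hypothetical observation model: P(observed reaction | hidden user state)
OBS_MODEL = {
    "happy":   {"smile": 0.7, "flat": 0.2, "frown": 0.1},
    "neutral": {"smile": 0.3, "flat": 0.5, "frown": 0.2},
    "sad":     {"smile": 0.1, "flat": 0.3, "frown": 0.6},
}

# Hypothetical reward for pairing each user state with each agent tone...
REWARD = {
    ("happy", "cheerful"): 1.0, ("happy", "calm"): 0.4, ("happy", "soothing"): 0.1,
    ("neutral", "cheerful"): 0.6, ("neutral", "calm"): 0.7, ("neutral", "soothing"): 0.3,
    ("sad", "cheerful"): 0.2, ("sad", "calm"): 0.5, ("sad", "soothing"): 1.0,
}
# ...plus a small bonus for staying "in character" (the agent's own personality).
PERSONALITY_BONUS = {"cheerful": 0.2, "calm": 0.0, "soothing": 0.0}

def update_belief(belief, reaction):
    """Bayes update of the belief over the user's hidden emotion after seeing a reaction."""
    unnormalized = {s: belief[s] * OBS_MODEL[s][reaction] for s in USER_STATES}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}

def choose_tone(belief):
    """Greedy one-step choice: the tone with the highest expected reward under the belief."""
    def expected_reward(tone):
        return sum(belief[s] * REWARD[(s, tone)] for s in USER_STATES) + PERSONALITY_BONUS[tone]
    return max(TONES, key=expected_reward)

# Simulated exchange: observe the user's reaction, update the belief, pick the next tone.
belief = {s: 1 / len(USER_STATES) for s in USER_STATES}
for reaction in ["flat", "frown", "frown", "smile"]:
    belief = update_belief(belief, reaction)
    belief_str = ", ".join(f"{s}={p:.2f}" for s, p in belief.items())
    print(f"saw {reaction:>5}; belief: {belief_str}; next tone -> {choose_tone(belief)}")
```

The system described in the paper goes well beyond this, maintaining that model over time with a memory and handling several users at once, which is what allows the agent to keep a consistent personality across a whole group.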
Furthermore, Disney’s research paper notes that “a proof of concept user study is used to demonstrate that the system can deliver and maintain distinct agent personalities during multiparty interactions.” This sounds amazingly similar to the way Baymax interacts differently with each member of the Big Hero 6 crew: Honey Lemon, Hiro, GoGo, Fred, and Wasabi.