Usability has been pumped full of magic and taken to the top of the castle. Disney Research is using facial recognition data to see how audiences react to its films. A recent paper published on the Disney Research website, entitled “Factorized Variational Autoencoders for Modeling Audience Reactions to Movies” (say that three times fast), describes a system for tracking facial reactions to help quantify whether a film is conveying the story it intended. Everything from laughs, cries, and screams down to the smaller eye twitch, lip snarl, or tiny grin becomes a data point to help the quants and filmmakers make decisions about their films during screenings.
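To get a feel for the core idea, here is a minimal, hypothetical sketch in Python. The paper's actual method couples a variational autoencoder with tensor factorization; this sketch shows only the factorization half, on synthetic data. The premise: if you stack every viewer's reaction intensity over time into a big matrix, a small number of shared "reaction patterns" (everyone laughs at the same jokes) should explain most of it. All names and numbers below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_viewers, n_frames, k = 40, 200, 3

# Synthetic "ground truth": each viewer mixes k shared reaction
# patterns over time (e.g., laughs at jokes, gasps at surprises),
# plus a little per-frame noise.
patterns = rng.standard_normal((k, n_frames))   # shared temporal patterns
weights = rng.standard_normal((n_viewers, k))   # per-viewer mixing weights
reactions = weights @ patterns + 0.01 * rng.standard_normal((n_viewers, n_frames))

# Recover a rank-k approximation with truncated SVD: a handful of
# latent factors reconstructs the whole audience's reactions.
U, s, Vt = np.linalg.svd(reactions, full_matrices=False)
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]

rel_err = np.linalg.norm(reactions - approx) / np.linalg.norm(reactions)
print(f"relative reconstruction error: {rel_err:.3f}")
```

Because the reactions really are low-rank plus noise, three factors reconstruct the matrix almost perfectly; the practical payoff the paper points at is that once the factors are learned, a few seconds of a new viewer's face are enough to predict their reactions for the rest of the film.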
How cool is this?
This is another great example of an organization using data to improve the customer experience, or, more importantly from a shareholder perspective, to maximize a film's impact and draw the best possible reviews and subsequent audiences.
I have yet to see another organization get this granular with these details. The micro-expressions the system can pick up are astounding and will provide a level of data that has yet to be leveraged in this industry.
I’d love to see this approach applied to the world of product development someday. While we have our tools for gathering usability feedback, facial expressions could open a whole new world of possibilities for us. And as Mark Wilson over at FastCo pointed out, imagine a future where Disney can use this data in its parks, allowing the cast and characters to react to your mood in real time.
Now that’s magic.