We recently spoke with emotion artificial intelligence (AI) company, Affectiva, about how Emotion AI is emerging as a new opportunity for developers. Affectiva recently announced that its emotion recognition SDK has been integrated with Unity. Developers in a range of industries from gaming, education, robotics and healthcare, to experiential marketing and more are already using Affectiva’s software. Now they can infuse Emotion AI into their apps and digital experiences for more dynamic and authentic interactions that sense and adapt to human emotion.
Boisy Pitre, Emotion AI Evangelist at Affectiva, shared his perspective on how real-time emotion awareness is the next frontier in humanizing technology, offering examples of how developers can tap Affectiva’s SDKs to create adaptive games, apps and experiences.
ADM: Tell us a bit about Affectiva, and the company’s SDKs.

Pitre: We are an MIT Media Lab spin-off and consider ourselves the pioneers of Emotion AI. We coined the term and are now defining this category, which is essentially the next frontier of artificial intelligence. We live in a world full of hyper-connected devices, smart technology, advanced AI systems -- lots of IQ, but no EQ -- these technologies are not able to adapt to human emotion. I think that is a problem: it not only affects how we interact with technology, but also how humans communicate with each other as more and more of these interactions take place in a digital context. Consequently, we are on a mission to bring emotional intelligence to the digital world, thereby humanizing technology.
Our software measures emotions through facial expressions, unobtrusively, using a standard webcam or device camera. We first got started in media and advertising, where over 1,400 brands in 75 countries use our cloud-based product to measure consumer response to brand content such as videos and ads. But in recent years we have rapidly expanded into other verticals. Now that we have SDKs across a broad range of platforms, developers can license our technology and integrate emotion recognition in whatever they are building: mobile apps, games, devices, enterprise applications, digital experiences etc., so that these can respond to human emotion in real time.
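To make the integration model concrete, here is a minimal sketch of the pattern described above: an app registers a callback with an emotion-recognition engine and receives per-frame scores derived from the device camera. The class and method names (`EmotionDetector`, `on_result`, `EmotionResult`) are illustrative assumptions for this sketch, not Affectiva’s actual API.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class EmotionResult:
    """Per-frame emotion intensities on a 0-100 scale (illustrative)."""
    joy: float
    anger: float
    surprise: float


class EmotionDetector:
    """Toy stand-in for an emotion-recognition engine's listener API."""

    def __init__(self):
        self._listeners: List[Callable[[EmotionResult], None]] = []

    def on_result(self, callback: Callable[[EmotionResult], None]) -> None:
        """Register a callback invoked once per analyzed camera frame."""
        self._listeners.append(callback)

    def _process_frame(self, result: EmotionResult) -> None:
        # In a real SDK this would be driven by live camera frames;
        # here we simply fan the result out to registered listeners.
        for cb in self._listeners:
            cb(result)


# App code: react in real time to what the camera sees.
detector = EmotionDetector()
detector.on_result(lambda r: print(f"joy={r.joy:.0f}"))
detector._process_frame(EmotionResult(joy=85.0, anger=2.0, surprise=10.0))
```

The point of the callback shape is that the app never polls: the engine pushes a result for every frame it analyzes, and the app decides how to respond.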
ADM: Talk a bit about the catalyst for adapting Affectiva’s SDK for Unity. Prior to having Emotion AI, how did game developers achieve adaptive gaming experiences?

Pitre: Prior to our Emotion AI solution, which requires only a webcam, it was actually quite challenging for game developers to create games that dynamically adapted to human emotions. To capture biofeedback, players had to use wearable devices and sensors, making games much less enjoyable and more expensive to play. Because our technology integrates into a game so easily and unobtrusively, these games can now truly captivate users and provide an all-encompassing and adaptive gaming experience.
Other forms of adaptive gaming can be seen in more traditional, brute force ways. In these scenarios, game designers have to rely on broad cues to determine that the user is having a difficult time. For example, in some games if you take too long to do a certain task, or you die too many times at one juncture, it will make it easier for you to advance. While these are adaptive experiences, they are usually a bit clumsy and easy to notice. With Emotion AI, the game designer can seamlessly move the player along and provide just the right level of challenge, neither frustrating nor boring them.
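One way to picture the difference from brute-force heuristics is a small difficulty controller driven by a per-frame emotion score. The sketch below smooths a hypothetical 0-100 "frustration" metric with an exponential moving average and nudges difficulty up or down when the player is sustainedly struggling or coasting; the metric name and thresholds are assumptions for illustration, not values from Affectiva’s SDK.

```python
class AdaptiveDifficulty:
    """Adjust game difficulty from a smoothed per-frame emotion score.

    Illustrative sketch: assumes an SDK reports a 0-100 "frustration"
    intensity per analyzed camera frame.
    """

    def __init__(self, ease_at=70.0, harden_at=20.0, smoothing=0.1):
        self.ease_at = ease_at      # sustained frustration above this -> ease off
        self.harden_at = harden_at  # sustained frustration below this -> ramp up
        self.smoothing = smoothing  # moving-average weight given to each new frame
        self.avg = 0.0              # smoothed frustration estimate
        self.level = 5              # difficulty on a 1-10 scale

    def on_frame(self, frustration_score: float) -> int:
        """Call once per analyzed frame; returns the current difficulty level."""
        self.avg = (1 - self.smoothing) * self.avg + self.smoothing * frustration_score
        if self.avg > self.ease_at and self.level > 1:
            self.level -= 1   # player is struggling: quietly back off
            self.avg = 50.0   # reset toward neutral after each adjustment
        elif self.avg < self.harden_at and self.level < 10:
            self.level += 1   # player is coasting: raise the challenge
            self.avg = 50.0
        return self.level
```

Because the adjustment keys off a sustained smoothed signal rather than a single spike (or a death counter), the change in difficulty is gradual and far harder for the player to notice than the brute-force cues described above.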
So why did we come out with our Unity plug-in? We already had game developers using our Android and iOS SDKs to build mobile games, such as Flinch. However, we received many requests for a Unity plug-in from developers building in Unity 3D. Interestingly, these requests were not only from game developers, but also from creative agencies who are building unique consumer experiences and interactive advertising for their clients. Because Unity is such a prominent development platform, we decided quite quickly that we had to support it. Our Unity plug-in is now available for Mac OS X, Windows, Android and iOS.
ADM: How are game developers already using Affectiva’s Unity plug-in? Are they seeing positive/encouraging responses from players so far?

Pitre: There are a number of projects and experiments going on right now. Some of the development cycles are quite long in the gaming world and we cannot talk much about specifics until these games are released. However, we have some major companies, like BANDAI NAMCO, looking into our technology.
There are also a number of smaller studios using our SDK. Flying Mollusk created the Nevermind game, a biofeedback thriller available on Steam. Using our Unity plug-in, Nevermind can sense a player’s facial expressions for signs of emotional distress -- the game becomes more surreal and challenging when players feel scared or nervous. Emotion Hero is an Android game that uses our Unity plug-in. The game SYNC uses it as well. In fact, SYNC won the “Excellence in Innovative Narrative” award at RPI’s 2016 game fest and is downloadable from itch.io.
ADM: Beyond gaming, what are some of the other use cases or applications for Affectiva’s SDK? How are other industries approaching Emotion AI in general?

Pitre: We’re seeing interest all across the board! One particularly compelling area is social robotics. The space is exploding with a whole host of entrants bringing their own vision and ideas to market. By nature, these products are designed to interact with humans to perform any number of tasks, or maybe just for companionship. Another area where we are seeing heavy interest in Emotion AI is education. Emotional health and well-being can contribute to a better learning experience, and using Emotion AI can both reinforce learning and help detect issues in the learning process before they become major problems. An interesting education app is Little Dragon, which just launched its IndieGoGo campaign. Little Dragon is a language tutoring app that adapts in real time to the emotions of the student, providing a very personalized and happier learning experience. These are just two specific areas where Emotion AI is having an impact, and we are seeing heavy interest from a whole host of verticals. Everyone is racing to figure out how to make the best use of this technology.
ADM: What are the benefits of using Emotion AI for developers? For end-users?

Pitre: For a certain class of apps, Emotion AI adds a layer of depth that can bring real value to developers. Since the deployment of emotion-enabled apps is a relatively new frontier, there’s a ton of opportunity for developers to find new ways to apply this technology to help their users, and also to increase app adoption. For the end-user to feel comfortable with the technology, they will have to see value. I certainly think the value is there -- from helping you understand yourself better to increasing successful interactions with others.
What’s exciting is that there are already developers actively working to crack this nut and deliver on the promise of the technology. In fact, we are organizing our first-ever Emotion AI Developer Day on November 16. This online event will allow developers to get a first-hand look at how easy it is to integrate Emotion AI into their apps. They will also hear from other developers on how they have accomplished their integrations. It’s totally free, and I encourage developers to register here.
ADM: Are there any industries that you feel will benefit most from emotion-aware technology?

Pitre: While many industries stand to benefit, automotive is an interesting one. I see cars that will use emotion recognition technology for safety features that can alert for driver distraction and road rage. But there is also the driverless car, an advanced AI system that provides a safe, comfortable and engaging experience -- of course it needs to adapt its operations to the emotions of the people in it. I also feel passionate about how Emotion AI can help in consumer health, from researchers and clinicians looking to treat mental health afflictions such as depression and anxiety, to pharmaceutical companies doing drug efficacy testing, to telemedicine. For example, using Google Glass and our SDK, BrainPower has developed programs for people on the autism spectrum which help them to read and understand emotions of others.
ADM: How do you see mobile and wearables impacting the Emotion AI space?

Pitre: There’s no question that wearables and mobile are an exploding category. We’re now nearly a decade into carrying increasingly intelligent smartphones in our pockets, and just a couple of years into having intelligent timepieces on our wrists. These devices are always collecting data about our environment, our location, and our interactions with others. The interesting thing about Emotion AI is that it can use this additional data to provide context and help quantify our emotional states. For example, imagine an Emotion AI model that can use ancillary information, such as a social media interaction, or the weather, to help you better understand why you were having a bad day. Did that latté that you drank at the coffee shop help you feel more ready to take on the day?
ADM: What do you see as the next big step in Emotion AI development, and its impact on brands or users? What do you see as the tipping point for Emotion AI to break into the mainstream?

Pitre: I believe the next big step in Emotion AI is the continuing gains being made in deep learning systems, which will incorporate ever larger amounts of data, and more interesting and disparate types of data, to create new products and services. Emotion AI will help fuel growth and awareness in these new services. As people become more and more comfortable with having their emotions be part of their daily data input process, I think the technology will eventually become second nature, and we’ll wonder how we ever lived without it.
Editor’s note: As Affectiva’s Emotion AI Evangelist, Boisy advocates and promotes the adoption of the company’s emotion-based artificial intelligence technology across all of its supported platforms. He developed Affectiva’s SDK that developers can use to emotion-enable their apps and digital experiences. Having delivered the only emotion-sensing and analytics SDK for mobile devices that is available today, Boisy has pushed the boundaries of innovation in emotion tech and is now focused on continuously advancing Affectiva’s technology on mobile platforms such as iOS and Android. Boisy holds a Master of Science in Computer Science from the University of Louisiana at Lafayette, and is currently working on his PhD in computer science.