Thursday July 18th, 2024

There’s An App That Can Read Your Emotions through AI – And It Was Created by an Egyptian

At a panel at the Spark Entrepreneurship Camp, we met Rana El Kaliouby, joining all the way from the USA, the Egyptian scientist pushing the boundaries of artificial intelligence by deciphering people's emotions in real time through a webcam.

Staff Writer

We're sitting at the Spark Entrepreneurship conference as Rana El Kaliouby begins to explain, via Skype, her journey into the realm of artificial intelligence. As she speaks through a webcam, her face is framed by a series of codes and smileys revealing the emotions our interpretive powers sometimes fail to perceive. It's a sort of meta-communication: she is speaking about the very inception of this software as we experience firsthand the astonishing, perplexing, and often spine-chilling extent to which artificial intelligence has become part of our lives.

Imagine a robot or Siri being able to respond to your emotional state in real time: that's what her software, Affdex, enables. And its applications are boundless: it has enabled the creation of wearable devices for autistic children, tracking systems that let marketers monitor how audiences respond to ads, and even gaming applications, where it is used to understand the player's emotional journey and adapt the game. "In some games, the more scared you are, the harder the game gets," she explains.

The science of facial emotion dates back 200 years, to when facial expressions were studied by electrically stimulating muscles. "In the 1970s, a study called the Facial Action Coding System found that the face has 45 muscles, so we created a code for all the different facial expressions that you can do," El Kaliouby explains. Certified face readers could then codify people's emotions after 200 hours of training, but the approach is not scalable: coding a single minute of video takes five minutes of watching.

“What we did is to use computer vision, machine learning, and AI to automate this process. Our software quantifies 20 different facial expressions, which are matched to emotional states,” she says, as she shows a software demo where expressions are matched with equivalent ‘smileys’, whether disgusted, surprised, depressed, or even a Kim Kardashian-like face. So far, El Kaliouby's company, Affectiva, has amassed a data repository of 4.25 million videos from people in 75 different countries.
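In spirit, the demo she shows maps per-frame expression scores to a single dominant emotion, rendered on screen as a smiley. A minimal sketch of that last step, with invented expression names and thresholds (this is not Affdex's actual API):

```python
# Illustrative sketch: given per-frame scores (0-100) for detected facial
# expressions, pick the dominant emotion label, as in the on-screen smileys.
# Expression names and the threshold are made up for illustration.
def dominant_emotion(scores, threshold=50.0):
    """Return the highest-scoring emotion at or above threshold, else 'neutral'."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    return label if score >= threshold else "neutral"

frame = {"joy": 82.0, "surprise": 11.5, "disgust": 3.0}
print(dominant_emotion(frame))  # -> joy
```

The real pipeline's hard part, of course, is producing those scores from raw webcam pixels, which is where the computer vision and machine learning come in.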

The software is so meticulous that it manages to distinguish between, for example, a smile and a smirk. "By providing the machine with tens of thousands of images of people smiling and others smirking, the computer uses deep learning (what Google and Facebook use to recognize who you are in pictures) to quantify that information, so that the next time you show it an image of someone it's never seen before, it can recognize the emotion," she explains.
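The supervised approach she describes, training on thousands of labeled examples and then classifying a face the system has never seen, can be sketched in miniature. Real systems use deep neural networks on pixels; this toy stands in with a nearest-centroid classifier over two hand-made features (lip-corner lift and left/right symmetry), both invented purely for illustration:

```python
# Miniature supervised learning, standing in for the deep-learning pipeline:
# train on labeled examples, then classify a face the model has never seen.
# The features (lip-corner lift, symmetry) and their values are invented.
def train(examples):
    """Compute one mean feature vector (centroid) per label."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, features):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    return min(centroids, key=lambda lbl: sum(
        (a - b) ** 2 for a, b in zip(centroids[lbl], features)))

# Labeled training data: [lip-corner lift, symmetry] -> expression label.
# A smirk lifts one corner of the mouth, so its symmetry score is low.
training = [
    ([0.9, 0.9], "smile"), ([0.8, 1.0], "smile"),
    ([0.7, 0.2], "smirk"), ([0.6, 0.1], "smirk"),
]
model = train(training)
print(classify(model, [0.85, 0.95]))  # an unseen face -> smile
```

The principle is the same at scale: more labeled examples and richer learned features (rather than two hand-picked numbers) are what let the real system separate subtle expressions reliably.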

Going over the lessons learned in her entrepreneurial journey, El Kaliouby says picking the one thing that you become the world’s expert in is key. “I feel that has helped us a lot as we expanded and raised money for the company. People know that we have the best core science because we’ve been doing it for so long and we are recognized as world leaders,” she says.

Last May, her company raised $14 million in investment, led by Fenox Venture Capital, bringing its total venture capital to $34 million, including a $20 million earlier round from investors like WPP, Li Ka-shing's Horizons Ventures, and KPCB. It's an outcome she had never expected 12 years earlier, when she finished her postgraduate studies at Cambridge University and headed back to her hometown of Cairo.

It was toward the end of her PhD studies in 2004 that an MIT professor invited her to join MIT. "I had a husband back home, so for a few years, I had to commute back and forth to Cairo," she recalls.

Together with the professor, El Kaliouby decided to focus on one application of the technology she was developing as part of her PhD: autism. "Individuals who have autism have difficulty with emotion regulation and really struggle to understand people's emotions and their facial expressions. That's why they usually avoid eye contact, staring at the ceiling for example, because for them it's overwhelming to see all these facial movements and track them in real time," she says.

By 2006, El Kaliouby had developed wearable devices for autism, which she trialed at a school for autistic children. But it was at the MIT sponsorship day that the path to entrepreneurship began to take shape. "The MIT Media Lab, which is sponsored mostly by industry, hosts its industry sponsors for a demo day twice a year. For three years in a row, we'd get these amazing companies saying 'we actually want your tech to test our products.' Bank of America, for example, was interested in finding out people's response when they interact with ATMs; Microsoft was interested in education and Disney was interested in robotics," she recalls. By the end of 2008, plenty of companies were vying to license her technology.

“My first reaction was: ‘I’m not a business person!’ but then I realized that by starting a business we have a unique opportunity to take a core technology and bring that to the market, changing the way people do things on a daily basis. That was the tipping point,” she emphasizes.

Her company Affectiva was thus born in 2009, aiming to be the emotional AI platform and driving innovation across a myriad of industries such as media and advertising, gaming, online learning and health.

"For online education, for example, a smart teacher would adapt content to the student's emotional response to create more engaging learning experiences. Social robotics is also a big area. I just got back from Tokyo, where I found social robots in airports and hotels acting as concierges that guide you and walk you up to your room. Right now, they are very rudimentary, because they don't understand how you react; so our technology could really transform them," she says.

But aside from being an avant-garde business, Affectiva boasts the world's largest emotion database, comprising five million analyzed faces from 75 countries. "So far, we've tested people's responses to 22,000 ads, allowing advertisers to test the response to their ad and track, moment by moment, how people react, in order to fine-tune it. This is fascinating because people think emotions are something really personal; instead, we are quantifying them," she says.

Check out Affectiva's website for more information, and try Affdex to get a glimpse of emotional AI.

