
Smart Eye Executive Vice President Rana el Kaliouby talks about artificial intelligence in cars


Rana el Kaliouby co-founded and leads Boston-based startup Affectiva, which uses artificial intelligence and computer vision to analyze mood and emotion.


She has now taken on a new job as Executive Vice President of Smart Eye, the Swedish eye-tracking company that bought Affectiva for $73.5 million in June.

El Kaliouby says this is just the beginning of where in-car AI systems are headed. This interview has been edited for length and clarity.

Q: Ten years from now a family is in a car. What can your technology do on their journey?

A: Well, the family is in the car. You have two children in the back seat. First of all, children quarrel. The car knows this and can see that the mother, who is driving, is frustrated, a little frazzled, and distracted. The vehicle intervenes by recommending content for the children, or, through a conversational interface, by mediating a game between them. They play for a while, then fall asleep. The car can see this, so it dims the lights and turns off the music or the movie. Then the car realizes the mom is exhausted and starting to doze off too, so it kicks into a conversational mode to re-engage her. Then Mom gets out of the car, forgetting the baby is there, and receives a text that says, "Oh, you may have forgotten Little Baby Joe!" I'm making this up on the spot, but the system can basically customize the entire cabin experience (music, lighting, and temperature) based on knowing who is inside the car and what they are doing.

Q: What does Affectiva bring to Smart Eye and vice versa?

A: Smart Eye is a 22-year-old company. What they have focused on in the past couple of years, and they are the undisputed market leaders here, is driver monitoring. They can very accurately determine where a person is looking, and they also monitor eye behavior, so they can identify when the driver is distracted or drowsy. They have been contracted by 13 global auto manufacturers. Affectiva spun out of MIT 12 years ago, and our focus is on humanizing technology by bringing emotional intelligence to machines. We anticipate an evolution from driver monitoring to monitoring everything that happens inside the car: What are the occupants' moods and emotions? What activities are they engaged in? These sensors become the eyes and ears of the car.

Q: How do you find out someone’s mood or emotions?

A: We do a lot of face analysis, but we have expanded to tracking key points on the body as well, so we can detect what people are actually doing. Are you slouching in the car? Are you upset? We are watching all of that.

Q: What in someone's face tells you that they are terrified?

A: There are facial expressions of fear. You can also start tracking other vital signs, such as heart rate, heart rate variability, and respiratory rate, via an optical sensor. This is the direction we are heading. It's not at all ready for prime time, but it is something Affectiva and Smart Eye are exploring. Once you know a person's baseline, you can see whether they are deviating from that baseline, and the vehicle can report it.
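The baseline idea described above can be sketched very simply. This is an illustrative example only, not Affectiva's or Smart Eye's actual method: it flags when a measurement (here, a hypothetical heart-rate reading in BPM) drifts more than a few standard deviations from a driver's previously learned baseline.

```python
# Illustrative sketch only: flag deviation from a learned baseline.
# The data and threshold here are hypothetical.
from statistics import mean, stdev

def deviates_from_baseline(baseline_samples, current, threshold=3.0):
    """Return True if `current` is more than `threshold` standard
    deviations away from the mean of the baseline samples."""
    mu = mean(baseline_samples)
    sigma = stdev(baseline_samples)
    if sigma == 0:
        return current != mu
    return abs(current - mu) / sigma > threshold

# A driver's resting heart rate collected on earlier trips:
baseline = [68, 70, 72, 69, 71, 70, 68, 73]
print(deviates_from_baseline(baseline, 71))   # within baseline -> False
print(deviates_from_baseline(baseline, 110))  # sharp spike -> True
```

A production system would of course use far richer models and more signals, but the core idea (learn a personal baseline, then watch for deviation) is the same.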

Q: How do you guard against fears that you might misread someone’s feelings or moods based on race, gender, and neurodiversity?

A: This is one of the things Affectiva really brings to the table; it's something we have worked very hard on. It starts with data diversity. If you train an algorithm on middle-aged white men, that is what it will learn. The training set is essential, and diversity there means everything from racial and ethnic diversity to diversity in appearance and facial expression: people may wear glasses or headscarves, or have beards. We partner with synthetic-data companies to augment our data sets and fill gaps. The second thing is how you validate the accuracy of the algorithms. If you only look at overall accuracy, it can mask biases that exist in specific subpopulations, so we dissect the data to ensure that no bias has crept in. And finally, team diversity is how we overcome these blind spots.

Q: What about the privacy of people who do not want to be analyzed or seen in the car?

A: In cars, the good news is that none of the data is logged. You do all the processing in real time and infer, for example, whether the driver is drowsy, and hopefully the car responds to keep the driver safe. I think there should be a lot of communication with consumers and transparency about exactly what the sensor does. I imagine there will be scenarios where you can turn it off. But if it's a safety issue, like your semi-autonomous car needing to know whether you're paying attention so it can hand control back and forth, I imagine you might not be allowed to turn it off.
