Can Technology Read Your Emotions?


The basic idea behind emotion recognition technology isn’t terribly complicated.

Clues about your emotions come from facial and body expressions, the tone of your voice, the words you speak, and biometric data such as your pulse and heart rate. To get this information, the technology relies on microphones, cameras, sensors in various places (such as cars), and devices such as smartphones.
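To make that concrete, here is a minimal Python sketch of the pipeline described above: several signal streams are reduced to numeric features and fused into a coarse label. Every name, weight, and threshold in it is a hypothetical stand-in for illustration, not any vendor’s actual method.

```python
# Toy multimodal fusion: voice, face, and biometric features
# combined into a coarse emotion label. All values are illustrative.
from dataclasses import dataclass


@dataclass
class Signals:
    voice_loudness: float  # normalized 0..1, from a microphone
    speech_rate: float     # words per second, from speech-to-text
    smile_score: float     # 0..1, from a face-tracking camera
    heart_rate: float      # beats per minute, from a wearable


def classify_emotion(s: Signals) -> str:
    """Rule-based fusion: map the channels onto arousal/valence axes."""
    arousal = (0.5 * s.voice_loudness
               + 0.3 * min(s.heart_rate / 120, 1.0)
               + 0.2 * min(s.speech_rate / 4, 1.0))
    valence = s.smile_score  # crude proxy: smiling = positive
    if arousal > 0.6:
        return "excited" if valence > 0.5 else "distressed"
    return "content" if valence > 0.5 else "neutral"


print(classify_emotion(Signals(voice_loudness=0.8, speech_rate=3.5,
                               smile_score=0.2, heart_rate=110)))
# -> distressed
```

Production systems replace such hand-tuned rules with trained models, but the shape of the problem is the same: noisy sensor data in, an emotion label out.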

Sometimes the signal may be fairly obvious: A louder voice might indicate anger, distress, or excitement. For example, Amazon’s Halo wearable analyzes the tone of the user’s voice for qualities such as positivity. Cars typically detect fatigue by studying driving patterns, although more elaborate future systems may also monitor a wider array of inputs, including clues from the driver’s eyes, facial expressions, and heart rate, as well as signals from the car itself.
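The loudness cue is simple enough to sketch directly. The hypothetical example below compares the root-mean-square energy of a speech clip against the speaker’s baseline; real systems use far richer acoustic features (pitch, spectral shape, pacing), but the underlying thresholding idea is similar.

```python
import numpy as np


def rms(samples: np.ndarray) -> float:
    """Root-mean-square energy of an audio frame."""
    return float(np.sqrt(np.mean(samples ** 2)))


def voice_is_elevated(clip: np.ndarray, baseline_rms: float,
                      ratio: float = 1.8) -> bool:
    """True if the clip is substantially louder than the speaker's
    baseline. The 1.8x ratio is an arbitrary illustrative threshold."""
    return rms(clip) > ratio * baseline_rms


# Synthetic audio: a "calm" clip versus the same clip twice as loud.
rng = np.random.default_rng(0)
calm = 0.1 * rng.standard_normal(16_000)  # 1 second at 16 kHz
loud = 2.0 * calm
print(voice_is_elevated(loud, baseline_rms=rms(calm)))  # True
```

Note that even this trivial detector needs a per-speaker baseline, which hints at why context matters so much in the real systems discussed below.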

If the basic concept is straightforward, accurately interpreting signals from diverse people and cultures poses huge programming challenges. Just consider how difficult it is to understand what someone else’s raised eyebrow or sly smile might actually mean.

Experts warn that these systems can show bias because they fail to properly account for racial, gender, and other differences. For example, the author of a 2018 research paper used emotion recognition technology from Microsoft and Face++ to study the portraits of more than 400 NBA basketball players posted on the site basketball-reference.com. Microsoft registered the Black players as more contemptuous than the white players, and Face++, developed by a China-based company, found the Black players generally angrier.
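The audit behind that finding is conceptually simple: run the same emotion model over comparable portraits from each group and compare the averages. The sketch below illustrates the idea with made-up placeholder scores, not the paper’s actual data.

```python
import statistics

# Hypothetical per-image "anger" scores a model might return for
# comparable, neutral-expression portraits from two groups.
scores_group_a = [0.31, 0.28, 0.35, 0.40, 0.33]
scores_group_b = [0.12, 0.15, 0.10, 0.18, 0.14]

print(f"mean anger, group A: {statistics.mean(scores_group_a):.2f}")  # 0.33
print(f"mean anger, group B: {statistics.mean(scores_group_b):.2f}")  # 0.14

# A persistent gap on comparable portraits is the red flag: the model
# attributes more anger to one group for the same expression.
```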

For all the passionate discussion about the potential of AI, some top experts doubt the technology will ever interpret emotions accurately. Obstacles include differences between people, cultures, and circumstances. Machines may just not be able to pick up on the subtleties.

“When you don’t have the context, it isn’t going to be possible to accurately place that emotion in a bucket,” says Rita Singh, an associate research professor at Carnegie Mellon University and author of the book “Profiling Humans from their Voice.” “Machines cannot fill in the gaps like humans can.”

In most cases, she says, a human salesperson will be more effective than a machine. She gives the example of a salesperson who learns what she does for a living. “The bottom line is, I am a professor, I have money, and I can spend it, right?” she says. “In many contexts, what you say is more important than how you say it. So the underlying emotion may not even be important.”

The potential for misinterpretation and abuse has prompted some experts to seek limits on emotion recognition; the AI Now Institute at New York University has called for a ban on its use in important decisions that affect people’s lives.

Some industry leaders, such as Affectiva co-founder Rana el Kaliouby, author of “Girl Decoded: A Scientist’s Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology,” advocate regulation to prevent abuses.

“There’s incredible opportunity to do good in the world, like with autism, with automotive, our cars being safer, with mental health applications,” she says. “But let’s not be naive, right? Let’s acknowledge that this could also be used to discriminate against people. And let’s make sure we push for thoughtful regulation, but also, as entrepreneurs and as business leaders, that we guard against these cases.”
