Communication Re-Imagined with Emotion AI


From ReadWrite:

There has long been a chasm between what we perceive artificial intelligence to be and what it can actually do. Our film, literature, and video game representations of “intelligent machines” depict AI as detached but highly intuitive interfaces. With emotion AI, we will find communication re-imagined.

. . . .

As these artificial systems are integrated into our commerce, entertainment, and logistics networks, we are witnessing the emergence of emotional intelligence in machines. These smarter systems have a better understanding of how humans feel and why they feel that way.

The result is a “re-imagining” of how people and businesses can communicate and operate. These smart systems are drastically improving the voice user interfaces of the voice-activated assistants in our homes. AI is not only improving facial recognition but also changing what is done with that data.

. . . .

Humans use thousands of paralinguistic cues when they communicate. The tone of a voice, the speed at which someone speaks: these are hugely important parts of a conversation, but they aren’t part of the “raw data” of that conversation.

New systems designed to measure these verbal interactions can now infer emotions such as anger, fear, sadness, happiness, or surprise from dozens of metrics tied to specific cues and expressions. Algorithms are being trained to evaluate the minutiae of speech in relation to one another, building a map of how we read each other in social situations.
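
To make that pipeline concrete, here is a minimal, purely illustrative sketch of the kind of system the article describes: a handful of per-utterance speech metrics fed to an off-the-shelf classifier. The feature names, the synthetic training data, and the choice of a scikit-learn random forest are all assumptions for demonstration; real systems are trained on large corpora of human-annotated recordings.

```python
# Toy sketch: map per-utterance speech metrics to an emotion label.
# All data here is synthetic; it only illustrates the shape of the pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-utterance metrics: mean pitch (Hz), pitch variability (Hz),
# loudness (mean RMS), and speaking rate (syllables per second).
FEATURES = ["pitch_mean", "pitch_std", "rms_mean", "speaking_rate"]
EMOTIONS = ["anger", "fear", "sadness", "happiness", "surprise"]

rng = np.random.default_rng(0)
means = np.array([180.0, 40.0, 0.05, 4.5])          # rough speech-like scales
X_train = means + rng.normal(size=(500, 4)) * means * 0.3  # stand-in measurements
y_train = rng.choice(EMOTIONS, size=500)            # stand-in human annotations

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify one new utterance described by the same four metrics.
utterance = np.array([[185.0, 45.0, 0.07, 5.2]])
print(clf.predict(utterance)[0])
```

With synthetic labels the prediction is of course meaningless; the point is only that “dozens of metrics” reduce each utterance to a feature vector that a standard classifier can learn from.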

Systems are increasingly able to analyze the subtext of language based on the tone, volume, speed, or clarity of what is being said. Not only does this help them better identify the gender and age of a speaker, but it also makes them increasingly adept at recognizing when someone is excited, worried, sad, angry, or tired. While real-time integration of these systems is still in development, voice-analysis algorithms are steadily improving at identifying critical concerns and emotions.
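
As a rough illustration of how “tone, volume, and speed” can be measured from a recording, the sketch below uses the open-source librosa library on a hypothetical file speech.wav (both assumptions; the article names no specific toolkit), estimating pitch for tone, RMS energy for volume, and onset density as a crude proxy for speaking rate.

```python
# Minimal sketch: extract tone, volume, and speed cues from one recording.
# Assumes librosa is installed and a local file speech.wav exists.
import numpy as np
import librosa

y, sr = librosa.load("speech.wav", sr=None)
duration = librosa.get_duration(y=y, sr=sr)

# Tone: fundamental-frequency (pitch) track; NaN where a frame is unvoiced.
f0, voiced_flag, voiced_probs = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
pitch_mean = np.nanmean(f0)
pitch_std = np.nanstd(f0)

# Volume: root-mean-square energy per frame.
rms = librosa.feature.rms(y=y)[0]

# Speed: onset density as a rough stand-in for speaking rate.
onsets = librosa.onset.onset_detect(y=y, sr=sr)
rate = len(onsets) / duration

print(f"pitch {pitch_mean:.0f}±{pitch_std:.0f} Hz, "
      f"loudness {rms.mean():.3f} RMS, ~{rate:.1f} onsets/sec")
```

Summary statistics like these are exactly the sort of feature vector the classifier sketch above would consume.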

. . . .

The result of this is a striking uptick in the ability of artificial intelligence to replicate a fundamental human behavior. Alexa developers are actively working to teach the voice assistant to hold conversations that recognize emotional distress; the US government is using tone-detection technology to spot the signs and symptoms of PTSD in active-duty soldiers and veterans; and increasingly advanced research is examining how specific physical ailments, such as Parkinson’s, affect a person’s voice.

While this work has so far been done at a small scale, it shows that the data behind someone’s outward expression of emotion can be cataloged and used to evaluate their current mood.

Link to the rest at ReadWrite

1 thought on “Communication Re-Imagined with Emotion AI”

  1. isn’t that exactly what psychopaths do? observe and catalog emotions (that they don’t experience), then mimic them in order to manipulate the beliefs and behaviors of others? at the moment the only distinguishing characteristic of AI vs a human psychopath is an intrinsic set of wants/goals borne of self-awareness. AI’s goals are currently provided by human programmers (who hopefully/presumably are not psychopaths themselves).
