Originally Published in Destination CRM
Written by Mark Smith
In a time when so many of the interactions we used to have face-to-face are moving online and workforces are distributed, the kind of organic emotional capture we used to do naturally (responding quickly to an upset in-store patron, for example) simply can't happen. Gauging emotion from afar, in digital environments, is becoming even more important.
The relationship between technology and human emotion is increasingly characterized by blurred lines. Robots, more than ever before, are capable of mimicking, replicating, and, most importantly, understanding human behavior.
The emotional connection between humans and technology is moving from the figurative to the literal. On the one hand, you have technology that can more and more realistically mimic human emotion. At the Consumer Electronics Show this year, for example, Samsung subsidiary STAR Labs debuted a project called Neon, featuring realistic human avatars that can interact in real time and rely on individualized AI models to replicate human behavior.
On the other end of the spectrum, there’s the still-blossoming field of artificial emotional intelligence, or emotional AI. Instead of replicating human behavior and emotions, like Neon, emotional AI seeks to perceive and interpret human emotions. It has been around for a long time, primarily in the form of sentiment analysis—companies of all sizes have been investing in ways to understand human emotions through video or audio for more than a decade. We’re beginning to reach something approaching critical mass, however, with Gartner estimating that emotional AI will influence more than half of all the ads you’ll see online by 2024.
Read the rest at Destination CRM: https://www.destinationcrm.com/Articles/ReadArticle.aspx?ArticleID=140693