SEMA News—January 2020
Selling With Emotion
Emotion AI will be proliferating through society, ready or not, so it makes good sense to know what’s going on and collect insights on how the technology can reportedly help you make more sales.
Given that face-to-face selling often results in communications breakdowns, it’s no wonder that artificial intelligence (AI) programmers are designing solutions that let machines help sell by sensing customer emotions. These AI solutions use cameras and listening devices to determine whether a customer is happy or sad, angry or uninterested, and alter their sales pitches based on that assessment.
The new tech, known as Emotion AI, is so hot that market research firm Gartner predicts it will be embedded in 10% of all personal tech devices by 2022.
“Companies lose billions of dollars each year due to poor customer service,” said Brian Loveys, program director with IBM’s Watson Products Group. “In fact, 53% of U.S. adults are likely to abandon their online purchase if they cannot find quick answers to their questions.”
Simultaneously, businesses “know that artificial intelligence has the potential to improve customer service in new ways,” Loveys added.
In practice, those new ways (as implemented in the retail sector) have given rise to store shelves that sense and respond to customers walking by. And Emotion AI is also springing up in call centers, where AI-driven software listens in on calls and text chats, responding to customers based on their emotions and tipping off human agents about the next caller’s mood.
Still other applications serve as an AI eye-in-the-sky for retailers, determining whether customers leave the store satisfied. Some can even send coupons to customers’ smartphones to clinch a sale.
One caveat: There are some psychologists who question the accuracy of these AI systems, as evidenced by a recent report released by the Association for Psychological Science (www.psychologicalscience.org/news/releases/emotion-expressions-pspi.html). The report’s premise is that it’s possible for AI systems to easily misread emotions, given that the look on someone’s face often belies what he or she is feeling, according to Lisa Feldman Barrett, a co-author of the report.
“On average, people scowl when angry approximately 25% of the time,” Barrett said. “But they also move their faces in other meaningful ways when angry. They might cry or smile or widen their eyes and gasp. And they also scowl when not angry—such as when they are concentrating or when they have a stomachache. Similarly, most smiles don’t imply that a person is happy. And most of the time people who are happy do something other than smile.”
Either way, Emotion AI is proliferating through society, ready or not, so it pays to know how the tech can reportedly help you make more sales. Below are a few ways.
Shelfpoint (www.shelfpoint.com): One of the higher-profile Emotion AI applications, Shelfpoint transforms the sides of retailer shelves that face customers into emotion-sensing devices, which change colors, offer different graphics, and change pricing based on the emotions they sense.
Manufactured by Cloverleaf, Shelfpoint works by using tiny cameras on the shelves that continually feed images of passing customers to the system’s AI software. The software reads customer facial expressions to infer emotions, including joy, sadness, anger, fear or surprise. The tech also collects other customer characteristics, including age, gender and ethnic group.
AI analysis of all the data is done in the cloud on Shelfpoint’s servers. The software subsequently triggers changes in the graphics and pricing on the sides of shelves based on what it “sees.”
Down the line, Cloverleaf is promising even more. An enhancement of the system will trigger changes in the graphics and pricing based on shopper age and/or the gender it perceives.
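The decision flow Cloverleaf describes—sense an emotion, then swap the shelf graphics and pricing—can be sketched in a few lines. Everything below is invented for illustration; the function name, the emotion-to-graphic rules, and the discounts are assumptions, not anything drawn from Shelfpoint’s actual system.

```python
# Hypothetical Shelfpoint-style decision rule: map a detected shopper
# emotion to a graphics theme and an adjusted price. The rule table and
# discount amounts are made up for illustration only.

def choose_display(emotion: str, list_price: float) -> dict:
    """Pick shelf graphics and price for a detected shopper emotion."""
    rules = {
        "joy":      {"graphic": "celebratory", "discount": 0.00},
        "surprise": {"graphic": "informative", "discount": 0.00},
        "sadness":  {"graphic": "reassuring",  "discount": 0.10},
        "fear":     {"graphic": "reassuring",  "discount": 0.10},
        "anger":    {"graphic": "calming",     "discount": 0.15},
    }
    rule = rules.get(emotion, {"graphic": "default", "discount": 0.0})
    return {"graphic": rule["graphic"],
            "price": round(list_price * (1 - rule["discount"]), 2)}
```

In a real deployment, the `emotion` input would come from the cloud-based image analysis the article describes, and the output would drive the shelf-edge display.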
IBM Tone Analyzer (www.ibm.com/watson/services/tone-analyzer): Yet another AI application spun off from IBM’s famous Watson computing system (which bested human competitors on the TV show “Jeopardy” in 2011), Tone Analyzer distills the emotions behind what people are writing online, determining whether they’re happy, sad, confident, angry, empathetic and the like.
The tech is designed to be integrated with chatbots. So in a perfect world, your automated sales chatbots will be able to pitch your customers while using Tone Analyzer to sense their reactions to those pitches.
Granted, many people feel that chatbots are currently more of an annoyance than anything else, but companies such as IBM are promising that next-generation chatbots will be imbued with AI and will be able to engage in sophisticated conversations—a next-generation improvement that could make Tone Analyzer potentially very potent.
Tone Analyzer can also be integrated into call-center systems and enable call-center managers (using a dashboard) to monitor the emotions of all callers at all times. With that kind of oversight, managers will be able to jump in and take over a sales call that is going sour. Or they can task especially gifted salespeople to do the same.
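The monitoring loop described above—score each message for negative tone and flag a call once trouble accumulates—can be sketched as a toy. The keyword list and threshold below are invented; a production system such as IBM’s relies on trained language models, not word lists.

```python
# Toy tone monitor in the spirit of the call-center use described above.
# ANGRY_WORDS and the escalation threshold are assumptions for illustration.

ANGRY_WORDS = {"terrible", "awful", "refund", "ridiculous", "worst", "cancel"}

def tone_score(message: str) -> int:
    """Count angry keywords in one customer message."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return len(words & ANGRY_WORDS)

def needs_escalation(transcript: list[str], threshold: int = 2) -> bool:
    """Flag a call for a manager once angry signals accumulate."""
    return sum(tone_score(m) for m in transcript) >= threshold
```

A dashboard like the one described would run a check of this kind continuously and surface flagged calls so a manager or a gifted salesperson can jump in.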
With Emotion AI, machines are able to alter sales pitches and customer service based on the emotions they sense.
IBM Watson Assistant (www.ibm.com/cloud/watson-assistant): Similar to Tone Analyzer, Watson Assistant enables a centralized, automated answering system to help customers looking to make a purchase, looking for service, or looking to engage with your business in any number of other ways. Far from a know-it-all, Watson Assistant is programmed with AI to recognize the questions it can answer, provide a knowledge-based article to respond to more in-depth questions, or transfer an internet chat request to a specialized, live agent.
Businesses can customize Watson Assistant to their needs by inputting numerous examples of how online chat questions are handled by their current human operators—including how the human operators handled anger, impatience and similar emotions in customers. The AI system uses those examples to train itself on how to handle future chats, mirroring similar questions and situations.
The AI promise of Watson Assistant is that over time, using both histories of chats and its own experiences of answering questions, it will continually get smarter and more proficient at handling any type of customer question that comes its way.
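The train-by-example idea above can be caricatured as a nearest-example matcher: answer a new question with the reply attached to the most similar stored example, and hand off to a live agent when nothing matches well. This is an invented sketch, not Watson Assistant’s method—real assistants use learned intent classifiers rather than word overlap.

```python
# Illustrative train-by-example matcher: every name and threshold here is
# an assumption made for this sketch, not part of any IBM product.

def overlap(a: str, b: str) -> int:
    """Count words two utterances share, ignoring case."""
    return len(set(a.lower().split()) & set(b.lower().split()))

def answer(question: str, examples: dict[str, str], min_overlap: int = 2) -> str:
    """Return the stored reply for the closest example, or hand off."""
    best = max(examples, key=lambda ex: overlap(question, ex))
    if overlap(question, best) >= min_overlap:
        return examples[best]
    return "TRANSFER_TO_LIVE_AGENT"
```

The hand-off branch mirrors the article’s point that Watson Assistant is “far from a know-it-all”: it is designed to recognize what it can answer and route the rest to a specialized, live agent.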
Market Builder Voice: Another call-center solution, Market Builder Voice uses AI to analyze the emotional state of every person who calls a company or store. Developed by Audeering (www.audeering.com) and GfK (www.gfk.com), the solution is also designed to assess the emotional state of customers and other callers who are waiting on hold for service from company representatives.
The underlying concept is that your sales staff or customer support staff will know the customer’s mood even before they pick up the phone. Interestingly, Market Builder Voice is also programmed to distinguish between authentic comments and sarcasm. So when your customer tells a salesperson “I really like your service,” you’ll know right away if that’s a good or a bad thing.
According to Audeering and GfK, the solution is effective with people from all cultural backgrounds. And its use reportedly results in sales and service calls that are shorter and more productive.
In-Shop C-SAT (www.entropiktech.com): Manufactured by Entropik Tech, In-Shop C-SAT is a kind of eye-in-the-sky for your store. It tracks whether your customers leave your establishment happy or sad. The AI system also uses facial recognition to identify repeat customers entering your store, and it can be programmed to offer those repeat customers personalized tips and/or coupons to clinch a sale.
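The repeat-customer flow just described reduces to a simple pattern: remember the IDs of visitors you have seen, and trigger an offer when one returns. The sketch below is entirely invented—the coupon format and the `face_id` input are assumptions, not part of Entropik Tech’s product.

```python
# Invented sketch of a repeat-customer coupon trigger: first-time visitors
# are recorded, returning visitors get a personalized coupon code.

seen_customers: set[str] = set()

def on_customer_enter(face_id: str):
    """Return a coupon code for repeat visitors, None for first-timers."""
    if face_id in seen_customers:
        return f"WELCOME-BACK-{face_id[:4].upper()}"
    seen_customers.add(face_id)
    return None
```

In practice the `face_id` would come from the system’s facial-recognition step, and the coupon would be pushed to the customer’s smartphone as the article describes.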
Joe Dysart is an internet speaker and business consultant based in Manhattan.