Unconscious imprinting from growing up a Gen-Xer during the original Star Wars years tells us that robots are futuristic. Except that is no longer so, and hasn't been for a while now.
One of the features that made the robots of Star Wars so awe-inspiring when we were children was their ability to understand and express human sentiments and behavior. Think of little R2-D2 rolling away offended, or C-3PO's exaggerated politeness building bonds and empathy with viewers.
Now that AI and robots are "invading" our daily lives, especially in shopping and e-commerce, we have selected three robots and related technologies that are starting to integrate meta-communication: going beyond purely utilitarian language to make our daily interactions as consumers more human.
Meta-communication: When chatbots speak from another dimension
Who hasn't been stuck with a customer support chatbot commanding you to "simplify your question"? This sort of experience leaves anyone wondering why chatbots exist at all, since they only seem to comprehend questions already listed in the company's FAQ section.
Instead, what if bots could pick up on the emotional state behind a customer support request, recognizing, for example, that a customer is angry? Of particular interest is the concept behind Charly the Chatbot, which is being taught to understand emotional communication and react accordingly. This intuitive ability could help lighten the mood between customers and brands with entertaining small-talk conversations. Understanding secondary, unspoken meanings would make chatbot communication and problem-solving with customers much more efficient.
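To make the idea concrete, here is a minimal sketch of how a chatbot might adjust its tone based on a customer's emotional state. The keyword lexicon, thresholds, and replies are purely illustrative assumptions, not how Charly or any real product works; production systems would use a trained sentiment model rather than a word list.

```python
# Illustrative sketch: emotion-aware reply selection for a support chatbot.
# The marker list and canned replies are hypothetical examples.

ANGRY_MARKERS = {"terrible", "furious", "worst", "unacceptable", "refund now"}

def detect_frustration(message: str) -> bool:
    """Return True if the message contains an obvious anger marker."""
    text = message.lower()
    return any(marker in text for marker in ANGRY_MARKERS)

def respond(message: str) -> str:
    """Acknowledge the customer's mood before getting down to business."""
    if detect_frustration(message):
        return "I'm sorry this has been frustrating. Let me get it sorted for you."
    return "Happy to help! What can I do for you today?"
```

The point of the sketch is the branching, not the detection: once the bot has any signal about mood, it can choose empathy first and problem-solving second.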
Body language: An army of clones could help
You may have already encountered an in-store assistant robot in the stores of some larger retailers. However, interacting with a robot in a public space still holds a dose of awkwardness.
Speaking face-to-face with another person means using words, tone of voice, facial expressions, and gestures. But what "face" is one supposed to display when talking to a robot in front of other people? One quality that makes Pepper, the humanoid robot, so well-received is its capacity to use and mimic some body language.
Pepper can move its arms, turn its head toward you, and even use its eyes (not a weird built-in scanner) to scan your coupons. These user-friendly features are certain to delight customers, and can even serve up some nostalgic vibes to Gen-Xers and older shoppers who dreamed of communicating with C-3PO.
Cultural awareness: The whole galaxy matters
When Siri was first released a few years ago, it seemed that her main function was to have people shout unspeakable sentences at her to test her array of snappy, generic retorts.
Now that voice and conversational commerce are getting a foothold in our everyday lives, the time has come to move on from voice assistants with a mono-cultural background expressed with all-purpose phrases. The way we relate to shopkeepers and salespeople may greatly vary based on geography and culture.
Since voice devices are mostly programmed at the headquarters of their companies, this is an aspect to take carefully into account, even more so when attempting to make the robots more “human” through casual chit-chat. For example, some cultures may attribute different modes of communication based on gender or age, while others may need to announce expressly that a sentence is to be understood as humor.
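One way to picture this is a voice assistant that selects its phrasing by locale rather than shipping one all-purpose script worldwide. The locale codes, greetings, and fallback below are illustrative assumptions for the sake of the sketch; real localization involves far more than swapping strings.

```python
# Illustrative sketch: locale-aware phrasing for a shopping voice assistant.
# Greetings and locale codes are hypothetical examples.

GREETINGS = {
    "en-US": "Hey! Ready to find you a great deal?",   # casual register
    "de-DE": "Guten Tag. How may I help you?",          # more formal register
    "ja-JP": "Irasshaimase. How may I assist you?",     # formal retail greeting
}

def greet(locale: str) -> str:
    """Use a culturally tuned greeting, with a neutral fallback."""
    return GREETINGS.get(locale, "Hello. How can I help you?")
```

Even this toy version shows why the decision cannot live only at headquarters: someone has to decide, per market, how casual the chit-chat should be.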
The current excitement around robotics and artificial intelligence is starting to provide answers on the optimal ways for robots to interact with customers, and where to draw the line between a frustrating or awkward shopping experience and a satisfying, mutually beneficial one.
After all, as C-3PO said: “He’s quite clever, you know… for a human being.”
Martin Stocker is the co-author of this post.