Guest post by Kate Lynch, a business and digital marketing blogger.
In late January this year, Google unveiled Meena, an "end-to-end trained neural conversational model". Built on a new architecture (Evolved Transformer seq2seq), Meena, Google claims, represents the next stage in the evolution of chatbots.
Only a couple of weeks earlier, Samsung had revealed its Project Neon at CES 2020. This humanoid AI chatbot takes inspiration from sci-fi and puts a human face to chatbot interactions.
Both Meena and Neon represent dramatic shifts in the way we interact with chatbots, or indeed, any technology. While Meena adds a human-like conversational intelligence to computer interactions, Neon gives it a human face – so crucial for empathy.
What do these developments mean for artificial interactions? Given these changes, what will the future of computing look like? I’ll discuss these crucial questions, and more, below.
Google Meena and Smarter Chatbots
Although chatbots have grown steadily smarter, many are still limited to the responses hardcoded into them. You might have noticed this yourself – unless you phrase questions in a specific manner, you aren't guaranteed a satisfactory response.
Google Meena is different because it treats a series of questions as one conversation. That is, it understands the context behind each question. For instance, if you ask Meena about New York, then ask it about the weather "there", Meena will infer "there" to mean New York – just as a human would. Meena keeps track of up to seven turns in a conversation, which allows it to answer questions based on their context. In one of Google's published examples, Meena understands that "TNG" refers to what the conversation has already covered – Star Trek (specifically, The Next Generation).
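To make "tracking context" concrete, here is a toy sketch: it keeps a sliding window of recent turns and resolves a placeholder word like "there" against the last place named in the conversation. This is emphatically not how Meena works internally (Meena learns context end-to-end from data, with no hand-written rules); the place list and the resolution heuristic are invented purely for illustration. Only the seven-turn window mirrors the description above.

```python
# Toy illustration of a multi-turn context window. Meena itself is an
# end-to-end neural model; this hand-written heuristic only shows what
# "resolving a reference against conversation context" means.
from collections import deque

KNOWN_PLACES = {"New York", "London", "Tokyo"}  # invented demo list


class ToyContextTracker:
    def __init__(self, max_turns=7):
        # Sliding window: only the most recent 7 turns are kept.
        self.turns = deque(maxlen=max_turns)

    def add_turn(self, text):
        self.turns.append(text)

    def resolve(self, question):
        """Replace 'there' with the most recently mentioned place."""
        if "there" not in question:
            return question
        for turn in reversed(self.turns):
            for place in KNOWN_PLACES:
                if place in turn:
                    return question.replace("there", place)
        return question  # no place found; leave the question as-is


tracker = ToyContextTracker()
tracker.add_turn("Tell me about New York.")
print(tracker.resolve("What is the weather like there?"))
# → What is the weather like New York?
```

A real open-domain model handles this implicitly across arbitrary topics, not just places – that is precisely what makes Meena's learned approach hard to replicate with rules.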
In simpler terms, this context-specific conversation is called a “multi-turn” conversation. Multi-turn conversations aren’t new – Google itself debuted an eerily convincing one in 2018 named Google Duplex. Google Duplex could keep track of an entire conversation and understand the difference between 4 people and 4 PM when booking a restaurant reservation. However, Duplex was, upon release, limited to a handful of specific use cases, such as booking appointments.
What makes Meena unique is that it is truly open-domain. That is, it can talk about any topic under the sun – once again, like a human being. In tests, Google Meena scores an impressive 79% on the Sensibleness and Specificity Average (SSA) test. Human beings score 86% on the same test.
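The SSA metric itself is simple to state: human raters label each chatbot response as sensible or not and as specific or not, and the two rates are averaged. A minimal sketch of that calculation follows – the labels are invented for illustration, not Google's evaluation data, and this simplification ignores details of the full crowd-rating protocol.

```python
# Sensibleness and Specificity Average (SSA): the mean of the fraction
# of responses judged sensible and the fraction judged specific.
# The demo labels below are invented for illustration only.
def ssa(labels):
    """labels: list of (sensible, specific) booleans, one per response."""
    n = len(labels)
    sensible_rate = sum(1 for s, _ in labels if s) / n
    specific_rate = sum(1 for _, p in labels if p) / n
    return (sensible_rate + specific_rate) / 2


demo = [(True, True), (True, False), (True, True), (False, False)]
print(ssa(demo))  # 0.625: 75% sensible, 50% specific
```

Under this scoring, Meena's reported 79% against the human baseline of 86% means its responses were judged sensible and on-topic nearly as often as a person's.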
In other words, for the first time since the invention of chatbots, we have a bot that can talk to you almost like a human. You can only imagine how much better this bot will be in a few years.
Samsung Neon, Digital Avatars, and Artificial Empathy
Siri, Alexa, and Cortana might be nothing but 0s and 1s inside a computer, but that doesn’t stop people from getting emotionally attached to them. One study estimates that, quite like the plot of the movie Her, 1 in 4 people even fantasize about these smart assistants.
What happens when you give these collections of bits and bytes a human face and a personality to match? That’s the question Samsung Neon seeks to answer.
Neon creates "digital avatars" – lifelike, virtual humans who pull double duty as chatbots. Unlike Siri or Alexa, you interact with them as you would with a human being (the screen notwithstanding). They can emote and react with some degree of lifelike animation.
The result is an interactive experience that is significantly more empathetic. If Siri and Alexa can move us with their words and (programmed) humor, the effect is amplified manifold when the smart assistant is human-like.
Significantly, these digital avatars offer something that voice-only assistants can't – face-to-face interaction. Human beings are hardwired to respond to faces. Research even suggests that to be truly empathetic, we need to see the respondent's face. Digital avatars thus bridge the empathy gap in artificial interactions. The result is a more authentic, lifelike experience – something missing from the chatbots and smart assistants of today.
The Future of Artificial Interactions and Your Business
Think back to how Tony Stark interacts with computers. Fictionalized though it may be, the Iron Man vision of the future rarely, if ever, involves touch-based computing interactions. You don’t see him create a project dashboard in Excel and hammer away on a keyboard. Instead, everything is passed off to a smart assistant – JARVIS – via voice.
The same pattern is repeated across countless sci-fi movies. Theodore Twombly in Her talks to his smart assistant (and – spoiler alert – even falls in love with her). K in Blade Runner 2049 has a virtual holographic human assistant. Dr. Bowman in 2001: A Space Odyssey interacts with HAL primarily through voice.
Sci-fi doesn’t always show the future but it does influence it. And in sci-fi, artificial interactions usually involve one or all of the following – voice-focused interactions, smart assistants, and digital avatars.
As Google Meena and Samsung Neon show, we're slowly but surely moving towards this sci-fi future. Better chatbots that can understand true intent and conversational context will change how we interact with computers. Paired with digital avatars, these chatbots can approach human-like realism.
That’s the big picture view.
But at a more granular level, how will this affect you and your business? Can a chatbot that can talk about anything also talk about your business and products? If yes, how do you ensure that it says the right things about them?
To achieve this, you need chatbots with better access to the right data, i.e. your data. You need to train them on your internal knowledge base, curate the answers they draw from it, and integrate your data with their natural language understanding.
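One common way to ground a general-purpose chatbot in business data is retrieval: match the customer's question against your own knowledge base and hand the best entry back as the answer (or as context for the bot). The sketch below uses simple word overlap for scoring; the FAQ entries are invented, and production systems typically use trained semantic embeddings rather than keyword matching.

```python
# Minimal retrieval over a business knowledge base: score each FAQ
# entry by word overlap with the question and return the best match.
# Entries are invented for illustration; real systems use embeddings.
FAQ = {
    "What are your shipping times?": "Orders ship within 2 business days.",
    "Do you offer refunds?": "Yes, within 30 days of purchase.",
    "Where are you located?": "Our office is in Austin, Texas.",
}


def best_answer(question):
    q_words = set(question.lower().split())

    def overlap(entry):
        # Count how many words the question shares with this FAQ entry.
        return len(q_words & set(entry.lower().split()))

    best = max(FAQ, key=overlap)
    return FAQ[best] if overlap(best) > 0 else "Sorry, I don't know."


print(best_answer("how long are your shipping times?"))
# → Orders ship within 2 business days.
```

The same lookup can sit behind a voice interface as easily as a text one – the retrieval layer doesn't care which medium delivered the question.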
If you can offer this data through mediums your customers already understand – voice and text chat – you’ll be one step closer to realizing the aforementioned sci-fi future.
Over to You
Google Meena and Samsung Neon show the possible future of artificial interactions – voice-based and human-like. It is not unreasonable to say that, within a few years, we'll increasingly shift from keyboards and touchscreens to voice as a key interactive medium.
Taking advantage of this shift requires smarter chatbots that are trained on your data. When you combine the natural language capabilities of such chatbots with your own business-focused data, you can create a friendlier, smarter customer experience.
Whichever path the future takes, you can at least be sure that the journey will be exciting.
Kate Lynch is a business and digital marketing blogger who spends her days writing quality blogs. She is a passionate reader who loves sharing quality content with her friends and followers, keeping a keen eye on the latest trends and news in those industries. Follow her on Twitter @IamKateLynch for more updates.