Your AI Assistant Won’t Listen to You? Well, You’re Not Alone


TL;DR: We all expect our AI voice assistants to give us the result we need immediately, but that often isn’t the case. Gameland.gg recently released a study on the growing frustration among people using AI voice/virtual assistants. The study explores the challenges users experience while engaging with AI voice assistants. Below, we take a look at the primary hurdles users face and why voice assistants make those unnerving errors.

We all live with them, but that doesn’t mean we always get along with them. No, I’m not talking about your family members. I’m talking about AI voice assistants.

AI-powered voice assistants became a cornerstone in many American households long before ChatGPT and other trendy AI solutions began making noise.

Although the emergence of ChatGPT has everyone buzzing about AI, the fact is most of us have been using this popular (and often controversial) technology for more than a decade. It has found its way into most of our daily routines, whether for spell-checking with Grammarly or real-time traffic alerts from a GPS app.

But using artificial intelligence doesn’t always make for a smooth or peaceful experience, even though it has made life much more convenient in many practical ways. Some of us see these nifty tools as a point of contention, or maybe even as an opponent.

Gameland.gg logo
Gameland.gg recently released a study regarding the growing frustration among people using AI voice/virtual assistants.

I know I’ve had my fair share of verbal outbursts at a GPS or map tool. Sometimes, they seem determined to lead me astray. But I digress. One tool that really gets under people’s skin is the AI-powered voice/virtual assistant. It’s always the ones you live with, huh?

AI voice assistants are not without their faults, and they may not always be the best roommate, but they do offer on-demand help whenever you need it.

As AI begins its trek to transform life as we know it, we will have to learn to coexist with it as it weaves its way into daily life. Below, we’ll dive deeper into user relationships with AI voice assistants, their primary pain points, and how to address AI frustrations going forward.

Nearly 1 in 3 Americans Admit to Arguing With Their Virtual Assistant Weekly

Virtual assistants can make an excellent addition to a household. According to the Digital Wellness Lab at Boston Children’s Hospital, families reported that voice assistants improved their traditional bonding activities, interactions, and communications.

This same study also found that 65% of children ages 6-9 frequently use polite language such as “please” and “thank you” when addressing their AI voice assistants. But the opposite can be said for most adult relationships with voice assistants.

Sixty-four percent of Americans find themselves yelling at their virtual assistant in frustration from time to time. That’s according to an independent Gameland.gg survey of 2,000 US residents across the country.

But there are more eye-opening numbers that reveal user interactions with virtual assistants. The Gameland study showed:

  • Women tend to snap at their virtual assistants more often than men do, with percentages of 62% and 53%, respectively.
  • Nearly 1 in 3 surveyed Americans (32%) admit to arguing with their virtual assistant on a weekly basis.
  • 2 in 3 Americans in the survey said they shout at their virtual assistant.
  • 77% of Wisconsin residents occasionally yell at their virtual assistant, making Wisconsin the state with the highest percentage in the country.
  • Texas and Ohio take second and third place with 71% and 68%, respectively.
  • Meanwhile, Oregon residents are the calmest with only 17% of respondents saying they yell at their virtual assistant.

Virtual assistants answer a lot of questions and provide reminders on request throughout the day. So their usefulness can’t be questioned. In fact, 1 in 3 respondents reported using a virtual assistant every day.

But the user experience is where the problems lie. Errors in interpreting commands can cause these helpful tools to spout the wrong response. Next, we’ll take a deeper look at which errors cause users the most frustration.

Unexpected Answers and Misunderstanding Commands Cause the Most Frustration

Although voice assistants have become the norm in today’s digital world, the technology behind them is still quite impressive. A decade ago, they were hailed as an incredible breakthrough. Over the years, these smart tools have become more capable as developers added features and expanded their functionality.

Voice assistants use a combination of automatic speech recognition (ASR) and natural language processing (NLP) to turn spoken queries into vocal responses. Most of the time, these responses are accurate. But users can run into the occasional hiccup when a virtual assistant generates an incorrect response or fails to understand the request.
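As a rough mental model, here is a deliberately simplified Python sketch of that chain. None of this is any vendor’s actual code: the function names (transcribe, parse_intent, respond), the keyword-based intent matching, and the canned replies are assumptions made purely for illustration.

```python
# Conceptual sketch of a voice-assistant pipeline; not any real product's implementation.
# The ASR step is stubbed out: a real assistant would decode an audio waveform here.

def transcribe(utterance: str) -> str:
    """Stand-in for automatic speech recognition (ASR).
    We pretend the audio has already been converted to text."""
    return utterance

def parse_intent(text: str) -> str:
    """Drastically simplified NLP step: map the recognized text to an intent."""
    text = text.lower()
    if "weather" in text:
        return "get_weather"
    if "timer" in text:
        return "set_timer"
    return "unknown"  # anything unrecognized falls through to the apology below

def respond(intent: str) -> str:
    """Turn the parsed intent into a reply the assistant would speak aloud."""
    replies = {
        "get_weather": "It looks sunny today.",
        "set_timer": "Okay, timer set.",
        "unknown": "Sorry, I didn't understand that.",
    }
    return replies[intent]

if __name__ == "__main__":
    for audio in ["What's the weather like?", "Spin up my usual playlist"]:
        print(f"User: {audio}")
        print(f"Assistant: {respond(parse_intent(transcribe(audio)))}")
```

In real products, each of those stubs is a large machine-learning model, and the second request above shows where things go wrong: a phrase the intent parser doesn’t recognize drops straight into the fallback apology that frustrates so many users.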

The pain point users describe as the most frustrating is misunderstood commands. Seventy-seven percent of respondents say the main cause of conflict with their voice assistant is its failure to understand their commands.

Here are a couple of other issues that cause frustration:

  • 10% of respondents say unexpected answers cause them the most frustration.
  • 8% say incorrect responses provide the most strife.

But user frustration also varies by voice assistant. Among the top brands, Alexa beat the competition as the most frustrating tool for its users:

  • Alexa: 70% of the respondents admitted to yelling at Alexa.
  • Bixby: 60% of the respondents admitted to yelling at Bixby.
  • Siri: 57% of the respondents admitted to yelling at Siri.
  • Google Assistant: 53% of the respondents admitted to yelling at Google Assistant.

Every AI tool experiences errors. But why do they? We’ll take a look at that next.

Why AI Voice Assistants Make Errors

No AI voice assistant is without flaws, and they all make similar kinds of mistakes. These errors can be the result of several different factors.

For example, because of the way they are trained, voice assistants can have difficulty understanding words that fall outside a standard vocabulary. This can also lead a voice assistant to misinterpret requests for applications or websites it doesn’t recognize. Using more specific language can help work around this issue.

Voice assistants are excellent at providing factual information but struggle with other requests due to misinterpretation. They can also misidentify the source of a command or give disconnected responses because they can’t maintain the context of a conversation.

The datasets voice assistants train on greatly influence the output they provide. If the datasets are limited and not diverse, the AI tool can falter with complex or unusual requests. Algorithmic biases and technological constraints can also play a role. Current NLP technologies have limitations in understanding human language in its entirety, leading to errors when users employ slang, idioms, and highly contextual sentences.
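To make the context problem concrete, here is a tiny, hypothetical Python sketch (not any assistant’s real logic) showing how the same follow-up question lands very differently depending on whether the system remembers what was just asked.

```python
# Toy illustration of why losing conversational context produces disconnected answers.
# The answer() function and its hard-coded replies are hypothetical, for illustration only.

def answer(question: str, keep_context: bool, context: dict) -> str:
    """Reply to a weather-style question, optionally using remembered context."""
    q = question.lower()
    if "weather" in q:
        context["topic"] = "weather"  # remember what the conversation is about
        return "Today will be sunny."
    if "tomorrow" in q:
        # "What about tomorrow?" only makes sense if the assistant still
        # knows that the current topic is the weather.
        if keep_context and context.get("topic") == "weather":
            return "Tomorrow looks rainy."
        return "Sorry, I'm not sure what you mean."  # the disconnected response
    return "Sorry, I didn't understand that."

if __name__ == "__main__":
    for keep_context in (False, True):
        context: dict = {}
        print(f"--- keep_context={keep_context} ---")
        for q in ["What's the weather like?", "What about tomorrow?"]:
            print(f"User: {q}")
            print(f"Assistant: {answer(q, keep_context, context)}")
```

Real assistants track dialogue state with far more sophisticated machinery, but when that tracking slips, the experience feels exactly like the fallback branch above.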

How Users Address Conflicts With AI

We all face our own hurdles with technology and sometimes need a break to de-stress and step back from overstimulation. So practicing digital wellness or doing a digital detox can work wonders for users seeking to reclaim their peace of mind.

However, for many of us, voice assistants are inseparable from our daily routines. That means a complete digital detox may not be the best solution. So let’s take a look at what users can do to address their frustrations with voice assistants.

Gameland.gg survey line graph
Gameland.gg’s survey explores the challenges users experience while engaging with AI voice assistants.

According to 39% of respondents, persistence helped them solve their AI conflicts. This approach may also take some patience, as users have to repeat the request until the assistant gets it right. About 38% of those surveyed said they had to rephrase their query to provide more clarity.

Meanwhile, 13% said they give up on the task entirely, 5% seek out a different voice assistant, and 4% choose to bypass the virtual assistant and handle the task themselves. It’s clear some users have more adverse reactions to voice assistant frustrations than others.

In fact, 51% of respondents say that miscommunication with their virtual assistant leads them to use it less frequently. Disengagement is a primary impact of a tool’s failures and should grab a brand’s attention.

AI is supposed to be a source of support. But what this study shows is that it doesn’t always deliver on that promise smoothly, leaving users questioning whether the interaction is worth it. As AI matures, though, it can become more intelligent and deliver a smoother user experience.

It’s up to the developers and tech companies to bring improvements. After all, AI will only become more integral in our interactions. Humans and AI may have a complicated relationship now, but you never know what the future holds.