Tech Savvyed
News

Chatbots are getting too emotional and customers are not happy about it

By News Room · 22 April 2026 · 2 Mins Read

When a customer service representative says, “I totally get your frustration,” it feels natural. When a chatbot says the same thing, something feels deeply off. Now, researchers have confirmed that gut feeling with actual data.

As reported by Techxplore, a new study published in MIS Quarterly finds that when AI chatbots express empathy during a service failure, it can actually make things worse for customers, not better.

Why does chatbot empathy backfire?

The research team from McGill University, University of South Florida, and Hong Kong Baptist University ran three separate experiments, where participants interacted with a service chatbot that made mistakes. In some cases, the chatbot responded with empathetic phrases such as “I really feel your frustration” after the errors. In others, it simply moved on without acknowledging the customer’s emotions.

The empathetic responses did not go over well. Instead of calming customers down, they triggered what researchers call “psychological reactance”: an instinctive negative response that kicks in when people feel their sense of control or freedom is being threatened.

The idea that a machine had analyzed and responded to their emotional state felt invasive rather than comforting, and customers came away less satisfied with the overall service.

[Image: A person looking frustrated at a laptop while sitting at a table.]

This aligns with my personal experience. When chatbots like ChatGPT try to be too encouraging or understanding, it feels off. It’s akin to the uncanny valley effect I experience when watching AI-generated content. When you know you are chatting with AI, false emotional support irks you more than straightforward responses. 

So what should chatbots do instead?

The researchers suggest that companies should not automatically equip chatbots with empathy features, especially when handling service failures. The benefits of human empathy do not simply transfer to AI.

Instead, chatbots can take other approaches, such as humor, compliments, or a straightforward apology, that do not carry the same invasive undertone.

The takeaway is clear. Making a chatbot sound more human isn’t always the right move. Sometimes, it is best to let a bot be a bot.

© 2026 Tech Savvyed. All Rights Reserved.
