On May 22, my old friend Manu and I had a long-overdue catch-up in Singapore, which sparked a discussion, which led to a series of conversations, which in turn led to the article below. — Peng T. Ong


I have come to the conclusion that machines have the potential to show more empathy than 95% of humans. Allow me to explain this difficult pill to swallow.

I asked ChatGPT to define empathy, and this is the answer that it provided:  “Empathy is commonly understood as the ability to understand and share the feelings of others. It involves recognizing and responding to emotions, demonstrating compassion, and showing a willingness to help or support others in times of need.”

I interviewed scientists, thought leaders, and philosophers on the topic of empathy and have extracted some interesting observations. The first is that “the English term ‘empathy’ is relatively new, coined in the early 1900s. The word presumes a non-judgmental immersion into someone’s life, stepping into their shoes and viewing it through their lens. When you think about it, can one truly be empathetic, or is one just psychologically conditioned to make empathetic statements?” says modern-day philosopher Dash Tirthankar.

I talked to another philosopher/engineer, Lalit Katragadda, who went deep into history and agreed that while the Western term is just over 100 years old, the Eastern concept of empathy is deeply rooted in Indian history. “If you take the ancient Indian terms of Karuna and Peeda (Pida), empathy has been studied deeply and been around for ages, but has been brushed aside as irrelevant in a process-driven, industrial-era approach to service.”

Lalit further divides empathy into three levels:

  • Level 1: Listening. Example: “I understand that you are having a hard time.”
  • Level 2: Caring. Example: “I understand that you are having a hard time, and I will do what I am assigned to do to resolve your problem within the constraints that I have.”
  • Level 3: Handling with care/going above and beyond for the client. Example: “I understand that you are having a hard time. I will do everything in my power to help you. I am here for you.”


Despite their differing views on the origins of the word, Lalit and Dash agree that AI can be programmed, even empowered, to simulate empathy and close that gap to roughly 95% by mastering Levels 1 and 2. That alone is enough to dramatically enhance millions of white-collar service jobs. I would like to provide some examples from my life to convey my point.

Personally, I will do almost anything to avoid having to talk to a customer service representative. In many cases the experience is painful, the problem is only occasionally resolved, and in the end I am left with a bad taste in my mouth. Recently, I endured a series of experiences with different customer care employees at United Airlines, and each call left me reeling. My mom was sick with COVID-19 and could not travel back home to San Jose from Singapore, and I was trying to help her reschedule a flight. The United Airlines customer service process and agents put me through long wait times, endless hoops to submit documentation, and empty promises of call-backs that forced me to call again and again. Eventually I was advised that nobody could help because I was calling on a weekend, and the department that receives and forwards the documentation is not open on weekends. Finally, I had to escalate the call.

Mind you, this was my experience as a business class traveler, for whom a higher level of customer care is usually the norm. I felt like nobody cared enough to take ownership of helping me. While the agents are trained to make robotic empathetic statements such as, “I understand this is a difficult situation for you,” their actual empathy does not extend far enough to resolve the issue you called about. In effect, United Airlines has trained its employees to practice empathy at Levels 1 and 2, but not at Level 3.

I pondered this experience with my wife, a physician employed by a large healthcare organization, and we concluded that this limitation of empathy is rampant in the corporate world because each individual is just a cog in a very large wheel. There is a loss of empowerment as managers strive toward standardization, and in the process they make employees feel expendable and irrelevant. Even so, corporations spend billions of dollars training and reinforcing degrees of empathy in their employees because empathy correlates with customer lifetime value (LTV). There is an economic incentive to drive empathy programs forward, but the way a company approaches them can drastically affect the result.

Since we discussed the limited empathy and empowerment at United Airlines, let us contrast it with its partner airline, Singapore Airlines. Singapore Airlines has poured tens of millions of dollars into its customer service experience and has managed to create an empathetic environment by instilling a sense of empowerment in employees at all levels. By replacing process training with empowerment training, it encourages its employees to retain customers through their humanity. Rather than saying, “I understand that this is a difficult situation, but unfortunately I cannot help you due to our policy,” they say, “That sounds like a difficult situation. Let me do whatever it takes to help you.” The reality is that most companies do not have that kind of money to spend on this seemingly esoteric endeavor.

Then there are the fields that naturally beckon compassionate individuals, like healthcare and teaching. One might assume that these individuals exude empathy at all hours of the day. However, even healthcare workers and educators suffer from a phenomenon called “empathy fatigue.” Depending on the time of day or the day of the week, you may encounter a physician and think, “Boy, that doctor is a real jerk! Who does she think she is?”

Case in point: even individuals who are naturally drawn to the caring fields struggle with empathy much of the time, because even there, empathy does not come naturally. Hospitals require physicians and nurses to participate in empathy training programs, where they learn how to communicate in a way that makes others feel understood. There is no way that people in the “caring fields” can feel empathy for every individual who walks through the door. In fact, it is unlikely that they can feel empathy for more than a handful of people in their close circle. Nonetheless, they need to communicate empathy in order to retain their client base. Meanwhile, insurance companies and private equity firms are disempowering physicians, leaving healthcare workers feeling demoralized and expendable and further eroding their sense of empathy.

Now let us examine a customer service experience with an AI bot. Recently I had to reach out to American Express about a billing error. I was transferred to an AI bot that made empathetic statements about my situation, much like the United Airlines representatives, and resolved the concern within a few minutes. In the United Airlines case, the multiple customer service representatives with whom I spoke exhibited Level 1 empathy. In the American Express case, there was no need to be transferred to a human being. The AI bot recognized there was a problem, made a statement demonstrating compassion, did a deep dive into my account history, and then resolved my problem, which is consistent with Level 2 empathy. Now, I know that a machine cannot FEEL empathy, but it can communicate a sense of empathy nonetheless. The AI bot made me feel heard and understood, and it worked within its programmed constraints to help me.

I can understand how a customer service agent can easily feel worn down having to deal with people’s problems day in and day out. Having to communicate to others that you understand their feelings and situation, even if you don’t really internalize them, is exhausting. In fact, it is impossible to internalize another person’s problems and truly feel empathy, because that would require sharing all the same experiences. Furthermore, humans are built to extend empathy only to a small circle with whom they share personal experiences and cultural norms. Instead of feeling true empathy, we settle for making empathetic statements as a show of compassion. It is, in effect, simulated empathy. Despite all the training that service workers go through, how can one individual maintain a persona of empathy all the time? We are just human, and we have our limitations. At some point, constantly showing empathy for others wears on our compassion for ourselves.

This is where AI will shine. Just as humans can make empathetic statements, AI can be programmed to consistently simulate empathy at Level 2, drawing on data about patterns of human behavior. Getting to Level 3 would certainly require a human touch. The difference is that machines don’t get tired. “AI has an order of magnitude more time and infinite mind space to help you. And layered with that, it can be taught to listen partly, care, and resolve to the best of its empowerment,” says Lalit Katragadda. Moreover, machines don’t suffer from empathy fatigue, and therefore they are able to solve the problem at hand effectively.

Peng T. Ong, a notable serial entrepreneur, wraps it up succinctly, “Companies will, over time, act with much more empathy ... because if they don't, the efficient tech world will notice, and the low EQ companies will lose.  The march towards AI is imperative and I look forward to having much better behaved corporations.”

As a tech investor, I am excited about an AI-led future because it will increase productivity immensely. However, there are a lot of fears that go along with new technology. My collaborators and I are curious to know where you stand on this topic.

COLLABORATORS:
I had the privilege of debating these ideas with people from a fascinating range of perspectives.

  • Lalit Katragadda: Engineer extraordinaire. Mastermind behind Google Maps and many other projects. Current founder of Indihood, focused on the next billion. He is a product of Stanford, Carnegie Mellon, and IIT Bombay. Based out of Bangalore.
  • Ricardo Araujo: Professor of Data Science and AI, senior researcher and consultant at the AI Innovation Hub in Brazil. Someone I have worked closely with since 2005. Ricardo also heads up data science and AI at Inventus Capital.
  • Peng T. Ong: Serial entrepreneur turned investor (the dark side), like me. Founded Match.com, Interwoven, and Encentuate. He is focused on investing in Southeast Asia tech startups. He is a deep thinker and is currently based out of Singapore.
  • Dr. Rachna Rekhi: My better half. Physician and founder of Wellness Wizards, focusing on healthcare rather than disease management. She recently reduced her hours at her hospital so that she could pursue her passion of helping others learn how to grow healthier. She hosts a health podcast along with our two savvy children, Maya and Neal.
  • Dash Tirthankar: Dash is a philosopher who made a deep imprint on me during a short three-month stint in Singapore. A practical engineer by training turned impractical philosopher, yet successful by every measure that really matters.


Written by Manu Rekhi. This article was originally published on: https://www.linkedin.com/pulse/empathy-human-vs-ai-manu-rekhi.
