Can You Trust Online Psychiatric Care and Moral Advice?

Can Artificial Intelligence create a genuine sense of wellbeing?

Several companies believe the answer is yes. These firms are building systems that seek to use technology to provide personalized psychological care and advice.

Given today’s stressful lifestyles, the need for care is real. Moreover, COVID lockdowns have made the situation more extreme. In a recent article in Fast Company, Dan Schawbel described those effects.

“The global pandemic has created a seismic shift in workplace mental health, with over three-fourths of workers saying that this is the most stressful year ever…. [P]oor mental health is now inescapable as employees work remotely with no separation between their work and personal lives.” He pointed out that a recent study by his firm, Workplace Intelligence, “found that 85% of respondents’ mental health issues at work negatively affect their home life, causing things like suffering family relationships, isolation from friends, reduced happiness, and sleep deprivation.”

Finding practical help for such problems is difficult. One-on-one counseling is expensive and time-consuming. However, cell phones and computers are everywhere, and resourceful entrepreneurs are tapping these devices to address mental health issues.

A multitude of products is now available for the psychologically challenged. The descriptions below are not endorsements but summaries garnered from the companies’ websites. They are but a sampling from among the many providers in this market. Caveat emptor!

Ginger

One product is called Ginger. Its promises are intriguing. “We can help with anything you’re struggling with—from stress and depression to issues with work and relationships. Need to chat on the weekend? Or at 3 AM on a holiday? We’re around. We offer immediate access to support, 24/7/365. We go where your smartphone goes. Get confidential support wherever you are. Available in all 50 states and 23 countries.”

Ginger offers to connect any user with a “trained behavioral health coach” who will provide information and encouragement to those struggling with life’s issues. Ginger uses a customer service “chat” format in which the user types in a concern and the “coach” helps solve the problem.

Ginger offers to connect users with licensed therapists and psychiatrists for issues that are too big or complex for coaches to solve. The therapist appears on the screen for a more intense discussion.

Woebot

Woebot claims to provide “technology with heart” by “infusing artificial intelligence with the empathy and expertise of a therapist to create a powerful mental health solution that can reach millions.” Like Ginger, it uses the standard chat function. However, there are no humans at the other end. Instead, the app relies on “a suite of clinically validated therapy programs that address many of today’s mental health challenges.”

The application was built by psychologists from Stanford University. The company claims that the app is so sophisticated that it will “understand” the user’s state of mind and ask the “right” questions. It will then use the responses to “safely deliver the right intervention to the right person at the right time.”

Biobeats

Biobeats introduces a natural element into the mental health calculation. It monitors physical health: heart rate, respiration, sleep, and other metrics. From this information, it can examine moods and “cognitive functions.” Everything is then loaded into its “digital therapeutics and mindfulness tool.”

As with Ginger and Woebot, Biobeats targets employers seeking to help their employees. All three tout the benefits of contented employees who become more productive through better mental health.

Ample Reason for Concern

This trend of mental health care through artificial intelligence is problematic.

One main concern is the competence of the coaches and software designers to deal with these problems from afar. The human mind is extremely complex, and mental issues often involve situations that require discernment and nuance. What quality of care will be dispensed by those sitting at a computer at three in the morning, waiting for someone’s troubles to pop up on their screens? What kind of accountability will there be for coaches whose advice could have dramatic consequences or even trigger life-threatening reactions?

There are also philosophical objections. It makes sense to question the assumptions being designed into these applications. Most programs will not consider moral issues that deal with sin or Grace. They will tend to be naturalistic or even mechanistic solutions that treat humans like machines, not rational creatures.

Even the most conscientious programmer cannot factor the Grace of God into programs. Yet, that Grace is an inseparable part of any physical, psychological, or moral healing. Sacramental life provides elements for dealing with life’s vicissitudes. Even the best of these programs will always have the limitations of humanity. Holy Mother Church is infinitely more qualified to deal with the vagaries of human life than any computer program or chat room will ever be.
