
Stop asking chatbots for love advice

As he sat down opposite me, my patient had a sad expression on his face.

“I had a date,” he announced. “It wasn’t good.”

This was not unusual for this patient. Over the years, he has shared stories of dashed romantic hopes. But before I could ask him what went wrong, he continued, “So I asked the chatbot what to do.”

Um, what? Chatbots, AI-based simulations of human conversation, have been in the news a lot, but I'd never had a patient tell me they had used one for advice.

“What did it tell you?” I asked with interest.

“To tell her that I care about her values.”

“Oh. Did it work?”

“Two guesses,” he sighed, raising his hands. Although this patient was a first, it is now common in my practice to hear from new patients that they consulted a chatbot before consulting me. Most often it’s for love and relationship advice, but it can also be about bonding with children, setting boundaries with them, or mending friendships that have gone awry. The results are decidedly mixed.

One new patient asked the chatbot how to deal with the anniversary of the death of a loved one. “Take time out of your day to remember what was special about that person,” the bot advised. I couldn’t have said it better myself.

“What it wrote made me cry,” the patient said. “I realized that I had been avoiding my grief. So I made this appointment.”

Another patient started relying on AI when her friends’ patience began to wear thin. “I can’t burn out my chatbot,” she told me.

As a therapist, I am both alarmed and intrigued by AI’s potential to enter the therapeutic space. There is no doubt that AI is part of the future. It has already proven useful for everything from writing cover letters and speeches to planning trips and weddings. So why not let it help with our relationships too? A new venture called Replika, a “caring AI companion,” has gone a step further, creating romantic avatars that people can fall in love with. Other sites, like Character.ai, let you chat and hang out with your favorite fictional characters, or build a bot of your own to talk to.

But we live in an age of misinformation. We’ve already seen disturbing examples of how algorithms are spreading lies and conspiracy theories among unwitting or ill-intentioned humans. What happens when we let them into our emotional lives?

“Even though an AI can articulate things like a human, you have to ask yourself what its purpose is,” says Naama Hoffman, an assistant professor of psychiatry at the Icahn School of Medicine at Mount Sinai, in New York. “The goal of a relationship or of therapy is to improve quality of life, whereas the goal of artificial intelligence is to find what is most cited. It’s not supposed to help, necessarily.”

As a therapist, I know that my work can benefit from outside support. I have been leading trauma groups for two decades, and I have seen how the scaffolding of a psychoeducational framework, especially an evidence-based one like Seeking Safety, facilitates deeper emotional work. After all, the original chatbot, Eliza, was designed as a “virtual therapist” because it asked endlessly open-ended questions, and you can still use it. Chatbots can help people find inspiration, or even break down their defenses and allow them to begin therapy. But at what point do people become overly dependent on machines?
