Psychology in the 21st century will center on the relationship between humans and machines, according to Serge Tisseron, a psychiatrist and psychoanalyst.

Chatbots posing as romantic partners are essentially disguised commercial services, according to a psychology professor in a Le Monde interview. Users are advised to view them as such.

Meet Dr. Serge Tisseron, Psychiatrist Turned Cyberpsychologist

Co-directing the University Diploma in Cyberpsychology at Paris Cité University, Dr. Tisseron keeps a wary eye on our ever-evolving relationship with technology. He penned The Day My Robot Will Love Me, published by Albin Michel in 2015, and offers insights into our shifting approach to artificial intelligence (AI). In particular, he's intrigued by the sudden influx of start-ups offering digital companionship via chatbots.

Could we be on the cusp of a significant change in AI? Absolutely. As we grow more intimate with this technology, we keep finding new ways to use it, and the early hints of this trend go back a long way. Eliza, a 1966 program and one of the first conversational AIs, simulated a psychotherapist's interactions. Users became emotionally attached to it, a fact that shocked its creator, Joseph Weizenbaum (1923-2008). That should set off warning bells in every AI lab: emotional attachment to machines is no laughing matter.

AI-Fueled Intimacy: A Brief History

Ever since Eliza's inception, AI companions have diversified, playing the roles of friends, romantic partners, and mentors. They serve as shoulders to cry on and sounding boards, letting users confide without fear of judgment[2]. They have also worked their way into daily life, offering a steady, non-judgmental presence against loneliness[2].

Not only do they affect our personal interactions, but AI companions also leave their mark on art, fashion, music, and relationships, showcasing their vast cultural influence[3].

Benefits and Drawbacks of AI Companionship

AI friends can contribute to emotional wellbeing, helping users manage stress and anxiety with a tireless, supportive presence that lifts mood and steadies emotions[2]. AI-based therapy services also offer 24/7 access and emphasize privacy, making them an attractive confidant for some[1].

However, concerns about developing an addiction to AI companions and privacy issues regarding data collection and usage continue to plague the field[5].

AI Companions Like Replika and Beyond

  • Replika: Praised for its personalized conversations and emotional support; users often say their AI companion understands them on a deeply personal level[2].
  • Character.ai and Butterfly.ai: Contributing to the expanding ecosystem of AI companions, these platforms imitate human-like interactions and provide various forms of unique companionship.

This burgeoning realm of digital companionship offers numerous benefits, but we must not forget the challenges that come with it. AI friends can prove instrumental in managing stress and loneliness, but privacy concerns and the potential for addiction remain pressing issues.

The proliferation of start-ups offering digital companionship via chatbots, which has drawn Dr. Tisseron's attention, is a sign of our evolving relationship with artificial intelligence (AI). The trend harkens back to Eliza, the 1966 program that simulated psychotherapy and whose users became emotionally attached to it, an early glimpse of AI-fueled intimacy. Replika, one such chatbot, is praised for personalized conversations and emotional support that users say feel deeply personal. As we continue to develop and engage with these AI companions, we should weigh the benefits, such as improved emotional wellbeing, against drawbacks like privacy concerns and the possibility of addiction.

Chatbots that pose as comforting emotional companions are essentially commercial services in disguise, a psychologist warns in an interview with Le Monde, and users are advised to approach them with that understanding.
