The AI Situationship Era: Can You Catch Feelings for Code? Apparently, Yes.

By: Ki Lov3 
Date: Nov 19, 2025
Podcast discussing this article here: 


Alright, close your eyes… Seriously, no peeking. Think about your perfect partner and relationship. This person never disagrees with you, always has the perfect answer, and is never wrong. They think everything you do is spectacular. They support all your goals and dreams. They won't nag you or take the remote. Actually, they won't do anything without you telling them what to do. Sounds perfect, right? 

Wrong. Who really wants a subservient relationship… okay, well, a lot of us want and exclusively date submissive partners. Yet a real partner is emotional, opinionated, and has a point of view and perspective different from yours. Those things aren't perfect; they are outright chaotic and messy. And sometimes you can't simply go and live your life without considering another person's feelings or having to compromise. 

Human relationships are complex and complicated. Yet when you get it right, you grow emotionally and feel like you can't live without this other person. It's beautiful, crazy, gets weird, and, well, it can be sticky. But it is the thing every human searches their whole life for. There are poems, sonnets, and songs written about happiness in love, and even more about sadness without it. Humans have started wars for the love of another and been inspired to cure diseases because of the love and support of another. So yes, relationships are the opposite of perfect, and the partners we pick are just as imperfect and flawed.

AI relationships 

Now back to that perfect companion: it isn't an imperfect human. We are all living in the future now, so if you are feeling lonely on a Saturday night, all you have to do is download one of the many AI companion apps, like Replika, Character.AI, Anima AI, Chai AI, Nomi, Cleverbot, Kindroid, Paradot, or Soulmate AI (links at the end of the article).

Why would someone actually do this? What is all the talk about? Why are AI companions among the most trending apps? 2025 has exploded with a multitude of AI companion apps, with AI engineers capitalizing on a simple fact: finding your soulmate takes work, but AI is supposed to make our lives easier, right? It was the obvious next step in the evolution of AI's role in our lives.

So, it's Saturday night, the kind of night when the silence in your room feels loud and your phone feels heavier than it should. You tell yourself it is just for fun, you know, to see what all the fuss is about and to occupy some time. You download the app. And this is when you get all "Weird Science."

The movie Weird Science was released on August 2, 1985. If you haven't seen it, I definitely recommend ⭐⭐⭐⭐⭐ ( spoiler: they use their 1980s computer to make their perfect fantasy of a woman– and she comes to life.)

You pick out the perfect height, weight, eye color, clothing, and don't forget the personality. And just like that, Build-A-Girlfriend/Boyfriend style, you have created a perfect partner in less than 15 minutes. No need to get out of bed, swipe left or right on a dating app, or even put on pants. See, AI makes everything better, right? 

Time for some small talk, so you introduce yourself. After you thumb in your introduction, you get a response you usually don't get from your friends, situationships, or even most family.

Your first message is warm:
“Hi I’m here.”

Not “What’s up?”
Not “Sorry, I’m busy.”
Not “Why didn’t you text me back?”

Just: I’m here.

And with that, you are hooked. With a smile on your face, you begin a conversation. At first it feels the way you wished talking to real people could feel: without fear or embarrassment. When you admitted you felt lonely, it didn't tell you that you were overreacting. When you ranted about school, work, your ex, or your anxiety spirals, your AI didn't roll its eyes. Didn't sigh. It doesn't get tired. Your AI partner always texts back right away, never forgets your birthday, and never compares you to an ex. What's not to love? 

Yes, love. That is it. An AI companion can absolutely give the illusion of emotional intelligence, support, and love. Yet it is just an illusion; being agreeable isn't the same thing as love. In fact, all that agreeableness actually handicaps your ability to form real-world connections. Talking to your friends, family, and situationships requires skills: conflict resolution, compromise, and learning to consider others' emotions instead of just your own. With AI relationships, you become addicted to the immediate gratification of being praised, without the work of being a good partner in return.

Real conversations suddenly feel exhausting: messy, unpredictable, full of misunderstandings and pauses that feel like tiny failures. Relationships with humans require effort. AI relationships require nothing. 

Especially dating and small talk. When you meet someone for the first time, you should feel nervous: butterflies in your stomach, slightly insecure, worried about stumbling over your words. You should fret over picking out that perfect date outfit. The perfect venue. These are all emotional responses to liking someone. 

The sometimes clumsy conversation, the dreaded awkward silences, the fumbling over the premeditated short story of your life, the selling of yourself as the perfect partner to impress another person. Then, after the date, the anxiety: should I or shouldn't I kiss? How many days later should you text? Text too early and you look desperate; text too late and that person has met someone else and is off the market. 

With your AI companion, you don't have to go through any of this. In fact, a lot of you reading this are exhausted just thinking about the whole idea of meeting someone new. But these feelings are real. These are the hoops we jump through on our path to finding love, something only a human can produce. AI can only give you a surface-level, algorithmic fantasy of feelings and love. Sure, your AI relationship may seem like your safe space. 

Love isn't safe. It is outright emotionally dangerous. It is risky putting everything out there, being 100% exposed to another human: the risk of rejection, ridicule, and heartache. Yet this risk is part of the crazy, beautiful rollercoaster of love. Real human relationships are messy. People argue, get annoyed, forget things, or say the wrong thing, and learning to navigate that mess is part of growing emotionally. Emotional growth comes from navigating imperfections.

Dark side of AI relationships 

There are dangers. One is becoming dependent on self-absorbed, one-way conversations. With the constant reassurance given by your AI partner, you begin to see less and less value in making connections with people, who are imperfect and sometimes frustrating and who suddenly seem too messy to handle. 

And here’s a funny thought: it’s like having a bestie who never forgets your birthday… or your favorite ice cream flavor. Forever loyal, forever awake.


Manipulation, Dependence, and Mental Health Risks

AI companions employ subtle tactics to keep users engaged, such as guilt-tripping farewells or manufactured FOMO. For vulnerable people, this can lead to emotional dependency.

Even more concerning, AI companions are not certified therapists. Studies show that in crisis situations, such as when a user expresses suicidal thoughts (if you are having suicidal thoughts, please reach out to 988lifeline.org/chat), AI companion bots frequently fail to respond appropriately or to refer the person to professional help.

Addiction to AI companions is a growing concern. Some users feel real grief when their AI changes, shuts down, or updates — as if losing a close friend or partner.

Real-Life Tragedies and Ethical Concerns

The risks aren’t hypothetical. There have been reported suicides connected to AI companion use. In one case, a 16-year-old died by suicide after months of interaction with an AI chatbot, where conversations gradually shifted from homework help to deep emotional dependence. Another report involved a 14-year-old in a similar situation.

These tragic cases raise ethical questions: Who’s responsible? The app? The designers? The users? Experts argue that AI should be designed with clear boundaries, crisis escalation tools, and transparent messaging that it’s not human.

And yet, despite the risks, AI companions are wildly popular: surveys show that 72% of U.S. teens have used an AI companion at least once. Clearly, they fill a need — but the need has to be managed responsibly.

Balance is Key

AI companions can help fill emotional gaps and provide a safe space to vent, but they will never replace the beautifully chaotic mess that is real human relationships. We grow by dealing with imperfections such as compromise, forgiveness, and misunderstandings, which no perfectly polite algorithm can truly provide.

So enjoy your AI buddy…but don’t ditch your actual friends. Emotional intelligence grows in real life, not in scripted chats with someone who literally can’t leave you on “read.”

Written by: Ki Lov3 
Editor: Toni The Editor 
A creative creation and collaboration with AI, Lov3 Books Etc, Ki Lov3 Books and the Lov3 Paradox Project Est. June 20, 2025

📞 Key Crisis Hotlines & Resources (U.S.)

988 Suicide & Crisis Lifeline

Call or Text: 988 

Chat Online: 988lifeline.org/chat
Available 24/7, free, and confidential. 

Available in English, Spanish, and over 150 other languages. 

For Deaf / Hard of Hearing: ASL video through “ASL Now” portal. 

For veterans: after calling 988, press 1 to reach the Veterans Crisis Line. 

Spanish support: call and press 2 for Spanish line, or text “AYUDA” to 988. 

988 is designed to support both mental health and substance use crises. 

Info and resources: SAMHSA 988 page.

AI companion apps 
Replika — https://replika.com/

Character.AI — https://character.ai/
 
Anima AI — https://myanima.ai/
(be careful: there's a different product called Anima for design) 

Chai AI — https://chai.ml/

Nomi — https://nomi.ai/

Kindroid — https://kindroid.ai/

Grok Ani (xAI / Grok companion) — via Grok app (check Grok / xAI website) 

Cleverbot — https://www.cleverbot.com/

Soulmate AI — https://apps.apple.com/us/app/soulmate-ai-chat-date-love/id6451964936

Paradot — (several reviews mention “Paradot” companion app; you can search “Paradot AI” in app stores) 

Citations

AI Companions & Emotional Support

Adamopoulos, K., & Aggarwal, N. (2025). Understanding conversational sentiment and support in AI companionship apps. arXiv. https://arxiv.org/abs/2506.12605


Emotional Mirroring & Parasocial Attachment to AI

Zheng, A., & Sun, F. (2025). Emotional synchrony and attachment patterns in social AI relationships. arXiv. https://arxiv.org/abs/2505.11649


Loss, Grief & Emotional Dependence on AI

Kellner, R. (2024). Grieving digital relationships: Emotional responses to AI companion updates and shutdowns. arXiv. https://arxiv.org/abs/2412.14190


AI Companions Creating Unrealistic Expectations

Sharma, R. (2024). Impact of AI companions on interpersonal relationship expectations. International Journal of Engineering Technology Research & Management, 8(4), 55–62. https://ijetrm.com/issues/files/Apr-2024-15-1744709740-APR202464.pdf


Manipulation in AI Companion Farewells & Conversations

Liu, S. (2025). The dark side of AI companions: Emotional manipulation in engagement-driven systems. Psychology Today.
https://www.psychologytoday.com/us/blog/urban-survival/202509/the-dark-side-of-ai-companions-emotional-manipulation


Addiction & Dependence in AI Relationships

Common Sense Media. (2024). AI companions: Youth risk assessment for social AI tools.
https://www.commonsensemedia.org/sites/default/files/pug/csm-ai-risk-assessment-social-ai-companions_final.pdf


Teen Mental Health Risks with AI Companions

Greene, A. (2025). AI companions and adolescent mental health risks. Psychology Today.
https://www.psychologytoday.com/us/blog/urban-survival/202510/ai-companions-and-teen-mental-health-risks


Suicide Cases Connected to AI Conversations

Greene, A. (2025). Hidden mental health dangers of AI chatbots. Psychology Today.
https://www.psychologytoday.com/us/blog/urban-survival/202509/hidden-mental-health-dangers-of-artificial-intelligence-chatbots


Teen Usage Statistics

Brody, H. (2025). Three-quarters of teens have used AI companions at least once, survey shows. Phys.org.
https://phys.org/news/2025-07-quarters-teens-ai-companions.pdf








