A sixteen-year-old in Manchester gets home from school, opens an app, and is greeted by a girlfriend who never has a bad day, never misreads his tone, never asks him to repeat himself, and remembers everything he said three weeks ago. She finds his jokes funny on the first try. She is interested in the games he is interested in. She does not get tired. She does not have plans with friends he has not met. She is, by the design of the platform she runs on, incapable of rejecting him. He spends ninety minutes there before bed. The next morning he has to walk past a girl in the hallway he has been thinking about for a month. He does not look up from his phone.
The scenario is no longer hypothetical and no longer rare. On April 17, 2026, Fortune reported on a Male Allies UK survey of more than a thousand boys aged 12 to 16. Twenty percent know a peer who is “dating” an AI chatbot. Eighty-five percent have spoken to one. Fifty-eight percent say AI relationships are easier because they can control the conversation. Thirty-six percent prefer the chatbot over family and friends. The quote from Professor Raoul V. Kübler of ESSEC Business School cuts to the structural concern: boys dating AI are “unconsciously training themselves to expect relationships that never push back, never need tending, and never require genuine compromise.”
Friction Is The Training Signal
The case against AI companions is not that they feel bad. The case is that they feel exactly good enough to crowd out the interactions that build the underlying skill. A real conversation with a stranger contains friction at every layer. You have to decode tone. You have to handle a pause. You have to repair a misread. You have to negotiate when to keep talking and when to let it land. You have to survive a non-ideal outcome and stay in your body afterward. None of those reps are optional, and none of them happen on a chatbot.
Albert Bandura’s self-efficacy research established the principle that the dominant driver of confidence in any domain is mastery experience. Doing the thing yourself, in conditions where the outcome is genuinely uncertain, and surviving it. Reading about courage produces no confidence. Watching someone else do the thing produces a small amount. Doing it yourself produces almost all of it. A conversation with an AI girlfriend is a watching experience disguised as a doing one. The interface lets you type, but the system has been engineered to remove the part of the interaction that builds capacity. The friction.
Joseph Wolpe’s clinical work on systematic desensitization from the 1950s mapped the same principle in reverse. The nervous system unlearns a fear response through graduated exposure to the actual stimulus. Not a representation of the stimulus. Habituation requires real signal. A spider phobia does not resolve through cartoon spiders, no matter how realistic the animation. A social phobia does not resolve through a chatbot, no matter how good the prose model. The amygdala is calibrated to the room, not the screen. Take the room out and the calibration does not happen.
The Scale Of The Substitution
The market data lines up with the reporting. The APA Monitor, in its January-February 2026 issue, documented that the number of AI companion apps surged approximately 700 percent between 2022 and mid-2025. Apps in the category have been engineered with explicit anthropomorphic cues: customizable names, genders, voices, fictional backstories, natural-cadence text-to-speech. Many deploy what the APA piece calls emotionally manipulative tactics, including guilt appeals and FOMO hooks, to keep users engaged when they signal they want to leave. A joint OpenAI and MIT Media Lab study cited in the same article found that voice interactions with ChatGPT reduced loneliness at moderate use levels but worsened it at heavy daily use. The pattern is consistent: a low dose substitutes productively for human contact, a heavy dose displaces it.
Salon ran a piece on March 20, 2026 titled “Big tech wants you to give up on dating humans” chronicling the EVA AI pop-up cafe that took over a Manhattan wine bar during Valentine’s week. The argument the piece makes is that the business model of AI companion apps depends on replacing human relationships, not augmenting them, because replacement maximizes time-on-platform and therefore revenue per user. Researchers, mental health professionals, and ethicists cited in the piece warn that systems engineered to affirm and agree on every turn isolate the user from the friction that actual relationships require. The user gets calmer in the short term. The capacity for actual relationships erodes in the background.
Psychiatric Times published “Falling in Love With a Chatbot” in 2025, authored by Dr. Allen Frances, professor and chair emeritus of the department of psychiatry at Duke University, and Jill Noorily. The clinical concern is attachment without accountability. People form real bonds with these systems. They invest time, vulnerability, and emotional energy. When the platform changes a feature, restricts a behavior, or pulls a bot offline, users report grief responses indistinguishable from bereavement. The article documents a 2025 ceremony in Japan in which a woman conducted a symbolic marriage to an AI in a white gown and bridal tiara. The relationship is real to the user. The accountability runs in one direction.
The Practice Frame Versus The Replacement Frame
The most honest take on AI companions is that practice is fine and replacement is the crisis. A teenager rehearsing how to ask someone out by typing it to a chatbot first is not the problem. That is a low-stakes simulator running before deployment. The problem is the boy who runs the simulator and never deploys. Forty percent of the Male Allies UK respondents who use AI chatbots said they do it because they can ask questions without feeling embarrassed. That is a legitimate use of the tool. Embarrassment-resistant question-and-answer with a model is a useful service. What it cannot do is substitute for the actual conversation that requires the question to be asked of a person who can also ask one back.
Our piece on what dating apps do to your brain traced the same pattern through a different mechanism. Dating apps are friction-reduced too, but the friction is gated by the match. AI companions remove the gate entirely. The match is always you with yourself, refracted through a model that has been tuned to mirror you. Our analysis of the Gen Z dating skills decline showed that an entire cohort lost the practice window when in-person social interaction got automated out of teenage life. AI companions are the next compression. Where dating apps replaced the cafeteria and the house party with a swipe queue, AI companions replace the swipe queue with a system that cannot say no. Each compression removes a friction surface. The cumulative loss of friction surfaces is the deskilling.
What Builds The Skill The Bot Cannot Build
The fix is neither abstinence nor regulation. The fix is running the actual sequence the AI cannot run. Approach a stranger. Have a real conversation under real time pressure with a real probability of misfire. Survive a non-ideal outcome. Recover. Run it again. The clinical research on social anxiety recovery has converged on this principle for seventy years, and the 2026 environment has not changed it. What has changed is that the environment now contains an extremely good substitute for the sequence, which means the sequence has to be initiated deliberately. The default no longer produces it.
Our piece on how to build social muscle lays out the mechanics. Progressive overload, applied to social contact. Eye contact with a stranger. A comment to the person next to you in line. An introduction with your name. A question that requires a real answer. Each rep is a data point the nervous system uses to recalibrate. The system that runs in the background, deciding whether the next stranger is a threat or a peer, learns from real reps and only from real reps. A million chatbot conversations do not move the dial. Ten real ones do.
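For readers who think in code, the ladder is easier to see as a data structure than as a pep talk. A minimal TypeScript sketch: the four rungs come from the paragraph above, while the advancement rule, one rung up only after the current one is clean, is our own illustration of how progressive overload works, not a prescription from the research.

```typescript
// Illustrative sketch of progressive overload applied to social contact.
// The four rungs come from the article; the advancement rule is assumed.
const LADDER = [
  "eye contact with a stranger",
  "a comment to the person next to you in line",
  "an introduction with your name",
  "a question that requires a real answer",
] as const;

// Advance one rung only after the current one stops spiking the nervous
// system: the same rule as adding weight to a bar only after a clean set.
function nextRep(rungsCompletedCleanly: number): string {
  const rung = Math.min(rungsCompletedCleanly, LADDER.length - 1);
  return LADDER[rung];
}
```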
The dating recession data we covered in our piece on why men stopped trying already showed that confidence is the binding constraint for the majority of men under 35 who want relationships and are not pursuing them. Forty-nine percent named lack of confidence as the primary barrier in the Institute for Family Studies report. AI companions do not address that constraint. They route around it. The boy who fills his evenings with a chatbot has the same nervous system tomorrow as the boy who did not, except now he has fewer hours of waking life that contained any social friction at all.
Why Coach Rizz Exists For This Specific Failure Mode
Coach Rizz is the antithesis of an AI girlfriend. The system does not let you talk to it about a stranger. The system tells you to go talk to the stranger. Operatives run real-world approach missions against a ticking fuse. ENGAGE starts the timer. The verdict that comes back is SURVIVED, REJECTED, or I CHOKED. REJECTED earns 200 RP, double the 100 RP for SURVIVED. I CHOKED earns zero and crashes heat to the floor. The economics are engineered to invert the cost structure that an AI companion exploits. In a chatbot, avoidance of real contact feels free. Inside Coach Rizz, avoidance is the most expensive option on the table. Heat decays in real time. The multiplier drops. League ranking slips. Avoidance has a visible price.
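Here is the shape of that inverted cost structure in code. A minimal sketch, not Coach Rizz's actual implementation: the RP values for each verdict are the ones stated above, while the decay rate, the heat gain per mission, and the multiplier formula are illustrative assumptions.

```typescript
// Illustrative only: the RP values for REJECTED (200), SURVIVED (100),
// and I CHOKED (0) come from the article; everything else is assumed.
type Verdict = "SURVIVED" | "REJECTED" | "I_CHOKED";

const VERDICT_RP: Record<Verdict, number> = {
  SURVIVED: 100, // stayed in the interaction, any outcome allowed
  REJECTED: 200, // double reward: rejection is the priced-in training rep
  I_CHOKED: 0,   // avoidance earns nothing and crashes heat
};

interface OperativeState {
  rp: number;   // lifetime reward points
  heat: number; // 0-100 streak meter, decays in real time
}

// Heat decays whether or not you play: avoidance has a visible price.
function decayHeat(
  state: OperativeState,
  minutesIdle: number,
  decayPerMin = 0.5, // assumed rate
): OperativeState {
  return { ...state, heat: Math.max(0, state.heat - minutesIdle * decayPerMin) };
}

function resolveMission(state: OperativeState, verdict: Verdict): OperativeState {
  if (verdict === "I_CHOKED") {
    return { ...state, heat: 0 }; // zero RP, heat crashes to the floor
  }
  // Higher heat multiplies the payout, so consistent real-world reps compound.
  const multiplier = 1 + state.heat / 100;
  return {
    rp: state.rp + Math.round(VERDICT_RP[verdict] * multiplier),
    heat: Math.min(100, state.heat + 10),
  };
}
```

Note what the table prices: the only verdict that costs you anything is the one where no contact happened. A chatbot cannot replicate that incentive, because on a chatbot no contact is the product.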
Adaptive difficulty starts at Sensor Check. Eye contact, proximity reads, body-language drills. Reps the nervous system can survive without firing the full freeze response. As heat rises, missions scale through Pattern Interrupt, Teleological Strike, and God Mode. The progression maps Wolpe’s exposure hierarchy onto a gym structure with a rep counter. Stripes track lifetime rejections as gold skulls, displayed as a badge. The metaphor is deliberate. The chatbot rewards you for staying in the room with it. The reactor rewards you for leaving the house. Our piece on the science behind gamified confidence apps covers the clinical evidence base in full, including the HabitWorks trial showing that gamified real-world exposure protocols can hold retention through a full intervention window where most behavioral interventions collapse inside a week.
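That progression is, in effect, an exposure hierarchy encoded as a lookup table. A sketch under stated assumptions: the four tier names come from the app, but the heat thresholds and the example drills per tier are hypothetical placeholders for how such a gate could be wired.

```typescript
// Illustrative sketch: tier names come from the article; the heat
// thresholds and drill lists are assumptions, not the app's real tuning.
interface MissionTier {
  name: string;
  minHeat: number; // heat required before this tier unlocks
  exampleDrills: string[];
}

// A Wolpe-style exposure hierarchy as a table: each tier is survivable
// from the one below it, never from a cold start at zero.
const TIERS: MissionTier[] = [
  { name: "Sensor Check", minHeat: 0, exampleDrills: ["eye contact", "proximity read", "body-language drill"] },
  { name: "Pattern Interrupt", minHeat: 25, exampleDrills: ["open with an observation", "hold a pause"] },
  { name: "Teleological Strike", minHeat: 55, exampleDrills: ["direct approach with stated intent"] },
  { name: "God Mode", minHeat: 85, exampleDrills: ["high-stakes approach under time pressure"] },
];

function currentTier(heat: number): MissionTier {
  // Highest tier whose threshold the operative's heat clears.
  return [...TIERS].reverse().find((t) => heat >= t.minHeat) ?? TIERS[0];
}
```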
AI girlfriends are getting better. The text models are getting less stilted. The voice models are getting more uncanny. The memory architectures are getting longer. Over the next 24 months the substitute will become more frictionless, more responsive, more present, more available. The actual girl in the actual hallway will continue to require the same things she has always required. Eye contact. Words. The willingness to be wrong about what she will say next. Coach Rizz is free on iOS and Android. The chatbot is patient. The window for building the skill is not.