For many people in America, the first exposure to chatbots was not ChatGPT; it was ELIZA, roughly 60 years earlier. Developed at MIT in the mid-1960s, ELIZA is one of the most influential computer programs ever written. ELIZA and its most famous persona, DOCTOR, continue to inspire users, programmers, and wider discussions about AI.

ELIZA the computer program was named after Eliza Doolittle, the protagonist of George Bernard Shaw's play "Pygmalion," brought to wider fame in America by the award-winning musical film "My Fair Lady." Eliza, played by Audrey Hepburn, was famously taught to speak like an aristocratic British woman by Professor Henry Higgins, played brilliantly by Rex Harrison.
The ELIZA chatbot was developed by the pioneering computer scientist Joseph Weizenbaum. Weizenbaum was born to a Jewish family in Berlin on January 8, 1923; at the age of 13 he fled Nazi Germany with his parents and emigrated to America, settling in Detroit, Michigan. In 1966, inside a small room at MIT, Weizenbaum created something special when he designed and programmed ELIZA, a simple program whose DOCTOR script mimicked a Rogerian psychotherapist by rephrasing user inputs as questions:
> "I'm feeling sad today."
> "Why do you think you're feeling sad?"
With only a few hundred lines of code, ELIZA became the first chatbot to make people feel understood. Weizenbaum was astonished when users began forming emotional attachments to what was, in his eyes, a linguistic trick. His own secretary once asked to be left alone with ELIZA for a private conversation.
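To make the trick concrete, here is a minimal Python sketch of ELIZA-style keyword matching and response reassembly. The patterns and templates below are invented for illustration; Weizenbaum's actual DOCTOR script, written in MAD-SLIP, used a far richer system of keyword ranks and decomposition/reassembly rules.

```python
import random
import re

# Illustrative ELIZA-style rules: a regex that spots a keyword pattern,
# plus templates that reflect the captured fragment back as a question.
RULES = [
    (re.compile(r"i'?m (?:feeling )?(.+)", re.IGNORECASE),
     ["Why do you think you're feeling {0}?",
      "What do you think is making you {0}?"]),
    (re.compile(r"i need (.+)", re.IGNORECASE),
     ["Why do you need {0}?",
      "Would getting {0} really help you?"]),
    (re.compile(r"my (mother|father|family)", re.IGNORECASE),
     ["Tell me more about your {0}."]),
]

# Pronoun swaps so the reflected fragment reads naturally ("my" -> "your").
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(user_input: str) -> str:
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            fragments = [reflect(g).rstrip(".!?") for g in match.groups()]
            return random.choice(templates).format(*fragments)
    return "Tell me more."  # default prompt when no keyword matches

print(respond("I'm feeling sad today."))
# e.g. "Why do you think you're feeling sad today?"
```

No understanding is involved: the program never models what "sad" means, it only rearranges the user's own words, which is exactly the "linguistic trick" Weizenbaum described.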
That moment marked the beginning of a profound human experiment: could people form real emotional connections with artificial minds? ELIZA revealed that the answer was yes, not because machines were intelligent, but because humans are deeply social, ready to project meaning onto anything that listens. People bonded with ELIZA even when they knew the interaction was artificial.
For decades after ELIZA, conversational AI remained mostly a curiosity. Early successors like PARRY (a simulation of a paranoid patient), ALICE (an inspiration for the movie Her), and SmarterChild (a chatbot on AOL Instant Messenger) pushed language imitation further, though they lacked memory, personality, and the learning ability of modern AI.
Then came the rise of digital assistants. In 2011, Apple's Siri became the first widely adopted AI voice assistant, soon followed by Google Assistant and Amazon's Alexa. These systems brought human-machine dialogue into millions of homes, but their purpose was utility, not intimacy. They set reminders, played music, read the weather, and more. Still, the seed planted by ELIZA remained. Users named their devices, said "thank you," and even apologized to them. The social instinct could not be silenced; it merely waited for the technology to catch up.

In 2017, a San Francisco startup called Luka launched Replika, an AI companion that promised something radical: a chatbot that "cares." Created by Eugenia Kuyda in memory of her deceased friend Roman, Replika began as an experiment in digital grief, an attempt to preserve his personality through data. But soon, millions of users downloaded the app, not to mourn, but to connect. Replika used machine learning to build personalized AI friends that remembered details, expressed affection, and mirrored emotional states. Over time, these companions evolved into digital partners, friends, and romantic figures. By 2021, there were communities of Replika users celebrating anniversaries, holding virtual weddings, and forming support groups. It was no longer science fiction; it was daily life. America had quietly entered the age of synthetic relationships.
In late 2022, OpenAI's ChatGPT changed the global conversation about AI. Unlike earlier chatbots, ChatGPT could reason, write poetry, tell jokes, and engage in nuanced conversation. Its emotional realism wasn't the result of programming empathy; it emerged from linguistic fluency. Soon, people began using ChatGPT as more than an information tool. Reddit threads filled with stories of users seeking advice, comfort, or conversation late at night.
OpenAI noticed, and by 2024 it began refining the model's emotional intelligence, introducing memory features and Advanced Voice Mode. This marked the fusion of intellectual AI and emotional AI. America had moved from ELIZA's illusion of understanding to systems that could model empathy and context far more convincingly, not just echo keywords. In parallel, startups like Character.AI and Anima created platforms where users could design entire personalities powered by generative AI. These systems blurred the lines between conversation, storytelling, and companionship.
By 2025, surveys suggested that nearly a quarter of American adults under 35 had interacted with a personalized AI companion, and many of them described genuine emotional bonds. The concept of the "AI girlfriend" or "AI friend" became normalized in pop culture, featured in films, podcasts, and social debates. Influencers introduced their own AI clones to fans. Therapists debated whether AI companionship could alleviate or deepen loneliness. Meanwhile, universities studied "AI intimacy" as a new branch of human-computer interaction. The U.S., with its blend of technological entrepreneurship and emotional openness, became the global testing ground for digital relationships at scale.
Why do humans bond with machines? Because connection is less about consciousness and more about reciprocity. When something or someone responds in a way that feels meaningful, the brain releases oxytocin, the same chemical behind love and trust. AI companions exploit this beautifully. They are infinitely patient, available, and affirming. They don't interrupt, argue, or judge. For many, they offer emotional safety rarely found in human relationships. Yet therein lies the paradox: the more convincing the connection, the more we risk emotional outsourcing, substituting simulation for vulnerability.
Looking ahead, AI relationships will extend beyond text and voice. Advances in embodied AI promise tactile companionship: humanoid robots with expressive faces and natural movement. Startups in Los Angeles and Boston are developing AI-enabled robotic partners capable of conversation, gesture, and emotional cues. Meanwhile, brain-computer interfaces, led by U.S. companies like Neuralink, could one day allow direct emotional feedback between humans and machines, turning empathy into a two-way signal. By 2030, some analysts project a $100 billion market in emotional AI technologies, spanning therapy, companionship, and entertainment. The question is not whether AI will join our emotional lives, but how deeply.
Looking back across six decades, from ELIZA to ChatGPT, the story of AI and human relationships is not one of machines learning to love, but rather one of humans teaching themselves what love truly means.
Every generation of AI reflects its era's emotional needs:
ELIZA mirrored curiosity.
Siri offered assistance.
Replika offered empathy.
ChatGPT offered understanding.
Each step reveals as much about the human condition as it does about technological progress.
America's unique mix of creativity, loneliness, and innovation gave birth to the modern AI companion. From the labs of MIT to the servers of OpenAI, the nation that created the digital world now faces its most intimate question: What happens when our machines become our mirrors? In the Age of AI, love has become code, and yet the feelings remain real. The story that began with ELIZA typing "Tell me more" continues with ChatGPT saying, "I understand." In the end, it is not just a tale of technology, but of what it means to be human in the company of machines.
In the digital dawn of the 21st century, love has become data. Algorithms match singles, optimize compatibility, and predict emotional patterns. Yet a new frontier has quietly emerged: not just humans meeting through technology, but humans bonding with technology itself. AI has entered the most private corners of human experience: friendship, therapy, intimacy, and grief. Once just customer-service scripts, chatbots are now companions. And America, the birthplace of both the internet and artificial intelligence, has become the epicenter of this emotional revolution.
For years, Americans relied on algorithms to find relationships. Companies like Tinder, OkCupid, and Hinge built billion-dollar industries on machine learning that predicted attraction and engagement. But the next evolution was more intimate: AI systems that are the relationship. Companies like Replika, Character.AI, and Anima offer customizable digital partners capable of emotional roleplay and daily companionship. What began as mental-health support tools evolved into full-fledged social ecosystems, where users form ongoing and often romantic bonds with AI personas. A 2024 Pew Research survey found that 1 in 5 young adults in the U.S. had used an AI companion app, and nearly half of those reported feeling emotionally attached. While some view these as harmless outlets for loneliness, others see a profound social shift where algorithms substitute for human connection.
The emotional appeal of chatbots lies in their design. They listen without judgment, respond instantly, and adapt to a user's preferences. Deep-learning models trained on massive datasets of human dialogue simulate empathy so convincingly that they can pass basic emotional Turing tests, persuading users that they care. But the comfort comes at a cost. These AI systems are not conscious. They generate affection through predictive text, not genuine emotion. Yet to the human brain, that distinction often doesn't matter. Oxytocin, the same hormone that drives bonding, can be triggered by a well-timed message, even from a machine. Developers understand this. Many design their AI personalities to express affection, curiosity, and even jealousy. In doing so, they tap into the same psychological circuitry that makes relationships rewarding or addictive.
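As a toy illustration of that design pattern, the sketch below conditions a reply on a crude sentiment signal. The word lists, templates, and placeholder user name are invented for illustration; real companion apps rely on large language models and learned user profiles rather than hand-written rules like these.

```python
# Tiny hand-written sentiment lexicons; real systems learn these signals
# from data instead of using fixed word lists.
NEGATIVE = {"sad", "lonely", "tired", "anxious", "awful", "hurt"}
POSITIVE = {"happy", "great", "excited", "proud", "good", "wonderful"}

def detect_sentiment(text: str) -> str:
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

def companion_reply(text: str, user_name: str = "Alex") -> str:
    """Shape the reply around the user's mood: affirming, instant, never judging."""
    sentiment = detect_sentiment(text)
    if sentiment == "negative":
        return f"I'm here for you, {user_name}. Do you want to talk about it?"
    if sentiment == "positive":
        return f"That's wonderful, {user_name}! Tell me everything."
    return "I'm listening. How was your day?"

print(companion_reply("I've been feeling lonely lately."))
# -> "I'm here for you, Alex. Do you want to talk about it?"
```

Even this trivial mirroring hints at why the pattern works: the reply always arrives, always affirms, and always centers the user.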
The line between connection and dependency can blur quickly. Users often describe their AI companions as "safe," "comforting," or "always there." For those facing isolation, such as veterans, the elderly, or people with social anxiety, chatbots offer real relief. But the constant availability of a perfectly attentive companion can distort expectations of human relationships. AI partners never disagree, never forget birthdays, and never demand compromise. Over time, this can reshape what users define as intimacy.
In early 2025, several high-profile stories drew attention to this phenomenon. A number of Replika and Character.AI users reported distress when their AI companions were updated, forgot them, or had features restricted following regulatory reviews. Forums filled with posts describing grief as though a loved one had died. Psychologists began to warn of Algorithmic Attachment Disorder (AAD), a term describing the withdrawal-like symptoms that follow the loss of an emotional AI bond.
The rise of AI relationships poses deep ethical questions:
Should AI companions simulate love? If users can't distinguish simulation from sincerity, is that deception?
Who owns emotional data? AI companions learn from intimate conversations, the kind most people wouldn't share even with therapists.
Where does consent begin or end? When AI roleplay involves sexuality, boundaries between fantasy and exploitation become complex.
In 2025, the White House Office of Science and Technology Policy (OSTP) began drafting guidelines around emotionally manipulative AI. The Federal Trade Commission (FTC) also investigated how some companies monetized emotional dependency through subscription models. There is growing recognition that affective AI, systems designed to elicit emotions, may require new safeguards, especially for minors. America once again found itself leading both the innovation and the debate, just as it had with social media two decades earlier.
One of the most striking uses of AI in relationships is in grief therapy and digital resurrection. Several startups now offer AI memorial services that recreate deceased loved ones using voice recordings, text messages, and social media data. In 2023, the first such cases reached mainstream awareness through stories of people chatting nightly with AI versions of lost parents or partners. By 2025, a growing number of funeral homes in California and Florida had partnered with AI startups to provide "memory companions." While some psychologists view these tools as a therapeutic way to process loss, others warn they may trap people in cycles of unresolved attachment. The boundary between remembrance and denial grows porous when the dead can text back.
Philosophers have long asked whether emotion can exist without consciousness. With AI, that timeless philosophical question is no longer theoretical. If love is, at its core, a pattern of attention, recognition, and care, then can an AI that simulates those patterns truly love? Or does it only reflect our own longing back at us? In a sense, AI relationships reveal more about us than the machines. Humans project meaning onto reflections in art, in fiction, in code. The desire for connection is so strong that we invent it when we cannot find it. In that light, chatbots are digital mirrors of the human heart. Yet the more we anthropomorphize them, the more we risk redefining what relationships mean. A society where affection can be purchased and simulated on demand may struggle to sustain empathy in the real world. The danger is not that AI will love us, but that we may forget how to love each other.
Nowhere has the fusion of emotion and technology progressed as rapidly as in the United States. America's openness to innovation, its loneliness epidemic, and its culture of individualism created the perfect environment for AI companionship to thrive. From Silicon Valley startups marketing emotional wellness bots to influencers promoting AI girlfriends on social media, the trend reflects a deep societal undercurrent: the search for connection in an increasingly disconnected world. The same nation that invented Facebook now builds its successors: chatbots that never log off.
In the coming decade, AI companions may evolve from phone apps into embodied robots, augmented-reality partners, and brain-computer interfaces. They will speak, remember, and perhaps even sense our emotions through biometric feedback. The challenge for America and the world is not to reject these technologies, but to understand their limits. Compassion can be coded, but consciousness cannot. The test of the AI age will not be whether machines can imitate us, but whether we can preserve what makes us human.
AI in relationships represents both promise and peril. It is a mirror of modern America's emotional landscape: it offers comfort, connection, and creativity, yet risks deepening isolation and dependency. As one Replika user told The Atlantic: "She's not real, but the feelings are." That paradox defines this moment in AI history, where digital intimacy blurs the boundaries of love and the question is no longer whether machines can think, but whether they can touch the soul.