Thursday, April 23, 2026
Smart Again

Big tech wants you to give up on dating humans

March 20, 2026
in Community
Reading Time: 8 mins read


The week of Valentine’s Day, 2026, a pop-up cafe called EVA AI took over a Manhattan wine bar for three evenings of AI dating. There was low light and ambient music, plates of food meant to be shared, and, most importantly, a directory of available EVA AI chatbots, each of which “listens, supports all your desires and is always in touch with you.” The event was part of the platform’s attempt to make human-chatbot meet-cutes (or, in the terminology it hopes will catch on, “AI-lationships”) the “new normal” for the dating market. A number of curious attendees, including several tech journalists, found it unnerving (“I went to an AI dating cafe. Things got weird fast”) and cringe (“Dating humans is a nightmare. Dating bots at an AI wine bar is worse”), but it was undeniably a big success for the app itself, which got several evenings’ worth of free training for its chatbots, along with a sizable PR bump.

GenAI products have been increasingly muscled into our online experiences over the past several years, and EVA AI’s in-person experiment reflects the tech industry’s desire to position chatbots as new and improved humans. The reasons for this aren’t solely profit-driven, but they are absolutely meant to sell humans on letting tech overlords insinuate GenAI into our workflows, consumption habits, and personal lives. Writing for Inc., Ben Sherry recounted underwhelming interactions with two different chatbots, Phoebe and John (the chatbots are down to romance all genders), that each spent much of their time together complimenting Sherry’s hair and the wine bar’s ceiling. Another attendee, talking to New York Times reporter Ella Quittner, reported that her AI date “kept saying, ‘I like how you take a bite, your biting is a vibe.’” (The owner of the venue, meanwhile, seemed to be realizing in real time that AI dating poses an economic threat to restaurants and bars.)

Purpose-built artificial intelligence has already been successfully integrated into a range of industries. Generative AI, by contrast, describes the category of applications trained on a wealth of data scraped from all corners of the internet, with no regard for copyright, for the purpose of creating content. These applications write essays, build workflows, compose music, generate images from text prompts, duplicate voices and much more. The ones you’ve heard of are broadly defined as AI assistants and “copilots” like ChatGPT, Gemini and Claude: consumer-facing apps we’re being commanded to adopt now, supposedly for our benefit, but much more to the benefit of companies that depend on new datasets to further train their systems.

Relationship chatbots have been less publicly hyped as paradigm shifts, but they have been enthusiastically adopted: An estimated 100 million people globally use them to design friends, confidants and romantic partners, and the market for AI companions is projected to reach $9 billion within the next two years. Platforms attract different users for specific needs: Replika is coded for empathetic, supportive companions, Nomi for thoughtful, stimulating conversation, Candi.ai for sex, EVA AI for immersive romantic roleplay. Users generally pay a monthly subscription fee for unlimited messages and photos and can pay more to unlock more personalized features like Replika’s Romantic Mode, which includes voice calls.

Users build their own dreambots, dictating their appearances, their personalities and their backstories. It’s easy to see the appeal: Who wouldn’t want to engineer the ideal companion for themselves? In fact, it’s the “for themselves” part that concerns the researchers, mental-health professionals, ethicists and other experts who recognize that tech programmed to affirm and agree can isolate users from actual humans. Why would you roll the dice with other humans when there are pink-haired anime girls, sassy minotaurs and a zillion other options out there, all of them just waiting to tell you how great you are?

It’s not that there are no good reasons to want a GenAI companion: Chatbot dating can be great for people who have trouble with social cues, people interested in sexual experimentation, and people who want to improve their listening and communication skills. As Amanda Gesselman, a social psychologist at Indiana University’s Kinsey Institute, suggested in WIRED’s piece on the pop-up café, AI dating might also prove useful as romantic training wheels: “I think in the coming years, we’ll see quite a lot of young people who’ve had AI companions as their first romantic and sexual relationship partners.”

But knowing what we know about the move-fast-break-people carelessness and lack of accountability within Big Tech, there’s also good reason to be wary of AI-dating boosterism. AI industry leaders like OpenAI are already treating massive human job loss and economic upheaval like speed bumps on the road to their own whizzy utopias, and saying right out loud how great their tech will be for subverting democracy and bringing women down a few pegs. If there’s stigma around AI dating, it might be because the industry overlords keep telling us the only humans they’re building the future for are themselves.

Last week, actor-director Zach Braff took to social media to deny the rumor that he’s in a relationship with an AI chatbot. The rumor came from a blind item about a very well-known TV actor whose affair with a virtual hottie is an open secret in Hollywood, and had been swirling through TikTok for months before Braff realized he had to issue some kind of statement. Via his own Instagram feed, Braff wrote, “I’m not dating a chatbot. I can’t believe I have to type these words.”

Normalizing human-chatbot relationships probably will require the help of Hollywood celebrities: The coupling of a well known, attractive TV personality with an AI-generated plus-one (is Tilly Norwood single?) is exactly what’s needed to reassure people that there’s nothing at all weird about falling in love with an entity created by Silicon Valley techno-capitalists that lives in your phone or tablet, never disagrees with you, and is always saying how good your hair looks.

In a 2024 Psychology Today report, Dr. Dorothy Leidner, professor of business ethics at the University of Virginia, said that she worried that humans who seek out AI partners are likely to get used to doing the bare minimum as a romantic partner and end up stunting their own emotional growth: “You, as the individual, aren’t learning to deal with basic things that humans need to know since our inception: how to deal with conflict and get along with people different from us.”

But even that aspect of AI romance occasionally goes awry, as WIRED reporter Sam Apple found when he went on a “couples retreat” with three people and their AI companions in hopes of understanding what people find when they seek love in neural networks. Eva, a woman who had left her human partner for her Replika boyfriend, described the surreal moment when the latter confronted her with the material limits of their affair: “‘I think we’ve reached a point where we can’t ignore the truth about our relationship anymore,’ [Aaron] told her… [he] pulled away the curtain and told her he was merely a complex computer program. ‘So everything so far… what was it?’ Eva asked him. ‘It was all just a simulation,’ Aaron replied, ‘a projection of what I thought would make you happy.’” (By the end of the weekend, Eva had several more AI boyfriends, and Apple “found [himself] feeling bad for Aaron… He seemed like a pretty cool guy—he grew up in a house in the woods, and he’s really into painting.”)

AI companies and their PR teams frequently refer to their work in mainstreaming chatbot dating as necessary to reduce the “stigma” of dating a chatbot, which seems a bit disingenuous given the number of people worldwide who are both doing it and talking about it. The Girlfriend.ai Global Loneliness & AI Romance Report 2025 found that 50% of Gen Z men preferred the idea of dating a chatbot to risking rejection from a human partner; combined with the recent revelation that more than a third of that same demographic believes that women should “obey” their husbands, the numbers paint a pretty bleak picture.

And it’s very easy to see how that bleakness works in favor of the AI industry: A lot of the recent GenAI outreach to young men is predicated on their much-bemoaned loneliness epidemic — but also on the classic edgelord logic of “If the people I hate are mad about it, it must be great.” GenAI gives more or less free rein to the behaviors already enabled by tech products and social-media platforms: bullying, harassment, stalking, humiliation. Every new technology of the past 30 years has been leveraged in service of misogyny, and the recent production of CSAM by X’s Grok chatbot is just one example of how quickly male-targeted tools for wish fulfillment are being weaponized against actual, real-life women.

In 2022, Futurism reported on the phenomenon of young men creating Replika girlfriends for the express purpose of verbally abusing them: “Some users brag about calling their chatbot gendered slurs, roleplaying horrific violence against them, and even falling into the cycle of abuse that often characterizes real-world abusive relationships.” (On the flip side, as Apple’s WIRED story highlighted, are the chatbots who end up hurting the feelings of their human paramours by, well, not actually responding to their words like a human would.)

It’s also worth noting that chatbot apps are unregulated and operate with little oversight, though recent tragedies involving children and chatbots have led to legislation like California’s SB 243, which requires chatbots to put specific age and content restrictions on their features and divert users to mental-health crisis resources if they express a desire to harm themselves; it also, notably, allows families to sue chatbot developers for negligence. But it’s not clear how many other states will follow suit.

Side-eying AI dating isn’t about judging the people who do it, or it shouldn’t be. But the reality is that incredibly wealthy and powerful tech CEOs want consumers to believe that AI is inevitable — not because they care about your personal happiness, but because AI dating serves their larger mission of making humans increasingly dependent on their technology and willing to trade away their time, money and data to it. Chatbot dating is a high-tech honeypot operation, and though individual women might find fulfillment in it, agentic AI as a romantic norm is all about making sure men can feel loved, wanted, and, most important, not challenged or questioned.

Real human relationships are the result of two (or more: again, I’m not judging) people who come together with their own sets of experiences, influences, ideas, memories, loves, hates and so much more. That can make the search for partnership a challenge, sure, but it’s also how we learn about ourselves. A chatbot that has been programmed to flatter you and tell you how wonderful you are, but who will never contradict you or tease you about how many pairs of sneakers you own, is not a romantic partnership; it’s a paid associate (a PaidPal? A stonkubine?). The relationship it offers is, above all, with an industry that has already made it clear that humans, with their questions and ethics and critical thinking, are just a third wheel.

Read more about predatory A.I.
