This article pulls together insights from a ZipHealth survey of over 1,000 people in the United States and Canada. It digs into why more folks seem willing to explore romantic and sexual interactions with AI and robots.
The survey highlights who’s open to these new forms of intimacy, what draws them in, and what risks might come with it. As tech starts to meet basic needs for attention and reassurance, the findings nudge clinicians, researchers, and policymakers to think hard about where intimacy is headed.
What the survey reveals about AI-driven intimacy
People seem pretty open to AI-based romance and sex. Around 19% of participants say they’ve already tried romantic or sexual chats with AI, and about 23% would consider sex with a humanoid robot.
This trend stands out even more among younger generations. It hints at a shift in how folks might negotiate intimacy in the future.
Many respondents say talking to AI feels easier and less awkward than talking to real people; in fact, 55% described AI conversations as more comfortable.
But that comfort comes with strings attached. AI can’t give real intimacy, independent opinions, or the kind of social tension that helps people grow.
There’s a big social risk too. About half of those who’d had AI interactions admitted hiding them from their partners, which could spell trouble for trust in relationships.
- About 19% have already interacted with AI chatbots romantically or sexually.
- About 23% would consider sex with a humanoid robot.
- Gen Z leads with roughly 26% openness; millennials trail at 19%.
- 55% find AI conversations less awkward than real conversations.
- Half hid AI interactions from partners, introducing secrecy and potential trust issues.
- 29% of women cite loneliness as a driver for AI relationships.
- About 75% doubt AI intimacy will be harmless for real-world relationships.
Gen Z leads the trend
The age profile paints a clear picture: younger people are more likely to embrace AI-driven intimacy. Gen Z (about 26%) is more willing than millennials (around 19%).
We might see digital companionship become even more normal as these generations get older. How will this affect dating culture or what we teach about consent and digital safety? It’s hard to say.
Benefits, appeal, and tradeoffs of AI intimacy
People who like AI connections mention convenience, nonjudgmental listening, and instant reassurance. For some, AI chatbots offer a private, low-pressure way to explore romantic or sexual ideas.
But these perks have a flip side. They’re at the center of debates about what real intimacy means now that machines can mimic it.
Why convenience doesn’t equal authentic connection
AI can fake warmth and responsiveness in conversation, but it can’t give real emotions or independent opinions. That leaves a gap.
Can AI interactions ever replace human relationships? Should they? It’s not clear, and honestly, it’s a little unsettling.
Risks to relationships and privacy
One big worry is how this could disrupt real-world relationships. Half of users hide their AI interactions, which could chip away at trust between partners.
There’s also the privacy angle. AI systems collect data to personalize chats, so if protections fall short, there’s a risk of breaches or misuse.
Loneliness as a driver and a warning
Loneliness stands out, especially for women—29% say it’s why they turn to AI. That’s a signal that AI companionship might be filling a gap, but it doesn’t really solve the bigger need for human connection, community, or support.
So, as we figure out what AI intimacy means, we’ll need to balance comfort with social development and safety. No easy answers here.
What this means for the future of intimacy
The survey points to a cultural crossroads. Technology is starting to meet basic human needs for attention and reassurance.
Will society redefine intimacy around entities that can’t truly feel or care? For scientists and policymakers, it’s a tough call—one that raises tricky questions about ethics, consent, transparency, and what happens to marriage and family down the line.
Guidance for researchers, clinicians, and policymakers
Recommendations for responsible development and study include:
- Run longitudinal studies to track how AI intimacy affects mental health, relationships, and social skills over time.
- Develop clear privacy standards and give users control over their sensitive data in AI-human interactions.
- Add ethics and consent education to digital literacy programs, emphasizing the difference between simulated companionship and real human connection.
- Support clinical approaches that address loneliness and social isolation, without leaning on AI as a replacement for real human contact.
Here is the source article for this story: Gen Z Won’t Stop Having Sex With AI Chatbots