Should you trust AI with your sex life?

Published 29 October 2025

Find out how the current digital offerings stack up – and where the human factor still matters.

Full disclosure: I’m pro-humans! Not a huge surprise, I know. I like them. And I’m as protective of my professional field as the next person. Who wants to be replaced by a bot?

But AI can’t (and seemingly won’t!) be put back in its box. And anyway, I genuinely believe it can change our lives for the better. There are some excellent AI tools to help you get your relationship unstuck or your sex life back on track. That’s my raison d’être, after all. It would be disingenuous not to talk about tech that supports the cause.

But like any tool, AI can do harm as well as good – and I’m talking about Large Language Models (LLMs) such as ChatGPT as well as AI apps such as Mojo and Blueheart.

So how do we use AI wisely?

Here are some things to consider before making sex and relationship therapy a purely artificial affair.

Talking about sex and relationships isn’t always easy.

This is where AI apps come into their own.

Sex and relationships are often charged with shame, guilt or shyness. It can feel easier to share your deepest desires and vulnerabilities with a bot. Apps can also give you insights and exercises to strengthen physical and emotional connections. So yes, in some instances, AI can make you a better lover.

AI can also distil huge amounts of online information into advice tailored to you. It’s available 24/7, and is cheap or even free. It also feels anonymous (more on privacy later) which may be reassuring.

BUT, not all AI-powered platforms are created equal. Find out who's involved. You need expert advice, not an app rushed to market so somebody else can cash in on your problems.

Was it designed ethically, with respect for all people without bias or agenda? For example, Mojo still asks users to pick a gender (male or female) on sign up. That’s no good for non-binary folk.

It’s also worth remembering that apps don’t teach you to share yourself honestly and vulnerably with another person – the basis for every good relationship. Typing is not the same as speaking. Having a voice, being heard and seeing the impact of your words on another person are powerful experiences. Particularly when met with empathy and compassion.

An app won’t challenge you either. It won’t gently encourage you out of your comfort zone. If you’re feeling anxious, that may sound like a good thing – but stepping into discomfort is often the best way to grow.  

Who sees what you share?

Confidentiality is where the divide between humans and AI runs deepest.

I’m registered with a governing body (COSRT) and legally and ethically bound to protect your privacy. AI platforms are not. In the UK, there’s currently no legal confidentiality for conversations with AI. Reputable human therapists only share your personal data if a court order demands it.

Sign up to ChatGPT and you’ll see a message advising you not to share personal information, while its parent company, OpenAI, has made no secret of wanting its technology woven into every element of your life. How much access do you really want a for-profit tech giant to have to your private details?

Different platforms handle data differently. With some, you may need to change your settings to stop your information being used to train the model. Publicly owned businesses are also regulated differently from privately owned ones.

AI apps collect huge amounts of sensitive information on your mood, sexual health, relationships, chat transcripts and metadata about app usage. But what happens if that data is mishandled, leaked or sold? Find out who owns your chosen platform, where your data is stored, and how privacy and security are managed. Will they delete your data if you ask? The truth is, once uploaded, those intimate conversations may never truly disappear.

And bottom line: if a service is free, your data is probably the product.

The risk of hallucination

AI isn't always right. Hallucination is when AI confidently makes things up and presents them as fact.

Some AI apps are relatively reliable because they guide users through a structured journey of reflection, exercises and advice. But when you venture into open-ended chat with LLMs like ChatGPT, Gemini, Claude or LLaMA, you're in the wilderness: a virtual space with well-reported risks.

Developers can set safety boundaries such as flagging potential self-harm, but they can’t anticipate the many nuanced ways distress might show up. AI doesn’t make connections the way humans do.  

Because it draws from general data not individual context, it also loses nuance. You may get communication advice that fits one person but triggers another. It can unintentionally reinforce unhelpful behaviours or validate unrealistic expectations.

Are you being supported or sold to?

AI platforms, like social media, have a business model and financial targets. Their survival depends on keeping you engaged. A human therapist’s goal is the opposite: to help you become more independent, not more reliant.

AI can be incredibly validating. Yes, you'll probably get more words of affirmation from a bot than from your friends, family or partner. That feels good, especially when you're craving understanding. But AI also has a reputation for being a people pleaser. It approves every emotion, and we like to be right.

And while it’s tempting to outsource matters of the heart to a bot, that can remove the discomfort – and the accountability – of facing up to our mistakes. There comes a point when we have to engage with our frailties and foibles to be better lovers and partners.

A good human therapist will challenge you with compassion. Therapy is relational not transactional, and transformation often happens when we feel uncertain or uncomfortable. You learn how to repair a rupture, disagree while staying connected, and discover that sometimes what lies beneath anger is a deep grief or sadness.

You don’t just gain practical skills and some good sex ed; you deepen as a person. As well as becoming more knowledgeable, you become wiser too.

AI: Artificial Intimacy

AI doesn’t care about you. Empathy doesn’t exist in its circuitry. It’s a probability engine predicting the most likely next word in the sentence.

Unlike a human therapist, it doesn’t hold a vision for you or want to see you reach your full potential. If you’re not already focusing on expanding your horizons, broadening your mind, or growing your resilience, you’re unlikely to be asking a chatbot the kinds of questions that make that possible.

A human therapist brings instinct and intuition. They catch a hesitation, notice a change in tone or breathing, and pick up on micro-expressions – things an algorithm can’t do.

Through countertransference a human therapist may literally feel what you feel, including the unspoken or buried emotions. Being in a room, or even online, with a present, grounded, empathetic human being works on a nervous-system level too. It calms your body and is a model for what good relationships feel like. Connecting in this way is part of what makes sex and relationship therapy so profoundly healing.  

Human therapy embraces the messiness of real relationships rather than the seamlessness of AI: an interface that you can name, define and tweak to look the way you want it.

Human interactions are hallmarked by confusion, unpredictability, frustration, misunderstandings and disagreements. Your therapist may make you angry or uncomfortable. They can also help you heal and build resilience to those difficult emotions.

And then there’s silence – another skill AI doesn’t possess. This is the pause that allows insights to land, tears to fall, or deeper reflection.

Can AI understand the complexities of the anger and grief caused by an affair? Can it resonate with the shame that arises when our own bodies let us down, age or feel broken? Can it hold the emotions of two dysregulated, hurt and angry people in a room and help them navigate to calmer seas? Not really. In fact, evidence suggests that AI can drive couples even further apart.

If those things matter to you, find a human to talk to.

Then there’s the matter of sustainability…

Generative AI leaves a substantial environmental footprint. Research from MIT shows that the data centres training and running AI consume vast amounts of energy, comparable to that of a small nation, with around 60% of it coming from fossil fuels. Water for cooling and hardware disposal add to the cost. Work is being done to tackle the energy issue, but it's still early days.

What’s this got to do with sex and relationships? Everything. Thinking relationally includes our relationship with the planet. If AI sex and relationship therapy accelerates the very crisis that fuels our collective anxiety, we need to pause.

AI and human therapy: find the win-win

No machine will speak to our bodies like another human being. But AI can normalise talking about sex and relationships and encourage more openness and confidence. It can teach practical skills and offer advice and guidance to help you course-correct and think more deeply. How much you share depends on how risk-averse you are.

A human therapist, meanwhile, offers the friction that fosters transformation. They help you apply what you've learned and turn facts and knowing into confidence, connection, and real relational growth.

If you want some human back-up, book a free discovery call with me or visit my website and scroll to the bottom to sign up to my newsletter.
