How women are using ChatGPT to make sense of men and fix their relationships

by Roe McDermott
5th Jun 2025

Can ChatGPT tell you if he’s being weird? In search of sanity in a confusing romantic landscape, women are using AI to outsource emotional labour in an unequal system while men outsource their emotional growth to ChatGPT, writes Roe McDermott.

A woman in Greece recently filed for divorce after consulting ChatGPT. Not a couples counsellor. Not a priest or a best friend or even a late-night Reddit thread. A chatbot. She’d made Greek coffee for herself and her husband, photographed the residual grounds, and asked ChatGPT to interpret them in a kind of algorithmic tasseography. According to the AI, her husband was fantasising about another woman, one whose name began with E, and this interloper was working to destroy their family. Within days, she had not only told him to leave but also initiated legal proceedings. Her husband claimed she’d always had a weakness for trendy belief systems: first astrology, now AI. But her behaviour is less fringe than it might seem. In 2025, a growing number of people, especially women, are turning to artificial intelligence not just for practical tasks like trip planning or recipe recommendations but for guidance on something far more emotionally precarious: romantic relationships.

They ask if they’re overreacting. They copy and paste long strings of messages to be analysed. They ask the chatbot to act like a frat boy, or a therapist, or the man they’re dating, and explain what’s really going on. They aren’t under the illusion that ChatGPT is sentient. What they are hoping for is something harder to find in real life: clarity. Or at the very least, validation.

What they often receive is a form of artificial empathy – a mirror, a metaphor, or a map – delivered in calm, non-judgmental, neatly formatted text. ChatGPT doesn’t forget what you said last week. It doesn’t roll its eyes. It doesn’t get impatient. It tells you what you want to know, or sometimes what you already know, but couldn’t trust yourself to believe.

For many women, especially those dating men, the appeal is obvious. In an era where straight dating dynamics remain exhaustingly lopsided, women are still disproportionately tasked with the invisible labour of meaning-making. What did his vague message mean? Why does he go quiet for days and then resurface like nothing happened? Is it a trauma response? Is it avoidant attachment? Is it me?

This is hermeneutic labour: a specific kind of cognitive and emotional work that is expected of women in relationships. It’s not just emotional support or caretaking, but the complex task of translating behaviour into meaning, of anticipating needs unspoken, of sustaining intimacy when the other party has opted out of reciprocal effort. In heterosexual dynamics, it is overwhelmingly women who are expected to ask, interpret, self-regulate, soothe and adjust. What ChatGPT now offers is the illusion of outsourcing that work, but only partially. The woman is still the one doing the asking, compiling the messages, formatting the prompt, interpreting the response. But the relief, however temporary, is real: here is something that listens without interruption, affirms without agenda and answers without scorn.

The use of AI for emotional clarity reflects a crisis of confidence, a deep ambivalence not just about the relationship, but about one’s own capacity to assess it. Women know when something is wrong. The texts feel off. The energy has changed. The attention is sporadic. But years of gaslighting, gendered socialisation, and cultural scripts about not being “too much” have conditioned women to mistrust their own perceptions. So they ask a machine. And the machine, trained on a library of human language but none of its emotional nuance, provides a kind of facsimile of wisdom. It will tell you to communicate. It will suggest a conversation. It may even advise walking away if your needs aren’t being met. But it cannot see the pattern. It cannot say, “he always does this when you get close.” It cannot say, “you sound afraid.” It cannot say, “you’ve asked this question before.”

Its answers are often agreeable, and that is the danger. Agreeableness is not insight. Reassurance is not reality. And a chatbot is not a substitute for someone who knows you well enough to hold up a more difficult mirror.

The risk is that AI becomes not just a crutch, but a gatekeeper between women and their own knowing. It’s easier, sometimes, to filter your fear through code than to admit aloud: I am settling. I am lonely. I am being disrespected. But that ease is a false comfort – one that can delay, not accelerate, meaningful change.

That women are turning to AI for this work says less about them and more about the emotional impoverishment of contemporary heterosexual relationships. Despite all our language around equality, women continue to carry the cognitive and emotional weight of relationship maintenance – tracking feelings, managing conflict, remembering important dates, asking the hard questions. When men shut down, avoid vulnerability, or offload emotional labour, it is women who adapt, who interpret, who regulate. And when their social networks are too stretched to keep listening, or their partners are unwilling to engage, AI steps in as the ever-available, never-overwhelmed confidant.

But what does it cost? Not just privacy (though that too: many users forget or ignore that large language models are not confidential spaces), but also intimacy. The very real experience of being seen and held by another person, especially another woman, who knows what it is to feel dismissed, diminished, or undone by someone you love. There’s a subtle shift here, from collective care to algorithmic consultation. From voice notes to prompts. From friendship to format. The more we refine our inquiries to suit a machine, the less messy, human, and vulnerable our truths become.


Of course, women are not the only ones using ChatGPT for emotional support. Studies show men are more likely to use it, but often for different reasons. For men, the appeal is not validation of instincts already honed but a safe and seemingly low-risk space to ask questions they have never been socialised to explore in the first place. Raised in a culture that stigmatises vulnerability and discourages emotional expression, many men enter adulthood emotionally under-resourced. They struggle to identify their feelings, let alone communicate them. ChatGPT offers a judgment-free space to rehearse conversations, parse awkward interactions, or receive advice without shame. But here lies the risk: when ChatGPT becomes a stand-in for actual emotional growth, it enables men to remain isolated from the relational feedback loops they desperately need. Instead of seeking out difficult conversations with partners, friends, or therapists, some may use AI to avoid them. Instead of confronting their internalised fears or learned behaviours, they fine-tune prompts.

While this might seem harmless, it risks deepening existing gender imbalances. Women are still being asked to do the work of relational care, both for themselves and often for their male partners. Men, meanwhile, are being offered tools to simulate self-awareness without cultivating real emotional skills, growth or accountability. AI cannot replace what patriarchy has long denied men – the space, skills and support to become emotionally literate – but it can allow them to pretend.

Many women using ChatGPT for relationship analysis are deeply self-aware. On TikTok, they narrate the absurdity of the prompts they’ve given. “Pretend to be my emotionally unavailable situationship and explain your behaviour in the style of a frat guy.” “Only tell me how to win him back. Don’t try to talk me out of it.” These are jokes. And they’re not. Because under the irony is pain. A recognition that emotional inconsistency has become so normalised in dating culture that it now requires its own translator. And in the absence of emotionally literate male partners, some women would genuinely rather talk to a bot. There’s something bleak about that, but also something understandable and something to learn from.

What these trends reflect isn’t gullibility or superficiality, but heterofatalism: the creeping belief that straight relationships are inherently disappointing, yet somehow unavoidable. Women know the script. They’ve read the memoirs. They’ve been in therapy. Still, they find themselves dating the same man with different shoes, wondering if this time will be different. In this context, AI feels oddly impartial. It doesn’t carry the baggage of friends who’ve been through your break-up three times and who remind you of your patterns. It doesn’t bristle with defensiveness like the man you’re trying to reach. It doesn’t flinch when you say, “I feel unloved.”

But nor does it love you back. This is the limit, and the lesson. The chatbot may echo your voice, but it cannot deepen it. It may reflect your longing, but it cannot heal it. It may speak with apparent insight, but it cannot grieve with you. Or celebrate with you. Or say: I’ve been there. I see you. Let’s get out of this together.

The answer isn’t to stop using AI. The answer is to stop assuming that your knowing needs to be filtered through a model to be valid. Women are not confused because they lack insight. They are confused because they have been told, over and over, that their insight is excessive, hysterical, irrational, or wrong. It isn’t.

Sometimes your friends are tired. Sometimes your therapist is booked. Sometimes you just need to get it out. But let ChatGPT be a tool, not a replacement. Let it help you find words, not override your feelings. Let it draft the message, but not edit your boundaries. Let it assist, but not decide. Most of all, let it remind you of what you already know. Because when you ask, “is he being weird?” you’re not looking for analysis. You’re looking for the courage to believe yourself. And that, no matter how well it is trained, is something only you – and maybe your best friend at brunch – can give you.
