ChatGPT says you should dump him
Straight women are turning to AI to unpack text message threads, make sense of 'situationships,' and validate their romantic misery. 'my chatbot dont fw him,' as one woman puts it.
This week, I came across a tweet from a woman startup founder reading, “My friend just told me the tiktok girlies are using chatgpt to decode their situationships the same way they used to pull tarot cards.” The first person to build an app for this, she observed, “will be a billionaire.”
There followed a thriving comment thread full of tech bros, and non-bros, opining that “tiktok girlies” are the “perfect” demographic to target with such an app—one that superficially wraps a customized interface over an existing core technology—because “they ain’t the techiest crowd.”
This sent me spiraling about the ghoulishness of fantasizing about becoming a billionaire off of women’s romantic miseries—and then it sent me spiraling down the TikTok rabbit hole.
It’s true. Young women are using ChatGPT to parse their dating lives. Then they are posting about it on TikTok and getting a resounding chorus of responses from other women who are doing the same.
More specifically, though, these women are using ChatGPT to try to make sense of men’s romantic behavior, ranging from the inscrutable to the outright shitty. They talk ChatGPT through the ins and outs of a situationship—what a guy said or how long it took him to text back. They are uploading reams of text message conversations for analysis.
“I used to copy all my messages on my laptop and then put it in a pdf file and then send it to chatgpt,” wrote one woman in a comment. “suuuuper helpful in analyzing things.” Another woman wrote, “I copy/pasted some texts and ChatGPT said it wasn’t a match because I was emotionally mature and he just wasn’t.”
Once ChatGPT is supplied with the evidence of their interactions, it is asked any number of questions designed to get at the truth of the situationship. Does he like me? How much does he like me? Is this going anywhere? Are there any red flags? Is he being weird or am I being weird? Is he being avoidant? What is his attachment style?
What is immediately obvious from these TikToks is that women often know that they are dealing with some relational bullshit, but they start to question their own sanity. Confronted with inconsistent and/or nonsensical behavior, they turn the critique on themselves. Maybe it’s me. Maybe I’m imagining things.
Or else they work very hard to excuse bad behavior. Maybe he really did come down with a surprise illness two minutes before our date.
But then ChatGPT swoops in with validation: You’re not crazy, this guy sucks, he’s treating you like crap and you deserve better. “even chat gpt was confused man,” wrote one woman. A sampling of similar TikTok comments:
My chatgpt HATESSS HIM.
my chatbot dont fw him
Mine is so invested. She hates him more than me 😂
ChatGPT said that he gaslighted me
ChatGPT told me he was a narcissist
ChatGPT literally told me that he is a red flag
Mine literally insults him and roasts me for allowing him to be rent free I my head LMAOO
For a lot of these women, AI serves as a sanity check while they date misogynistic men.
Sometimes, it’s used as a translation service. “At one point I told [ChatGPT], pretend to be a frat guy and explain why u would be acting how u would be acting in this situation,” one woman wrote.
It is pretty depressing witnessing a chorus of women celebrating AI as emotionally superior to the men they are dating. There is a frequent sentiment along the lines of: even a “robot” has more humanity than these dudes. “How does [AI] have no feelings but possess more emotional awareness,” wrote one woman in a TikTok comment. Sometimes it’s the opposite sentiment: “Robots can recognize one of their own,” wrote another commenter alongside an emoji of a broken heart.
ChatGPT even comes up with disparaging nicknames for the men in question.
Mine nicknamed my ex Jack Frost because he has a cold heart.
My chatgpt called him a roach 💀
mine named mine “the manipulator” 💀💀
AI gets it. Or, at least, it feels like it does. In that thread on X, a number of commenters noted that ChatGPT is biased toward friendly and affirming responses—it’s not necessarily delivering a reliable emotional insight. And, while there is a whole lot of buzz right now about AI’s potential therapeutic uses, I find myself skeptical and maybe even a bit paranoid about outsourcing our sense of sanity within our romantic lives to a man-made technology, especially given the bro-natalism of Silicon Valley.
That said, in many ways, ChatGPT seems to be just filling the role of a group text thread. Often, these TikTok videos and comments joke about AI saving their friendships with other women, because their friends are already burnt out on hearing about their situationships. Sometimes, ChatGPT comes off as a sort of friend-zoned guy friend: so wonderful and great, but clearly not a viable romantic partner.
In almost all cases, there is irony and irreverence. With a definite sense of self-mocking, one woman shared the absurd prompts she’s given to ChatGPT, including, “How do I get my ex talking stage back if I have him blocked and he ghosted me. im only looking for answers to this goal don’t try to talk me out of it.”
Yes, they are quite aware that they are asking AI for relationship advice. They know how it looks. “You’re cooked,” commenters will say, with love and a sense of solidarity. “i get you its bad out here,” wrote one commenter. There’s also no shortage of heterofatalism. For all the insults and absurdities in their dating lives, they do still desire these inscrutable and inconsistent human men.
One of the current popular critiques of AI’s environmental cost is that every time you do a query, it’s like pouring out a bottle of water. This feels like a good metaphor for women’s hetero relationship distress: they are pouring out their own life-force every time they query a situationship, whether it’s on ChatGPT or a text thread with friends.
I have poured out many bottles of self across my life. So many of those bottles resulted from a disjuncture between what I knew and what I was inclined to believe. I knew something was shitty or off, and yet I either clung to a compelling romantic fantasy or doubted my own knowing. In fairness, I also caused at least a few men to pour out their own bottles of self through my own shitty behavior.
Love and sex cause insecurity and uncertainty for all of us. Some of these videos simply reveal the prosaic reality of heartbreak. But this trend of divination via ChatGPT also highlights a distinctly gendered phenomenon in hetero dating: women who feel stuck dating men who are maddening and inconsiderate, and who fail to meet basic standards of communication and respect.
The forces behind these frustrations—and women’s sense of stuckness within them—are complicated. I could point to everything from gendered socialization to patriarchal narratives of romance to the sexual double-standard to compulsory heterosexuality, and so on.
Fundamentally, these dynamics are not new, as evidenced by the trend of women asking AI to turn their particular relationship troubles into a column in the style of Carrie Bradshaw. Granted, it’s easy to point to newer phenomena, like swipe-based dating, that drastically exacerbate the dynamics behind those old “I couldn’t help but wonder…” columns. But the fundamentals are the same, as is the impulse to individually troubleshoot systemic issues, as opposed to “collectively changing the conditions of straight culture,” as Asa Seresin wrote.
In some cases, AI seems to help women leave problematic situationships. As one woman wrote, “mine is begging me to stop messaging him lmao.” You might even say that these AI interactions can offer a counter to heterofatalism by saying: Hey, that’s messed up. He shouldn’t be treating you that way. You deserve better.
In other cases, women joke about their unwillingness to listen to ChatGPT’s advice to get out of a bad situationship. Their use of AI becomes a symbol—shared with strangers on the internet—of the depths of desperation to which their own heterosexuality has driven them. Depths that they continue to dwell in. AI both validates and is proof of their heterofatalism.
Sometimes, these videos seem to underscore these common hetero miseries as natural, inevitable, and inescapable. They express a lost sense of hope and possibility that things could be otherwise.
And that seems both entirely understandable and far more dystopian than simply talking to AI about your relationship.