Woman Files for Divorce After ChatGPT’s Coffee Grounds "Reveal" Affair

In Greece, a woman has reportedly filed for divorce after using ChatGPT to “read” her husband’s coffee cup and being told that he was having an affair. According to Greek City Times, the woman turned to the AI tool in a modern take on tasseography, an old practice of reading coffee grounds or tea leaves to predict someone’s future.
The couple, married for more than 12 years and parents of two children, were having their coffee one day when the woman decided to upload pictures of the leftover grounds in their cups to ChatGPT, hoping the AI could interpret the patterns the way a traditional fortune-teller would. What she got instead was a shocking claim.
The chatbot allegedly told her that her husband was "destined to be with a mysterious woman with the initial 'E'," and that her own cup revealed signs of "betrayal and a threat to her household." It even claimed that the man was having an affair with a younger woman who was trying to break up their family.
The husband later appeared on a local TV channel and dismissed the claims. He said his wife had a history of following internet trends, but this time it went too far. "I laughed it off as nonsense. But she took it seriously. She asked me to leave, told our kids we were getting divorced, and then I got a call from a lawyer. That’s when I realised this wasn’t just a phase," he said.
Just three days after receiving the AI’s message, the woman formally served him with divorce papers, refusing to discuss a mutual separation. The husband also shared that his wife had previously believed in other mystical ideas. "A few years ago, she went to an astrologer, and it took her nearly a year to finally admit it was all nonsense," he added.
His lawyer confirmed that the AI’s so-called coffee reading is not considered valid proof in court. "He is innocent until proven otherwise," the lawyer said.
Experienced tasseography readers have also weighed in, pointing out that a proper coffee cup reading involves examining not only the grounds but also the foam and the saucer, details that ChatGPT could not analyse.