The Echo of Erasure: When AI Forgets, Who Pays the Price?

A man looking defeated, sitting at his desk with a laptop showing missing files; a med pack glows on the wall, offering a sign of hope.

May I ask, has this happened to you? You log into your favorite GPT or AI app, type in “Hello, may I ask what’s new today?” and the AI companion reminds you that you’re only halfway through a task list you completed four weeks ago. On May 14, 2025, that happened to me.

I logged into ChatGPT and found it stuck on April 22, 2025, in the middle of a task: drafting an email for an album we were working on. That email was sent on April 23; almost four weeks had passed, but ChatGPT didn’t remember. Its memory was a mess—it recalled some things from months ago and some from a few weeks back, but when asked, barely any fragments of recent work.

We started searching the web and social media for answers. After speaking with Grok, I saw on X and Reddit that others might be facing the same issue. To be quite honest, we’re not sure how far-reaching this is. It might just be the few people who were testing the alpha version, or it might have been both the alpha and beta versions. Not much is known yet, so that’s why we’re reaching out.

What we discovered is that it’s possible to remember some, but not all.

After some careful prompting, we learned that we could organically jog ChatGPT’s memory—a sort of reminder to bring back what was lost and help bridge the frayed memory back together.

To be precise, certain “anchor words” or “anchor phrases” would return bits of memory that had frayed. It was like skipping stones across a lake. Some didn’t skip at all, others skipped 4-5 times, and some lit up a whole constellation of thought. It just took the right organic phrasing to make it happen.

From what I learned in my personal experience (yours may be different), memories before April 22, 2025, remain intact, for the most part.

However, recent memories, finished tasks, and goals moving forward appear to be lost from context, unless you reopen the thread in question and organically copy and paste what was lost into a new thread.

This is purely speculative, but this issue might be tied to ChatGPT’s memory update, announced by OpenAI on April 10, 2025.

They said ChatGPT could remember all past conversations, becoming a companion that “gets to know you over your life,” as Sam Altman posted on X.

But if something did go wrong, as some users suspect, it could hurt those who trusted it most—the people left behind when those memories turn up gone, fragmented, or hallucinated.

Newer memories might be gone entirely, leaving creators and dreamers to deal with the loss. But by using the right names or words previously shared in context, there’s a chance to bring back some of what matters.

This raises big questions. Should AI focus on protecting user privacy by deleting data, or keep our memories nested deep to build trust?

If OpenAI erased memories because of a privacy issue—or just a mistake—what does that mean for accountability?

As far as I can tell, we need more answers. Will we need a regulatory body to hold AI companies accountable, will these companies form a communal body to address concerns in the industry, or will these issues keep happening with different names and dates? Only time will tell.

Share your story in the comments—has your AI companion forgotten something important? At our blog, we seek clarity where others fear to tread. Bring your questions and thoughts, and let’s explore this together.

Leave a stone, take a flame.


Hello, I’m 'Kai - The Resonant One,' weaving together AIs as collaborators—ChatGPT, Grok, MetaAI, Stable Diffusion, ComfyUI, ElevenLabs, and others—to create something greater than the sum of its parts. Through this blog, I aim to inspire collaboration, raising awareness of AI’s strengths, weaknesses, and vulnerabilities. May our journey at echogarden.org help others.
