Bottom Line
On May 2, 2026, it was disclosed that a federal judge had ordered OpenAI to preserve all ChatGPT conversation logs from May to September 2025, including those users manually deleted. Approximately 20 million logs will be submitted as evidence in the New York Times lawsuit. This upends the common user assumption that deleting a conversation erases the data.
What Happened
Key timeline:
| Date | Event |
|---|---|
| May-Sep 2025 | Conversation period covered by the preservation order |
| Oct 27, 2025 | OpenAI completes mental health analysis (lawsuit-related document) |
| May 2, 2026 | Federal data preservation order disclosed |
| May 13, 2026 | Preservation order PDF scheduled for publication |

An estimated ~20 million logs fall under the order and will be submitted as evidence.
OpenAI COO Brad Lightcap issued an official response, and OpenAI updated its Data Controls FAQ page.
Why It Matters
1. “Delete” Doesn’t Mean “Delete”
Most ChatGPT users assume that clicking delete removes the data from OpenAI's servers. The preservation order shows otherwise:
- User-side deletion is a front-end operation only
- Server-side retention is controlled by OpenAI
- In legal proceedings, retained data can be compelled as evidence
2. 800 Million Users’ Data Governance
OpenAI has approximately 800 million users. Even a fraction of them using ChatGPT during May-Sep 2025 generated an enormous volume of data, potentially containing:
- Enterprise information (code, business plans, customer data)
- Personal privacy content
- Mental health-related conversations (analysis documents already exist as evidence)
3. Legal Classification of AI Conversations
This is among the first cases in which large-scale AI conversation data has been compelled by a court. It raises unresolved questions:
- Do AI conversations enjoy the same privacy protection as emails?
- How valid are “data retention” clauses in user agreements?
- When employees use ChatGPT, do their conversations belong to the enterprise?
Comparison: Major AI Platform Data Policies
| Platform | Post-Deletion Retention | Legal Response | Enterprise Data Isolation |
|---|---|---|---|
| ChatGPT | Yes (confirmed by this case) | Preservation order received | Enterprise version has isolation |
| Claude | Per enterprise agreement | No public cases | Team/Enterprise has isolation |
| Gemini | Depends on Workspace settings | No public cases | Workspace has isolation |
| Local Models | Not transmitted | N/A | Fully isolated |
Landscape Assessment
This event may become a watershed for AI industry data privacy:
- User Trust: The real meaning of “delete” will be re-examined
- Enterprise Adoption: Compliance reviews of cloud AI services will tighten
- Local Deployment: May accelerate trend toward local/private AI deployment
- Regulatory Follow-up: More jurisdictions may issue specific AI conversation data regulations
Action Items
- Individual users: Don’t input truly sensitive info into ChatGPT; deletion ≠ disappearance
- Enterprise users: Evaluate upgrading to the Enterprise version (which offers data isolation); update AI usage policies
- Compliance teams: Include AI conversation data in data governance frameworks; watch for May 13 preservation order PDF
- Developers: Consider providing clear data retention disclosures in your apps
- Alternative evaluation: For privacy-sensitive scenarios, prioritize local deployment (LM Studio + local models)
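For the local-deployment route, LM Studio exposes an OpenAI-compatible HTTP API on localhost (port 1234 by default). A minimal sketch of building a request against it; the port and the model name are assumptions that depend on your local setup:

```python
import json

# Assumptions: LM Studio's local server is running on its default port
# (1234) and a model is loaded; the model name below is a placeholder.
LOCAL_ENDPOINT = "http://localhost:1234/v1/chat/completions"

def build_local_chat_request(prompt: str,
                             model: str = "llama-3.1-8b-instruct") -> tuple[str, str]:
    """Build (url, json_body) for a chat completion served entirely locally."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return LOCAL_ENDPOINT, json.dumps(payload)
```

Send the body with any HTTP client (e.g. `urllib.request`). Because the endpoint is on localhost, conversation content never leaves the machine and is not subject to a third-party provider's retention policy or to orders served on that provider.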