The most valuable thing an AI will ever know is you. The question of who owns that knowledge is the defining property rights question of the AI age.
This paper argues that personal context — the accumulated decisions, relationships, domain knowledge, and temporal state that make AI useful to you specifically — is property. Not platform data. Not training material. Not a feature of your subscription. Property you own, control, export, inherit, and delete.
We survey what platforms built (and credit them for it), examine the enshittification cycle now beginning in AI memory, trace the bundling/unbundling pattern through four historical parallels, and argue that the technology for sovereign context already exists. It is called files.
Companion papers define the category (Personal Context Management) and document the engineering (Context as Code). This paper makes the ownership argument.
Here's something worth noticing: the most valuable AI feature isn't intelligence. It's memory.
An AI that remembers your project is categorically better than one that doesn't. An AI that knows your preferences saves you fifteen minutes of re-explanation per session. An AI that understands your domain — the decisions you've made, the people involved, the history of how you got here — can continue your work instead of starting from scratch.
Memory is what turns a chat window into an assistant. And right now, you don't own yours.
Your ChatGPT memories live on OpenAI's servers, in a format you can't export, structured in a way you can't control, subject to terms you didn't negotiate. In February 2025, hundreds of users lost their accumulated memories overnight in a catastrophic wipe — years of curation, gone. OpenAI's response took ten to twelve days. There is no native API for exporting your memories.
Your Claude context is synthesized into a server-side summary updated every 24 hours. Your Gemini context is inseparable from your Google account. Your Copilot context is encrypted to a single device's TPM chip.
This paper is about a simple proposition: that personal context should be property, with all the rights that word implies. Exportable. Inheritable. Deletable with certainty. Yours.
Before making the property argument, an honest acknowledgment.
Platforms built something real. They proved a concept that deserves credit.
ChatGPT Memory proved that persistent AI context is transformatively valuable. Before February 2024, every AI conversation started from zero. You re-explained your job, your preferences, your projects, every time. Memory proved that even crude persistence — a flat list of key-value pairs — creates enormous user value. Hundreds of millions of people experienced, for the first time, an AI that knows them. That's a genuine achievement.
Google's Personal Intelligence proved that cross-application reasoning is powerful. When Gemini can see your email, calendar, photos, and search history simultaneously, it makes connections you'd never make manually. The insight — that context spans applications — is correct and important.
Obsidian proved that people will invest enormous effort in organizing their knowledge. 1.5 million users, 22% year-over-year growth, a thriving community of people who genuinely believe that structured knowledge changes their lives. And Obsidian proved something else that matters: local-first, plain-file knowledge management works. People prefer owning their data.
Notion proved that structured, relational data is valuable for individuals, not just enterprises. The idea that a person would build a relational database of their own knowledge — and find it useful — seemed unlikely. Notion proved it.
These are real contributions. The people who built them deserve recognition. The question is not whether AI context is valuable — that's settled. The question is whether the next decade of AI context will be owned by the platforms that proved the concept, or by the people whose lives generate the context.
History has a clear answer to this question. But first, the pattern.
Cory Doctorow described a cycle in 2023 that is now beginning in AI memory.
"Here is how platforms die: First, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die."
The mechanism that enables each phase transition is lock-in — the ability to raise switching costs so users can't leave even as the experience degrades.
ChatGPT Memory is in Phase 1. The memory is free. It's useful. It feels generous. Users accumulate context — preferences, project history, relationship notes, domain knowledge — and each memory stored increases the cost of switching. This is the attraction phase. OpenAI is spending its investor capital to acquire your lock-in.
Phase 2 is visible in the pricing tiers. Plus subscribers already get 25% more memory capacity. Enterprise customers get API access. The feature that was free becomes the feature you pay to keep. Your accumulated context becomes the subscription's moat.
Phase 3 is predictable because it is always predictable. The platform's understanding of you — built from your conversations, your decisions, your preferences — becomes an asset the platform monetizes. Training data. Behavioral targeting. Insight resale. The memory that was supposed to serve you begins to serve the platform.
The antidote, as Doctorow has argued consistently, is interoperability and data portability. If you can leave, the platform has to stay good. If you can't, the platform doesn't have to stay anything.
Jim Barksdale, CEO of Netscape, reportedly said during the company's IPO road show: "Gentlemen, there's only two ways I know of to make money: bundling and unbundling."
The observation is that industries oscillate. A company bundles many services together. Customers pay for the bundle because it's convenient. The bundler captures value through convenience premiums and switching costs. Then technology makes it possible to access individual components separately — often better and cheaper. Startups pick off the most valuable parts of the bundle. The bundle collapses. Eventually, a new bundler emerges, and the cycle repeats.
Platforms are currently bundling your context. Your ChatGPT conversations, your Google search history, your Notion workspace, your calendar, your email — each platform holds a piece of your context, and none has the whole picture. Google's Personal Intelligence is the ultimate bundling move: aggregate context across all Google services into a single intelligence layer. The more services you use, the smarter Gemini gets, the harder it is to leave.
Every unbundling happens when two conditions are met: technology makes the bundle unnecessary, and the bundle's extraction becomes intolerable. Both conditions are now met for personal context. Local AI and edge computing mean you don't need a cloud platform for persistent context. And platforms are visibly beginning to use your context against you — premium-tier gatekeeping of your own history, auto-enrollment without consent, catastrophic data losses with weeks-long response times.
The historical parallels are precise:
AT&T bundled the phone, the line, the switching, and the long-distance service. You rented your phone from them. The Carterfone decision of 1968 — you can attach any device to the network — and the 1984 breakup unbundled the stack. Innovation exploded. The PCM argument is the Carterfone argument: you should be able to attach any context to any AI.
Cable bundled 200 channels to sell you the 12 you wanted. Streaming unbundled it. Then streaming re-bundled (Disney+/Hulu/ESPN). The cycle continues. Currently, platforms bundle all your context to sell you the AI features you want. PCM lets you selectively share context with any AI.
Banks bundled checking, savings, loans, investments, and payments into one institution. Switching was nearly impossible. Then Stripe unbundled payments, Robinhood unbundled investing, Venmo unbundled transfers, and Plaid unbundled data access. The plumbing for financial unbundling was APIs and open data standards. The plumbing for context unbundling is plain files and open formats.
We are at the unbundling moment for personal context.
Ray Dalio describes economic cycles driven by debt accumulation — short-term cycles of expansion and contraction, and long-term cycles where debt builds across multiple short-term cycles until it becomes unsustainable, forcing a painful deleveraging.
Personal context follows the same pattern.
The short-term context cycle: you adopt a platform. You build context. The platform leverages your context to increase switching costs. You consider leaving. The switching cost is too high. You stay. The cycle repeats with deeper lock-in each time.
The long-term context cycle: over ten to twenty years, you accumulate context across many platforms. Gmail has your email context. Google has your search context. ChatGPT has your conversation context. Notion has your project context. Your calendar has your time context. LinkedIn has your professional context. Each platform has a piece. None has the whole picture. Your context is fragmented, siloed, and increasingly leveraged against you.
The deleveraging moment is when you consolidate your context into a system you own. That's the PCM moment. It is painful in the same way Dalio's deleveragings are painful — you leave behind accumulated context, you rebuild in a new system, you accept short-term loss for long-term sovereignty. But the alternative is continued debt accumulation until the system becomes unsustainable.
A simple framework for determining whether your context is property or leverage:
Test 1: Can you export it? If you cannot receive your personal context in a structured, machine-readable format and transmit it to another system, you do not own it. You are renting access to your own history.
Test 2: Can you inherit it? If your context dies with your subscription — if your family cannot receive your accumulated understanding after you're gone — it is not property. Property survives its owner. Platform features do not.
Test 3: Can you delete it with certainty? If you cannot remove specific context and be confident it is gone — not retained in backups, not embedded in model weights, not preserved in aggregate analytics — you are not in control. The "right to be forgotten" is meaningful only if forgetting is technically achievable.
Personal context that fails all three tests is not property. It is leverage — held by the platform, used for the platform's benefit, inaccessible to you when the relationship ends.
How do current platforms score?
| Platform | Export | Inherit | Delete with certainty |
|---|---|---|---|
| ChatGPT Memory | No native API | No provision | Uncertain (training data) |
| Claude Web | Export supported | No provision | Reset only (all or nothing) |
| Gemini | Zero portability | No provision | Disconnect services only |
| Copilot Recall | TPM-bound to device | Not transferable | Local deletion possible |
| Obsidian vault | Fully portable (files) | Files in estate | Delete the file |
| PCM (files on disk) | Inherent | Files in estate | Delete the file |
Plain files pass all three tests by default. They are exportable because they are files. They are inheritable because they are estate assets. They are deletable because deletion means deletion.
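The three tests can be made concrete in a few lines. This is a minimal sketch, not a real tool: the context folder and its single `preferences.md` note are hypothetical examples, and each property (export, inheritance, deletion) reduces to an ordinary filesystem operation.

```python
import os
import shutil
import tempfile

# Hypothetical context folder with one Markdown note.
root = tempfile.mkdtemp()
ctx = os.path.join(root, "context")
os.makedirs(ctx)
with open(os.path.join(ctx, "preferences.md"), "w") as f:
    f.write("# Preferences\n- Direct communication\n")

# Test 1 (export): copying the folder is a complete, lossless export.
export = shutil.copytree(ctx, os.path.join(root, "export"))

# Test 2 (inherit): the export is an ordinary directory an executor
# can hand over like any other estate asset.
inherited = os.listdir(export)

# Test 3 (delete with certainty): removing the tree removes the context.
shutil.rmtree(ctx)
deleted = not os.path.exists(ctx)
print(inherited, deleted)
```

No API, no terms of service, no support ticket: each test is a single standard-library call.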
If personal context is property, several things follow:
Portability is a right, not a feature. You should be able to move your context between AI providers the way you move your phone number between carriers. GDPR's Article 20 — the right to data portability — already gestures at this, requiring data controllers to provide personal data in a "structured, commonly used and machine-readable format." The question is whether AI-generated inferences about you (the model's understanding of you, not just the facts you provided) qualify as your personal data. Under any reasonable reading, they should.
Inheritance is natural. When you die, your context — your decisions, your domain knowledge, your understanding of how the world works — should pass to your heirs as naturally as your library does. A folder of Markdown files is inheritable with zero legal friction. Your executor copies the folder. No lawyers, no platform negotiations, no ticking clock before the account is deleted.
Sovereignty is the default. Your context lives on your machine, synced how you choose, shared when you decide. No platform has access unless you grant it. No model provider trains on your context unless you opt in. This is not a radical position. It is the default state of files on a computer. The radical position is what platforms have normalized: that your cognitive history is their asset.
The most durable technologies are protocols, not platforms.
HTTP turned thirty-five this year. HTML turned thirty-three. Email — SMTP, IMAP — turned forty-four. RSS turned twenty-seven. These protocols are still the most reliable, most interoperable, most resilient ways to share information ever built.
The key property they share: no one owns them. Gmail can email Outlook can email Protonmail. Any browser can read any website. Any feed reader can subscribe to any RSS feed. The protocols are substrate. Applications are built on top. The substrate endures; the applications come and go.
The open web was built on plain, readable formats. HTML is human-readable. JSON is human-readable. RSS is XML, which is human-readable. The formats are simple, durable, and universally parseable.
Personal context should be built the same way. Markdown is human-readable, machine-parseable, supported by every editor on every platform. A Markdown file written in 2014 is still perfectly readable in 2026. Can you say the same about a Notion database? An Evernote export? A ChatGPT memory that was silently wiped?
The argument is simple: when people ask "what format should personal context be stored in?", the answer has been obvious for thirty years. The same formats the web runs on. The same formats every tool can read. The same formats that will still work in 2050. Plain files.
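The durability claim is easy to demonstrate: a Markdown context note can be read with nothing beyond the standard library of any language. The note below is an invented example (its title and field names are illustrative, not a prescribed schema), parsed in a few lines of plain Python.

```python
# A hypothetical personal-context note in plain Markdown.
note = """\
# Project: context-paper
- status: drafting
- owner: Ben
"""

# Extract the heading and key-value bullets with zero dependencies.
lines = note.splitlines()
title = next(l[2:] for l in lines if l.startswith("# "))
fields = dict(l[2:].split(": ", 1) for l in lines if l.startswith("- "))
print(title, fields)
```

Any tool that can read text can read this file, today and in 2050.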
In 2005, Linus Torvalds built Git because BitKeeper — a proprietary version control system — revoked the Linux kernel project's free license. The most important open-source project in history was held hostage by a proprietary tool's business decision.
Torvalds' response was not to negotiate with BitKeeper. It was to build a system where no platform could ever hold their work hostage again.
Git's architecture is the precedent for sovereign context: every clone is a complete, self-sufficient copy; history is immutable and content-addressed; all work happens locally and offline; remotes are optional conveniences, not required authorities.
The mapping to personal context:
| Git | PCM |
|---|---|
| Repository | Walnut (context unit) |
| Commit | Save (immutable snapshot) |
| Branch | Bundle (scoped workstream) |
| Clone | Install on new device |
| Remote | walnut.world (optional hub) |
| .gitignore | Privacy controls |
| Plain text diffs | Context changes are human-readable |
Git proved that distributed ownership of source code was not only possible but superior to centralized control. Twenty-one years later, it is the universal standard. The same architecture applies to personal context. Your context should be a repository you own, with optional remotes for sharing, full history, and the ability to work offline.
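The architectural claim can be sketched without Git itself. Below is a toy model of content-addressed, immutable snapshots; the `save` function and its in-memory store are invented for illustration and are not part of any real PCM tool.

```python
import hashlib
import json

def save(store, files, parent=None):
    # Each save is an immutable snapshot named by the hash of its
    # content, so history cannot be silently rewritten after the fact.
    snapshot = {"files": dict(files), "parent": parent}
    blob = json.dumps(snapshot, sort_keys=True).encode()
    digest = hashlib.sha256(blob).hexdigest()
    store[digest] = snapshot
    return digest

store = {}
first = save(store, {"preferences.md": "- Direct communication"})
second = save(store, {"preferences.md": "- Direct communication\n- Weekly review"},
              parent=first)

# Walking parent pointers recovers the full history, offline,
# with no platform in the loop.
history = []
node = second
while node is not None:
    history.append(node)
    node = store[node]["parent"]
```

The point of the sketch: once snapshots are named by their content, no central server is needed to guarantee their integrity — the same property Git gave source code.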
Tim Berners-Lee saw this a decade ago and proposed Solid — personal data pods that you control, with applications requesting access. His vision was right. The implementation was too complex for mainstream adoption. PCM achieves the same goals — sovereignty, interoperability, user control — with the simplest possible implementation: files on your disk.
This is the part that should feel almost anticlimactic. The technology for sovereign personal context is not futuristic. It is not complex. It is not expensive. It already exists, it is already proven, and it has been working for decades.
The most sophisticated context management system in existence — the one described in the companion paper "Context as Code" — is built from bash hooks and markdown files. 9,754 lines of configuration, zero trained weights, running daily for eight months. The substrate is files. Everything else is optional.
GDPR's Article 20 establishes a right to data portability: you can receive your personal data in a "structured, commonly used and machine-readable format" and transmit it to another controller. California's CCPA/CPRA adds rights to deletion and rights to know what data companies hold about you.
These laws were written for the database era. They assume data is discrete and locatable — a row in a table, a file in a folder. AI context is diffuse and emergent — embedded in model weights, conversation histories, and inference patterns. When ChatGPT generates an inference about you ("Ben prefers direct communication and is building a context management system"), is that inference your data (you generated the conversations it's based on) or OpenAI's data (they built the model that made the inference)?
Under any reasonable reading of privacy law's intent, inferences derived from your personal data are your personal data. But "reasonable reading" and "current enforcement" are different things.
The EU AI Act (2024) adds AI-specific regulations. India's Digital Personal Data Protection Act (2023) builds domestic data infrastructure to reduce platform dependence. The indigenous data sovereignty movement — the CARE Principles for Indigenous Data Governance — argues that communities should control data about them.
The direction is clear: more sovereignty, more portability, more individual control. The legal frameworks are converging toward the proposition that your data — including your AI context — is yours.
But here is the thing: if your context lives in files on your disk, the legal question is moot.
You don't need GDPR to export a Markdown file. You don't need CCPA to delete a folder. You don't need a data portability regulation to copy your files to a new machine. You don't need a lawyer to include a directory in your will.
The property rights argument is important for the platforms. They need laws to force them to treat your context as yours. But for individuals who adopt PCM, the legal fight is already won — because there is no platform to fight. Your context is files. Files are property. The end.
Balaji Srinivasan, building on Albert Hirschman's classic framework, draws a distinction between voice and exit. Voice is trying to change a system from within — petitioning, protesting, negotiating. Exit is leaving the system entirely and building an alternative.
Voice says: "OpenAI, please make ChatGPT memories exportable."
Exit says: "I'll keep my context in files. Any AI can read them."
Voice says: "Google, please don't auto-enroll me in Personal Intelligence."
Exit says: "My context isn't in Google. It's on my machine."
Voice says: "Platforms, please treat our data as our property."
Exit says: "It's already our property. It's a folder."
PCM is an exit technology. It does not ask platforms to be better. It makes platforms optional. When your context lives in files you own, the platform becomes a service provider — one you can replace — rather than a landlord holding your cognitive history hostage.
This changes the power dynamic entirely. A platform that knows you can leave has to compete on quality. A platform that knows you can't leave competes on lock-in. The history of technology is the history of reducing lock-in: number portability for phones, open banking for finance, data portability for personal information. Context portability is next.
This paper is not a manifesto. It is not a call to burn down platforms or boycott AI tools. Platforms will continue to build valuable AI features. Some people will prefer the convenience of platform-managed context. That's fine.
This paper is an invitation to consider a different default.
The default today: your context is scattered across platforms, in formats you can't control, subject to terms you didn't negotiate, at risk of wipes you can't prevent. You hope the platforms stay good. You have no plan for when they don't.
The alternative default: your context lives on your machine, in plain files, in a structure that any AI can read. You share what you choose. You keep what you keep. When you switch models, your context travels. When you switch tools, your context survives. When you stop paying a subscription, your context remains. When you die, your context passes to your family like any other possession.
The technology for this exists today. It is not complex. It is not expensive. It is not theoretical. It is files.
Xerox proved the GUI. Apple owned it. Netscape proved the browser. The open web won. Platforms proved AI context. Ownership is next.