I Built an AI PM OS
TL;DR
I stopped thinking about AI as a set of isolated tricks and started treating it like an operating system for my work. The AI PM OS is the result: a Chief of Staff agent that plans the day, durable workstreams that survive restarts, and a cmux-based workspace that opens the right threads in parallel.
Context
For months, I had great AI skills for individual tasks: morning triage, to-dos, PRD writing, research, data pulls, code review. Each one worked. The problem was the seams between them.
Every new session started with re-explaining context. Every interrupted thread risked getting lost. Every complicated day required me to decide, from scratch, what to open, what to ignore, and what needed follow-up.
That made the agent helpful, but not truly ambient. It still behaved like a smart tool I had to manage. So I started building something bigger.
From skills to an operating system
The key shift was simple: stop treating each conversation as the unit of work. Instead, make the unit of work a folder.
Each meaningful thread gets its own workstreams/<slug>/ directory with a living CONTEXT.md file. That file captures the objective, what is already done, what is blocked, and the next concrete step. Now the context lives on disk, not in chat history.
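For concreteness, here is what one of those files might look like. The slug and every detail below are invented for illustration; the post only specifies the four sections it should capture:

```markdown
# CONTEXT.md — workstreams/pricing-page-revamp/ (hypothetical example)

## Objective
Ship the revamped pricing page copy by end of quarter.

## Done
- Drafted positioning doc and circulated for review
- Aligned with design on the three-tier layout

## Blocked
- Waiting on legal sign-off for the comparison table

## Next step
Fold design feedback into the draft and send v2 to legal.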
The core pieces
1. A Chief of Staff agent
The day starts with a single command typed into Amp:
$ start-day
That launcher opens a dedicated workspace where a Chief of Staff agent reviews notes, routines, and workstreams, then writes a plan for what deserves focus today. Its job is not to do everything. Its job is to decide what should be opened.
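The planning pass could be sketched roughly like this: walk the workstream folders, read each one's priority and recorded next step, and sort. The parsing details here (a `priority:` line in config.yaml, a `Next step:` line in CONTEXT.md) are assumptions, not the actual internal format:

```python
# Minimal sketch of the Chief of Staff planning pass. Assumes each
# workstreams/<slug>/ folder holds CONTEXT.md and a config.yaml with
# a "priority: N" line (lower = more urgent).
from pathlib import Path

def read_priority(config: Path) -> int:
    # Naive line scan to avoid a YAML dependency; illustrative only.
    for line in config.read_text().splitlines():
        if line.startswith("priority:"):
            return int(line.split(":", 1)[1])
    return 99  # unprioritized streams sort last

def next_step(context: Path) -> str:
    # Assumes CONTEXT.md records a "Next step:" line.
    for line in context.read_text().splitlines():
        if line.lower().startswith("next step:"):
            return line.split(":", 1)[1].strip()
    return "(no next step recorded)"

def plan_day(root: Path) -> list[tuple[str, str]]:
    """Return (slug, next step) pairs, most urgent first."""
    streams = []
    for d in sorted((root / "workstreams").iterdir()):
        if (d / "CONTEXT.md").exists():
            streams.append(
                (read_priority(d / "config.yaml"), d.name, next_step(d / "CONTEXT.md"))
            )
    streams.sort()
    return [(name, step) for _, name, step in streams]
```

The point of the sketch is the shape of the decision, not the parsing: the planner reads state from disk and emits an ordering, rather than doing any of the work itself.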
2. Durable workstreams
Each workstream has CONTEXT.md for narrative state and config.yaml for priority and startup guidance. If cmux crashes or I ignore a thread for three days, the next session can resume from the file instead of depending on memory or scrollback.
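A plausible config.yaml, with hypothetical keys; the post names only priority and startup guidance as its contents:

```yaml
# config.yaml — illustrative, not the actual internal schema
priority: 2            # lower number = open earlier in the day
startup: |
  Read CONTEXT.md, summarize where things stand,
  then propose the single next concrete step.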
3. Parallel workspaces
Once the plan exists, the launcher opens the right sessions in parallel with cmux. One workspace might be a focused project. Another might be meeting prep. Another might be comms triage. Each has a different prompt, context, and role, but they all share the same file-based continuity model.
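The post doesn't show cmux's actual CLI, so this sketch stops one step short of launching: it only builds the per-workspace specs, each with its own role prompt but the same file-based continuity. The prompt filenames and roles are assumptions:

```python
# Build launch specs for parallel workspaces. Each workspace gets a
# role-specific prompt (hypothetical paths) but the same CONTEXT.md-based
# continuity model. Actually invoking cmux is left out deliberately.
from pathlib import Path

ROLE_PROMPTS = {
    # Hypothetical role-to-prompt mapping.
    "project": "prompts/workstream.md",
    "meeting-prep": "prompts/meeting-prep.md",
    "comms": "prompts/comms-triage.md",
}

def workspace_specs(root: Path, plan: list[tuple[str, str]]) -> list[dict]:
    """Turn (slug, role) pairs from the day plan into launch specs."""
    specs = []
    for slug, role in plan:
        d = root / "workstreams" / slug
        specs.append({
            "name": slug,
            "cwd": str(d),
            "prompt": ROLE_PROMPTS.get(role, "prompts/workstream.md"),
            "context": str(d / "CONTEXT.md"),  # continuity lives here, not in chat
        })
    return specs
```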
4. Routines, not just projects
Some work never ends: to-do review, comms triage, meeting prep, recurring maintenance. Those are not projects, so the system treats them as routines/ rather than workstreams/. That lets the operating system handle finite deliverables and ongoing loops without forcing them into the same shape.
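One plausible layout for the split (the routine names and nesting are illustrative guesses, not the repo's exact tree):

```text
ai-pm-os/
├── routines/              # ongoing loops with no "done" state
│   ├── comms-triage/
│   │   └── CONTEXT.md
│   └── todo-review/
│       └── CONTEXT.md
└── workstreams/           # finite deliverables with a definition of done
    └── <slug>/
        ├── CONTEXT.md
        └── config.yaml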
Why the file boundary matters
The most important design choice in the AI PM OS is not the model, or cmux, or even the Chief of Staff prompt. It is the decision to put continuity in files.
Chat history is fragile. Sessions end. Context windows fill up. Threads drift. But a short CONTEXT.md that gets refreshed as work moves forward is stable, inspectable, and easy to edit.
The public version
I recently sanitized and open-sourced a starter version of this system as ai-pm-os.
The public repo keeps the same core ideas, but simplifies the machinery. Instead of the full internal command-center event pipeline, it uses a lighter launch-plan model built around system/today-plan.json, with a derived state layer that mirrors the recommended queue into system/state/queue.json and system/state/now.json.
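The derived-state step could look something like the following. The exact schema of system/today-plan.json isn't documented here, so the `"queue"` key and the idea of surfacing the head item as "now" are assumptions:

```python
# Sketch of the lighter launch-plan model: read the recommended queue
# from today-plan.json and mirror it into the derived state files.
import json
from pathlib import Path

def derive_state(system: Path) -> None:
    """Refresh system/state/ from system/today-plan.json (assumed schema)."""
    plan = json.loads((system / "today-plan.json").read_text())
    state = system / "state"
    state.mkdir(exist_ok=True)
    queue = plan.get("queue", [])
    # queue.json mirrors the full recommended queue...
    (state / "queue.json").write_text(json.dumps({"queue": queue}, indent=2))
    # ...while now.json holds only the head item, or null when the queue is empty.
    (state / "now.json").write_text(
        json.dumps({"now": queue[0] if queue else None}, indent=2)
    )
```

Because the state layer is derived, it can always be regenerated from the plan file, which keeps the continuity story the same as the internal version: files on disk are the source of truth.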
What I learned
The operating model matters more than the prompt. Better prompts help, but most of the leverage came from structuring the environment so the agent had a place to return to.
Separation of roles makes the system calmer. A planning workspace should plan. A workstream workspace should execute. A routine workspace should loop. Mixing them makes everything noisier.
Small files beat clever memory tricks. A concise CONTEXT.md is one of the highest-leverage AI UX improvements I've found.
The system became useful when it survived interruption. That's the real bar. Not "did the demo work," but "does this still make sense after a bad week?"
Related: How the AI PM OS Spins Up My Entire Workday and Why the AI PM OS Feels More Powerful Than a Chatbot.