April 24, 2026 · desktop · architecture · distribution · local-first

April 2026: Your Files. Your Machine. Your AI.

We shipped a signed, self-updating Windows desktop application that gives an AI model direct read/write access to your local filesystem — no cloud, no third-party data handling. This is a note about the distribution pipeline nobody talks about, and why local-first is the only bet worth making.

There's a version of AI that lives in the cloud, knows nothing about you, and forgets everything after the tab closes.

We built the other version.

What Shipped

This month we crossed a milestone that's harder than it sounds: a Windows desktop application that distributes itself, updates itself, and gives an AI model direct read/write access to your local filesystem — all without the user's data touching a third-party server.

JARVIS Desktop is a tabbed file workspace. Open a file. The AI sees it. Ask about it, edit it, convert it. The model isn't guessing at what might be in your folder — it actually has the path. It actually reads the bytes.

That's a different class of tool.

The Distribution Problem

Shipping a signed, self-updating Windows installer is one of those tasks that looks simple until you're three days deep in code-signing certificate chains and build pipelines.

Our pipeline today looks like this: code lives on HomeBase, our Linux build server. One command syncs the project to our Windows build node, compiles the installer, signs it with our certificate, pulls the binary back to HomeBase, drops it in the update server, and bumps the version manifest. The download page reads the manifest. The installed app checks the manifest every four hours.
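That one-command flow can be sketched as a small orchestration script. Everything specific here is an assumption: the host name `winbuild`, the paths, and the tool choices (`rsync`, `signtool`) stand in for whatever the real pipeline uses.

```python
# Hypothetical sketch of the one-command release pipeline. Host names,
# paths, and tools are illustrative assumptions, not the actual JARVIS
# build configuration.

def release_steps(version: str) -> list[list[str]]:
    """Return the ordered commands a release run would execute."""
    win = "winbuild"        # assumed Windows build node
    updates = "/srv/updates"  # assumed update-server directory on HomeBase
    return [
        # 1. Sync the project tree from HomeBase to the Windows build node.
        ["rsync", "-az", "--delete", "./", f"{win}:/build/jarvis/"],
        # 2. Compile the installer on the Windows node.
        ["ssh", win, "C:/build/jarvis/build_installer.bat"],
        # 3. Sign the installer with the code-signing certificate.
        ["ssh", win, "signtool", "sign", "/fd", "SHA256",
         "C:/build/jarvis/dist/JarvisSetup.exe"],
        # 4. Pull the signed binary back to HomeBase's update server.
        ["scp", f"{win}:/build/jarvis/dist/JarvisSetup.exe",
         f"{updates}/JarvisSetup-{version}.exe"],
        # 5. Bump the version manifest the download page and app both read.
        ["python", "bump_manifest.py", version],
    ]

for step in release_steps("1.4.0"):
    print(" ".join(step))
```

The point of the shape, regardless of the exact tools: one entry point, a fixed ordered sequence, and the manifest bump last, so a half-finished run never advertises a binary that isn't on the server yet.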

When we ship a fix, users get it without thinking about it.
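The client side of that loop is a small check. Here's a minimal sketch, assuming the manifest is a JSON document with a `version` field; the field name, interval constant, and dotted-version scheme are illustrative, not the app's actual schema.

```python
# Sketch of the periodic update check. The manifest shape ({"version": ...})
# and the four-hour interval are assumptions for illustration.
import json

CHECK_INTERVAL_S = 4 * 60 * 60  # check the manifest every four hours

def parse_version(v: str) -> tuple[int, ...]:
    """Turn '1.10.2' into (1, 10, 2) so comparisons are numeric, not lexical."""
    return tuple(int(part) for part in v.split("."))

def update_available(installed: str, manifest_json: str) -> bool:
    """True if the manifest advertises a newer version than what's installed."""
    manifest = json.loads(manifest_json)
    return parse_version(manifest["version"]) > parse_version(installed)

print(update_available("1.3.2", '{"version": "1.4.0"}'))  # → True
```

Comparing tuples of ints rather than raw strings matters: lexically, "1.10.0" sorts before "1.9.9", which would silently stop updates at the first two-digit component.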

That's the infrastructure work nobody talks about. Everyone demos the AI part. Almost nobody ships the boring delivery layer that makes a product actually function for people over time.

Why Local-First Matters

Every cloud-based AI tool makes the same implicit promise: we'll hold your data, we'll keep it safe, trust us.

We made the opposite bet.

When JARVIS reads your files, it reads them off your disk. When it writes something, it writes to your drive. The AI has a memory system — a profile, a journal, a registry of people and projects you've mentioned — and that lives in a folder you own. You can open it in any text editor. You can move it to Dropbox and it follows you. You can delete it and start over.
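That folder-of-plain-files property can be shown in a few lines. The file names here (`profile.json`, `journal.md`) are assumptions about the layout, not the actual JARVIS memory format; the point is that memory is ordinary files any tool can read.

```python
# Sketch of a user-owned memory folder: plain files, no database, no cloud.
# The layout (profile.json, journal.md) is a hypothetical example.
import json
import pathlib
import tempfile

def write_memory(root: pathlib.Path) -> None:
    """Write the memory store as plain files inside a folder the user owns."""
    root.mkdir(parents=True, exist_ok=True)
    (root / "profile.json").write_text(json.dumps({"name": "Ada"}))
    (root / "journal.md").write_text("## 2026-04-24\nShipped the installer.\n")

root = pathlib.Path(tempfile.mkdtemp()) / "jarvis-memory"
write_memory(root)
print(sorted(p.name for p in root.iterdir()))  # → ['journal.md', 'profile.json']
```

Because the store is just files, "move it to Dropbox" and "delete it and start over" fall out for free: syncing, backing up, and wiping are filesystem operations the user already knows.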

The AI gets smarter about you over time. You keep total ownership of everything it learns.

That's the property we refused to give up.

What the Interface Actually Is

The workspace has a file tree, a tabbed editor, and a chat panel. Those three things sound simple. But the combination is the point.

You're writing code — Python, JavaScript, HTML. The AI sees the file. You ask "what does this function do" — it reads it and tells you. You say "refactor the error handling" — it edits the file directly, then switches your tab to the result. For HTML specifically, you can toggle between source and a live rendered preview in the same tab. You never leave the window.

You're working on documents. Drop a PDF in. The AI reads it page by page. Open a Word doc, an Excel sheet, a plain text file — same experience. Ask for a summary, a comparison, a rewrite in plain English. It writes a new file. One window.

You're looking at an image. The AI sees it too. Ask what's in it, describe the composition, get a critique, use it as reference. The conversation happens in context of what you're both looking at.

You're watching a video. Pause it. The AI sees the freeze frame. "What's happening here?" — and it actually knows, because it's looking at the same frame you are.

The vision is that this becomes the surface where you work and where you control the intelligence that helps you work, with the same UI for both. We're not fully there yet, but the core is live and the architecture supports it.

The Honest State of It

This is internal distribution, not a public launch. The installer is signed with our certificate. Users who know us install it knowing what it is.

The auto-update pipeline works. The local file access works. The memory system is live. There's real work left on the edges — richer file type support, tighter AI tooling for specific workflows.

We build in production. The app that exists today is real software with real users, not a Figma prototype with a waitlist.


If you want to see what local-first AI infrastructure actually looks like — not the pitch, the working system — reach out.