bouzaiene.org

Why BinaryClerk

Cloud AI assistants are great until you notice where your data lives and what they can actually touch. Here's what I'm building instead.

BinaryClerk is a local-first desktop AI coworker for people who live in the browser. One app on your machine that chats with capable models, drives your real Chrome through a companion extension, and keeps projects, chats, and files on disk (SQLite) instead of in someone else's cloud.

That's the pitch. Here's the why.

The cloud assistant has a ceiling

The current generation of AI assistants is mostly a chat box bolted onto someone else's tab. You paste in what you're working on. The assistant guesses. You paste again. It's a remarkable demo and a frustrating workflow.

The two ceilings:

  1. They can't see what you're doing. Your tabs, your local files, your half-written notes — invisible. So you become the bridge, manually shuffling context between you and the model.
  2. They store everything. Chats, attachments, embeddings, sometimes more. Even when the privacy policy is clean, the architecture isn't: your work lives on infrastructure you don't control, and the assistant's "memory" of you is on someone else's disk.

You can paper over (1) with browser extensions and over (2) with privacy promises. But you can't really solve either inside a cloud-only product, because the product's whole shape assumes the cloud is where work happens.

What "local-first" actually means here

Local-first isn't "we made an Electron wrapper." It's a stance about where the source of truth lives.

In BinaryClerk:

  • Your conversations, memories, saved workflows, and usage data live in a local SQLite database on your machine.
  • The agent runs in a desktop app you installed. It calls models with your own API keys (BYOK), stored in the OS keychain.
  • A companion Chrome extension pairs with the desktop app over a localhost WebSocket. The agent sees and acts in your real browser — with read-only mode, URL blocklists, approvals, and per-tool policy.
  • The only thing that leaves your device is what you explicitly send to a model.

That's the whole shape. Not a sync layer with optional offline. Local first, network second.
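To make the permission layer concrete, here's a minimal sketch of what a per-tool policy check could look like. Everything here is illustrative — the type names (`ToolPolicy`, `ToolCall`, `evaluateTool`) and the rule ordering are my assumptions, not BinaryClerk's actual API:

```typescript
// Hypothetical per-tool policy gate for a browser agent.
// Rule order is an assumption: blocklist first, then read-only mode,
// then per-tool approval prompts.

type Decision = "allow" | "ask" | "deny";

interface ToolPolicy {
  readOnly: boolean;            // block any tool that mutates page state
  urlBlocklist: string[];       // hostnames the agent may never touch
  requireApproval: string[];    // tool names that always prompt the user
}

interface ToolCall {
  name: string;     // e.g. "readPage", "click"
  mutates: boolean; // does this tool change page state?
  url: string;      // the tab the tool would act on
}

function evaluateTool(policy: ToolPolicy, call: ToolCall): Decision {
  const host = new URL(call.url).hostname;
  if (policy.urlBlocklist.includes(host)) return "deny";
  if (policy.readOnly && call.mutates) return "deny";
  if (policy.requireApproval.includes(call.name)) return "ask";
  return "allow";
}

// In read-only mode the agent can read a page but not click it.
const policy: ToolPolicy = {
  readOnly: true,
  urlBlocklist: ["bank.example.com"],
  requireApproval: ["submitForm"],
};

console.log(evaluateTool(policy, { name: "readPage", mutates: false, url: "https://news.example.com" })); // allow
console.log(evaluateTool(policy, { name: "click", mutates: true, url: "https://news.example.com" }));     // deny
```

The point of putting this in the desktop app rather than the cloud: the policy, like everything else, is data on your disk that you can read and change.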

Where it goes

The interesting space isn't "ChatGPT but offline." It's the kind of agent you only get when you stop pretending the assistant is a stranger:

  • Saved workflows that run on a schedule.
  • MCP tool servers running locally that the agent can reach for, with your permission.
  • Optional shell tools for power users who opt in explicitly.
  • Browser automation that doesn't pretend to be a fresh user — it's you, logged in, with your sessions.
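For the scheduled-workflow idea, the local-first version is almost boring: a workflow is just a row in that SQLite database, and "on a schedule" is a due-check against it. A sketch under my own assumptions (the field names and interval model are invented for illustration):

```typescript
// Hypothetical saved-workflow record, as it might sit in a local DB row.
interface SavedWorkflow {
  name: string;
  intervalMinutes: number; // how often it should run
  lastRunMs: number;       // epoch ms of the last run; 0 if never run
}

// A workflow is due once its interval has elapsed since the last run.
function isDue(wf: SavedWorkflow, nowMs: number): boolean {
  return nowMs - wf.lastRunMs >= wf.intervalMinutes * 60_000;
}

// The scheduler loop would periodically ask: which workflows are due?
function dueWorkflows(all: SavedWorkflow[], nowMs: number): string[] {
  return all.filter((wf) => isDue(wf, nowMs)).map((wf) => wf.name);
}

const workflows: SavedWorkflow[] = [
  { name: "morning-digest", intervalMinutes: 60, lastRunMs: 0 },
  { name: "inbox-triage", intervalMinutes: 60, lastRunMs: 3_000_000 },
];
console.log(dueWorkflows(workflows, 3_600_000)); // [ "morning-digest" ]
```

No cloud cron, no remote queue: if the app is running, the schedule runs, and the run history lands in the same database as everything else.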

If you're a builder who wants Cursor-style depth for your browser and local data instead of another generic chat tab, you're who I'm building this for.

The waitlist is open on the landing page. I'll post here as it comes together.

If this was useful

I post short notes like this when I learn something building agents.