WL
WorkBox Labs
Local AI · Optical Containment
Independent AI R&D

A workstation-based AI worker for coding, analysis, documentation, and real business operations.

WorkBox Labs explores how far local AI can go when it has the freedom to think, while running on hardware you own and taking only actions you can approve or reject in plain language.

Local-only workloads
No hidden network channels
Human-approved actions
Optical one-way output (GlassBox)

What WorkBox Labs is building

The long-term goal is simple: show that serious, high-value work can be done by AI that lives on your own machines, with transparent limits and visible, reversible actions.

WorkBox AI
A workstation-based AI worker for coding, analysis, documentation, and real business operations — all on hardware you own.
Workstation

WorkBox AI behaves more like a focused software engineer and operations assistant than a chatbot.

It streams its desktop, takes on structured tasks, and works end-to-end on code, documents, prototypes, or planning workflows, with every input and file optically ingressed, sanitized, and under your control.

GlassBox Isolator
A research prototype that keeps AI behind glass with a single, visible output channel.
Research

The GlassBox Isolator runs a powerful local model with no network, no keyboard, and no mouse. The model can “think” freely, but it can only propose actions by emitting a Human Accountability Ticket (HAT) as a QR code.

A separate console decodes each HAT, shows the full plan, and requires human approval before anything runs on a separate machine. Every step is logged.
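As an illustrative sketch only, a HAT could be a small JSON document carrying the proposed action, its full parameters, a plain-language rationale, and an integrity digest, ready to be rendered as a QR code. The field names below are assumptions for illustration, not the actual HAT schema:

```python
import hashlib
import json

def make_hat(action, args, rationale):
    """Build a hypothetical Human Accountability Ticket (HAT).

    Field names here are illustrative assumptions, not the real schema.
    """
    body = {
        "version": 1,
        "action": action,        # what the model wants to run
        "args": args,            # fully spelled-out parameters
        "rationale": rationale,  # plain-language justification
    }
    # A digest lets the console verify the payload survived the optical hop.
    payload = json.dumps(body, sort_keys=True)
    body["digest"] = hashlib.sha256(payload.encode()).hexdigest()
    return json.dumps(body)

hat = make_hat(
    action="write_file",
    args={"path": "notes/plan.md", "bytes": 512},
    rationale="Save the draft project plan for review.",
)
# The resulting JSON string is what would be encoded into a QR burst.
```

Because the only channel out of the Isolator is optical, everything the console needs, including the justification a human reads before approving, has to fit inside the ticket itself.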

Freedom for the model. Boundaries for the system.

Today’s AI tends to live at two extremes: tightly sandboxed tools that can’t help much, or highly connected agents you can’t fully see. WorkBox Labs explores a middle path: powerful local AI workers, wrapped in visible, human-first safety rails.

  • Local-first, not cloud-first. Your data, code and files stay on machines you control.
  • Visible actions. GlassBox uses HATs, so every action is described in plain JSON and shown in the console UI before it happens.
  • Human approval baked in. The console shows previews, tiers, and audit trails so you decide what runs.
  • No hidden backchannels. The Isolator exposes only a one-way optical output — there is no inbound control path.
  • Designed for real work. WorkBox AI focuses on end-to-end tasks, not just conversation snippets.
Where things stand

The current prototypes already generate HATs, display QR bursts, decode them on the console, and run approved actions with logging in place.

Next milestones include refining the WorkBox AI worker loop, polishing the GlassBox demo site, and shaping early, carefully scoped pilot use cases.

GlassBox demo: live
WorkBox AI: prototype
Pilots: in planning