
The first ever offline LLM code editor.

Yee is the first free offline LLM code editor: built for engineers who want Cursor-class and Windsurf-class AI workflows without sending private conversations, client code, or product IP through somebody else’s hosted LLM stack. It runs on 100% open-source local/offline LLMs, and it is not a VS Code fork. Because there is no default phone-home model loop, it is designed for HIPAA, SOC 2, and internal compliance workflows.

Pricing comparison

Cursor and Windsurf charge. Yee is free.

Cursor and Windsurf each currently list individual paid plans from $20 to $200/mo, Teams at $40/user/mo, and custom Enterprise pricing. Yee is built to run local/offline open-source LLMs for $0/mo.

Yee
$0/mo

The first free offline/local LLM editor: built around open-source models; not a VS Code fork.

Cursor
$20 to $200

Current individual paid tiers: Pro, Pro+, Ultra. Teams is $40/user/mo; Enterprise is custom.

Windsurf
$20 to $200

Current individual paid tiers: Pro and Max. Teams is $40/user/mo; Enterprise is custom.

Capabilities

What Yee is built for.

A coding environment where the assistant is powerful because it is close to your repo, not because your repo has to leave.

01

Local by default

No model relay required


Yee is designed so prompts, files, repo context, and agent conversations stay on your hardware when you use local models. No remote LLM vendor needs to see your IP for the editor to work.

02

Open-source LLM stack

100% local/offline models


Use open-source LLMs you can run, inspect, swap, and pin. The model layer is not a black box subscription meter wrapped around your engineering brain.

03

Not a VS Code fork

Built as Yee


Yee is not another VS Code shell with AI glued to the side. The editor experience is built around local agents, offline context, and ownership from the start.

04

Offline agent loops

Read · plan · edit · verify


Run coding loops that can inspect your workspace, propose edits, and verify changes without routing your codebase through someone else’s hosted model workflow.
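The read · plan · edit · verify loop can be sketched in a few lines. This is an illustrative shape only, not Yee's actual API: `local_model` is a stub standing in for any locally hosted LLM, and the "edit" step is hardcoded so the example runs with no model installed.

```python
from pathlib import Path

def local_model(prompt: str) -> str:
    # Stand-in for a call to a locally hosted LLM (e.g. via llama.cpp).
    # Returns a fixed plan so this sketch runs without any model.
    return "replace the placeholder TODO with DONE"

def agent_loop(workspace: Path) -> bool:
    """One read -> plan -> edit -> verify pass, entirely on local disk."""
    files = sorted(workspace.glob("*.py"))
    # Read: gather repo context without it leaving the machine.
    context = "\n\n".join(f.read_text() for f in files)
    # Plan: the local model proposes an edit. In a real loop the plan
    # would drive the edit below; here the edit is hardcoded.
    plan = local_model(f"Context:\n{context}\nPropose one edit.")
    # Edit: apply the proposed change.
    for f in files:
        f.write_text(f.read_text().replace("TODO", "DONE"))
    # Verify: confirm every edited file still parses before accepting.
    for f in files:
        compile(f.read_text(), str(f), "exec")
    return True
```

The point of the shape is that every step (context gathering, model call, edit, verification) touches only local disk, so nothing in the loop requires a remote endpoint.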

05

IP stays yours

Your repo, your machine


Yee exists for the work you do not want in third-party LLM conversations: client code, internal algorithms, prototypes, security logic, and hard-won product ideas.

06

Compliance-capable

HIPAA · SOC 2 · internal controls


No phone-home model loop means teams can develop against sensitive code and data without accidentally routing protected information through external AI conversations.

07

Free beats seat tax

$0/mo


Cursor and Windsurf both have free tiers, then paid plans for serious usage. Yee’s pitch is sharper: local open-source LLMs, no monthly editor tax, and no upsell just to keep coding.

Use cases

Where developers reach for Yee first.

The more valuable the code, the less comfortable it feels to discuss it through someone else’s model pipeline.

Client code that cannot leak

Work on private client repos, regulated logic, HIPAA-sensitive workflows, SOC 2-controlled systems, and non-public product ideas without needing to ship the conversation to a remote model provider.

Founder IP and prototypes

Use local models for the early ideas that matter most: the weird edge case, the new algorithm, the architecture sketch, the thing nobody else should see yet.

Offline engineering sessions

Code on planes, in locked-down environments, or anywhere the network is hostile, slow, expensive, or simply not part of the trust boundary.

Transparent model control

Pick open-source models, pin versions, swap them when they improve, and understand what is running instead of treating the assistant as a mystery endpoint.
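One minimal way to make "pin versions" concrete: treat a pin as the expected hash of the weights file, and refuse to load weights whose digest has drifted. A hypothetical sketch (the function names are illustrative, not Yee's interface):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Stream the file in 1 MiB chunks so multi-GB model weights
    # never need to fit in RAM.
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_pin(path: Path, expected_sha256: str) -> bool:
    # A "pin" is just the recorded digest of the exact weights you
    # vetted; a swap or silent update changes the digest and fails here.
    return sha256_of(path) == expected_sha256
```

Because the model is a local file rather than a mystery endpoint, this kind of check is possible at all: you always know byte-for-byte what is answering your prompts.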

Free local coding

Stop renting privacy from your editor.

Yee is for developers who want the AI coding loop without the corporate model relay, subscription ladder, or VS Code fork baggage—so protected work can stay inside the compliance boundary.