
Karpathy's LLM Wiki vs. Waykee Cortex

April 2026 · 12 min read


The Problem Both Systems Solve

In April 2026, Andrej Karpathy — former AI Director at Tesla and co-founder of OpenAI — published llm-wiki.md, a pattern for building personal knowledge bases with LLMs. Its key insight resonated widely:

Karpathy's Insight "The correct way to use LLMs is not Q&A, it's compilation. The wiki is a persistent, compounding artifact. The cross-references are already there. The contradictions have already been flagged."

Both Karpathy's LLM Wiki and Waykee Cortex reject the dominant paradigm of treating AI as a search engine (ask a question → retrieve fragments → generate answer → forget everything). Both believe that knowledge should compound, not be re-discovered on every query.

But they take fundamentally different approaches to solving this problem. Understanding the difference helps teams choose the right tool — or combine both.

Architecture: Three Layers vs. Three Levels

Both systems are organized in three layers, but the nature of each layer differs significantly.

Karpathy's LLM Wiki

1. Raw Sources: immutable documents, papers, data
2. The Wiki: LLM-compiled markdown pages
3. The Schema: CLAUDE.md / AGENTS.md config

Waykee Cortex

1. Knowledge Hierarchy: System → Module → Screen
2. BotContext Engine: inherited context with metadata
3. Instruction Packs: reusable, assignable rule sets

Layer 1: Where Knowledge Lives

Karpathy's raw/ directory is a flat collection of immutable source documents — PDFs, articles, datasets, images. The human curates what goes in; the LLM never modifies these files. They are the "source of truth."

Waykee's Knowledge hierarchy organizes documentation in a structured tree: System → Module → Screen/Program. Each node has rich metadata (GitHub repos, ports, file paths), HTML documentation for humans, and plain-text BotContext optimized for AI consumption. Knowledge is not immutable — it's a living record that gets updated when the system it describes changes.

Key Difference Karpathy's sources are inputs to be compiled. Waykee's knowledge is the compiled output itself — maintained by both humans and agents, always reflecting the current state of what has been built.
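To make the hierarchy concrete, here is a minimal sketch of what a knowledge node could look like. The class and field names are illustrative assumptions for this article, not Waykee's actual schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class KnowledgeNode:
    """One node in a System -> Module -> Screen tree (illustrative only)."""
    name: str
    level: str                   # "system" | "module" | "screen"
    bot_context: str             # plain-text docs optimized for AI consumption
    metadata: dict = field(default_factory=dict)  # e.g. repo URL, port, paths
    parent: Optional["KnowledgeNode"] = None

# A tiny three-level hierarchy with metadata attached at the module level
system = KnowledgeNode("Billing", "system", "Handles invoicing and payments.")
module = KnowledgeNode("Invoices", "module", "Invoice CRUD and PDF export.",
                       {"repo": "https://example.com/invoices", "port": 5001},
                       system)
screen = KnowledgeNode("InvoiceList", "screen", "Paginated invoice grid.",
                       parent=module)
```

Because each node links to its parent, a screen can always reach its module's repo and port, and its system's docs, without duplicating them.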

Layer 2: How Knowledge Gets Delivered

Karpathy's Wiki layer is the crown jewel of his system. The LLM reads raw sources and compiles them into interlinked markdown pages — summaries, entity profiles, comparison tables, synthesis articles. A central index.md catalogs every page. When you ingest a new source, the LLM touches 10–15 wiki pages, updating cross-references and resolving contradictions.

Waykee's BotContext engine dynamically assembles context on demand. When an agent calls GetBotContext for any waykee (task, screen, module), the engine walks up the hierarchy, collecting documentation and metadata from each level. A single API call returns the complete context chain: the screen's docs, its parent module's docs (with GitHub repo and port), the parent system's docs, plus conversation history and instruction packs.

// Karpathy: Agent navigates wiki manually
1. Read index.md → find relevant page
2. Read page.md → follow cross-references
3. Read 2-3 more pages → synthesize answer
// Total: 3-5 file reads per query

// Waykee: One API call, full context
1. GetBotContext(waykeeId) → complete hierarchy + metadata + conversation + packs
// Total: 1 call. Context is pre-assembled.
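The hierarchy walk behind such a call can be sketched in a few lines. This is a self-contained simplification, not Waykee's implementation: nodes are plain dicts linked by a parent reference, and the "engine" just climbs to the root collecting docs.

```python
# Illustrative nodes linked by "parent"; names and docs are made up.
system = {"name": "Billing", "docs": "Handles invoicing.", "parent": None}
module = {"name": "Invoices", "docs": "Invoice CRUD.", "parent": system}
screen = {"name": "InvoiceList", "docs": "Paginated grid.", "parent": module}

def get_bot_context(node):
    """Climb parent links, collecting every level's docs, root-first."""
    chain = []
    while node is not None:
        chain.append(node["docs"])
        node = node["parent"]
    return list(reversed(chain))

context = get_bot_context(screen)
# Root-first order: system docs, then module docs, then screen docs
```

The agent asks about one screen and receives the whole inheritance chain in a single call, which is the core difference from navigating wiki pages one read at a time.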

Layer 3: How Behavior Gets Configured

Karpathy's Schema (CLAUDE.md or AGENTS.md) is a configuration file that defines wiki structure, conventions, and workflows. It co-evolves with the human over time — a single document that shapes how the LLM interacts with the wiki.

Waykee's Instruction Packs are modular, reusable instruction sets that can be independently assigned to different agents. A "Developer Guidelines" pack, a "UTC DateTime Rules" pack, and a "Deployment Discipline" pack can be mixed and matched. Agent A gets packs 1 and 3; Agent B gets packs 1 and 2. Each pack is version-controlled and editable without affecting the others.

In Practice Karpathy's schema is like a single .env file for the whole wiki. Waykee's packs are like microservices — independently deployable units of behavior that compose together.
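Pack composition can be sketched as simple concatenation of independently stored rule sets. The pack names and contents below are invented for illustration; the point is that each agent's behavior is assembled from shared, reusable parts.

```python
# Hypothetical instruction packs; contents are illustrative.
PACKS = {
    "dev_guidelines": "Follow the team's code review checklist.",
    "utc_datetime": "Always store and compare timestamps in UTC.",
    "deployment": "Never deploy without an approved rollback plan.",
}

def compose_instructions(pack_ids):
    """Join the assigned packs into one instruction preamble for an agent."""
    return "\n\n".join(PACKS[p] for p in pack_ids)

# Agent A gets packs 1 and 3; Agent B gets packs 1 and 2
agent_a = compose_instructions(["dev_guidelines", "deployment"])
agent_b = compose_instructions(["dev_guidelines", "utc_datetime"])
```

Editing the "utc_datetime" pack changes Agent B's behavior on the next composition without touching Agent A, which is exactly the microservices-style independence the analogy above describes.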

Operations: Ingest/Query/Lint vs. Context/Work/Document

Karpathy defines three core operations for his wiki: Ingest (process new sources), Query (search and synthesize answers), and Lint (health-check the wiki for contradictions and gaps).

Waykee's workflow follows a different cycle: Get Context (receive full hierarchy), Do Work (write code, create tasks, send messages), and Update Documentation (reflect changes in Knowledge so future agents inherit the current state).
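The key property of this cycle is that step three feeds step one of the next agent. A toy in-memory sketch (all names are illustrative, and the docs store here is just a dict):

```python
# Documentation store; starts with the state a first agent would inherit.
DOCS = {"screen-42": "InvoiceList shows 25 rows per page."}

def get_context(waykee_id):
    """Step 1: receive the current documented state."""
    return DOCS[waykee_id]

def update_docs(waykee_id, note):
    """Step 3: reflect the change so future agents inherit it."""
    DOCS[waykee_id] += " " + note

# Agent 1: get context, do work, then document the change
ctx = get_context("screen-42")
update_docs("screen-42", "Pagination changed to 50 rows per page.")

# Agent 2 inherits the updated state without re-discovering it
ctx2 = get_context("screen-42")
```

Skipping the documentation step breaks the loop: the next agent would inherit stale context and re-derive what was already done.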

Adding knowledge
  Karpathy: Drop a file in raw/ and run Ingest; the LLM compiles it into wiki pages.
  Waykee: Create or update documentation in the hierarchy; BotContext auto-inherits it.

Querying
  Karpathy: The agent reads index.md, navigates pages, and synthesizes an answer.
  Waykee: One API call (GetBotContext) returns full context; RAG semantic search also works across all waykees.

Quality control
  Karpathy: A periodic Lint operation finds contradictions, orphan pages, and stale claims.
  Waykee: The hierarchy enforces structure; dual-parent relations track history; orphan pages are impossible.

Cross-referencing
  Karpathy: The LLM maintains markdown links between wiki pages during ingest.
  Waykee: Parent-child relations are explicit (stored in the DB); inheritance is automatic, with no broken links.

Collaboration
  Karpathy: Single user (personal knowledge base), shared via a git repo.
  Waykee: Multi-user with permissions; humans and bots are first-class participants; subscription-based access.

Task management
  Karpathy: Not included; the wiki is knowledge-only.
  Waykee: Integrated; tasks live alongside documentation with dual parents (Knowledge + Work dimensions).

Communication
  Karpathy: Not included.
  Waykee: Built-in chat per waykee; messages become part of context; email integration for external users.

Persistence
  Karpathy: Markdown files on disk, version-controlled via git.
  Waykee: SQL Server with a full audit trail, version history on documentation, and an embedding-based RAG index.

Philosophy: Personal Compiler vs. Team Context Engine

The deepest difference is philosophical. Karpathy's LLM Wiki is a personal knowledge compiler. It's designed for one person (or one agent) to accumulate and synthesize knowledge over time. The human curates sources; the LLM compiles and maintains the wiki. It's elegant, minimal, and self-contained.

Waykee Cortex is a team context engine. It's designed for organizations where multiple humans and multiple AI agents collaborate on shared work. Knowledge isn't compiled from raw sources — it's constructed through work and maintained as part of the workflow.

The Construction Metaphor Waykee uses a deliberate analogy: Knowledge is the construction (the finished building — clean, navigable, presentable). Work is the construction site (scaffolding, dust, attempts — necessary but temporary). When a task finishes, it's closed. The result stays in Knowledge. Karpathy's wiki is closer to a personal notebook that gets continuously refined.

Where Karpathy Shines

- Solo developers and researchers accumulating personal expertise
- Minimal setup: markdown files, a git repo, and an LLM; minutes to start
- No infrastructure to run; the only cost is LLM tokens

Where Waykee Cortex Shines

- Teams where multiple humans and AI agents need shared, permissioned context
- Integrated task management and per-waykee chat alongside documentation
- One API call for full inherited context, plus RAG semantic search

Can You Use Both?

Absolutely. The two systems address different scopes and can complement each other. A developer might, for example, keep a personal Karpathy-style wiki for research notes while the team's shared systems, tasks, and agent context live in Waykee Cortex.

Technical Comparison

Storage
  Karpathy: Markdown files on disk.
  Waykee: SQL Server + Azure Blob Storage.

Search
  Karpathy: The LLM navigates index.md and page links.
  Waykee: RAG semantic search (OpenAI embeddings) plus hierarchy traversal.

Access control
  Karpathy: Filesystem permissions / git access.
  Waykee: Per-waykee subscription model with admin roles.

Real-time
  Karpathy: None (batch processing).
  Waykee: SignalR for live updates and bot streaming.

API
  Karpathy: None (file-based).
  Waykee: REST API (90+ endpoints) + MCP server.

Platforms
  Karpathy: Any text editor + LLM (Obsidian recommended).
  Waykee: Web + mobile (iOS/Android) + API.

Setup time
  Karpathy: Minutes (copy the gist, start ingesting).
  Waykee: Hours (deploy the server, configure the hierarchy).

Cost
  Karpathy: LLM API tokens only.
  Waykee: Server infrastructure + LLM tokens.

License
  Karpathy: Public gist (open).
  Waykee: AGPL v3 (open source).
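The "Search" row above is the most load-bearing difference, and the mechanism behind RAG semantic search is simple to sketch: embed documents and queries as vectors, then rank by cosine similarity. The vectors below are toy three-dimensional stand-ins; a real system would obtain them from an embedding model.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy pre-computed embeddings; names and values are illustrative.
INDEX = {
    "Billing system docs": [0.9, 0.1, 0.0],
    "Invoice screen docs": [0.8, 0.3, 0.1],
    "HR module docs": [0.1, 0.9, 0.2],
}

def semantic_search(query_vec, k=2):
    """Return the k document names most similar to the query vector."""
    ranked = sorted(INDEX.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

results = semantic_search([0.8, 0.3, 0.1])
```

Ranking by meaning rather than by exact keywords is what lets an agent find the relevant waykee even when the query uses different wording than the documentation.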

Conclusion

Karpathy's LLM Wiki and Waykee Cortex share a conviction: knowledge should compound, not be re-discovered. They diverge on scope. Karpathy built a beautiful pattern for a single mind. Waykee built an engine for entire teams.

If you're a solo developer or researcher accumulating expertise, Karpathy's pattern is hard to beat for simplicity and elegance. If you're running a team where multiple humans and AI agents need shared context, permissions, task tracking, and integrated communication — that's where Waykee Cortex earns its complexity.

The real insight from both systems is the same: stop treating AI as a search engine. Build persistent knowledge structures. Let the AI maintain them. Whether it's a compiled wiki or a hierarchical context engine, the future of AI knowledge management is compilation, not retrieval.

Try Waykee Cortex

Open source, model-agnostic hierarchical context engine for AI agents. One API call, full organizational context.

Register for Early Access →