Vault: The Context Engine
A local-first neural network for your life. Vault progressively synthesizes your notes, code, and documents into a living intelligence layer.
Radical Privacy
Local-first architecture means your thoughts never leave your machine unless you say so. Zero telemetry. Zero cloud dependency.
Progressive Context Synthesis
Every note, highlight, and tag compounds into deeper AI understanding, building a continuously evolving second brain.
Multi-Model Intelligence
Bring your own model. Vault provides the context layer for OpenAI, Claude, Gemini, or local LLMs like Ollama, giving them access to your private knowledge.
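The context layer above can be pictured as a small, model-agnostic prompt builder: gather relevant local notes and prepend them to the user's question before it goes to any backend. This is an illustrative sketch only; the names (`Note`, `build_prompt`) are hypothetical, not Vault's actual API.

```rust
// Hypothetical sketch of a context layer: the same assembled prompt
// can be sent to OpenAI, Claude, Gemini, or a local Ollama model.
struct Note {
    title: String,
    body: String,
}

/// Concatenate local notes into a context block, then append the question.
fn build_prompt(notes: &[Note], question: &str) -> String {
    let mut prompt = String::from("Context from your vault:\n");
    for note in notes {
        prompt.push_str(&format!("## {}\n{}\n", note.title, note.body));
    }
    prompt.push_str(&format!("\nQuestion: {}", question));
    prompt
}

fn main() {
    let notes = vec![Note {
        title: "Meeting".into(),
        body: "Ship the beta by Q3.".into(),
    }];
    let prompt = build_prompt(&notes, "When does the beta ship?");
    // The private context travels with the question, whatever the backend.
    println!("{prompt}");
}
```

Because the backend only ever sees the assembled string, swapping models is a transport concern, not a data concern.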
The Stack
Built for performance, stability, and longevity.
Frontend Layer
- Tauri: Native desktop app with web technologies
- React + TypeScript: Type-safe UI components
- TailwindCSS: Utility-first styling system
Core Engine
- Rust: Memory-safe, high-performance core
- Axum + Tokio: Async runtime for scalability
- Candle: Local LLM inference
Storage Layer
- Local Files: Your data stays on your machine, accessible via the file system
- Private Storage: Encrypted, user-controlled data stores
- Markdown & Structured Data: Plain-text knowledge storage with semantic structure
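A note in this storage layer might look like ordinary Markdown with a small structured header carrying the semantic metadata. The layout and field names below are illustrative, not a documented Vault schema:

```markdown
---
title: Q3 Planning
tags: [roadmap, beta]
---

# Q3 Planning

Ship the **beta** by end of quarter. See [[Release Checklist]].
```

Because the file is plain text, it stays readable without Vault, satisfying the longevity goal above.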
Designed for professionals.
Strategic Analysis
Connect disparate pieces of information across your entire knowledge base to reveal hidden patterns and actionable insights.
Workflow
1. Ingest market reports, meeting notes, and data
2. Vault maps semantic relationships between entities
3. Identify trends invisible in isolated documents
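Step 2 of the workflow above, mapping semantic relationships, can be sketched with a simple relatedness score between documents. A real engine would likely use embedding vectors (e.g. from Candle); to keep this dependency-free, the sketch approximates with term-frequency cosine similarity, and all function names are illustrative:

```rust
use std::collections::HashMap;

// Count word occurrences, lowercased, as a crude document vector.
fn term_freq(text: &str) -> HashMap<String, f64> {
    let mut tf = HashMap::new();
    for word in text.split_whitespace() {
        *tf.entry(word.to_lowercase()).or_insert(0.0) += 1.0;
    }
    tf
}

// Cosine similarity between two sparse term-frequency vectors.
fn cosine(a: &HashMap<String, f64>, b: &HashMap<String, f64>) -> f64 {
    let dot: f64 = a.iter().filter_map(|(w, x)| b.get(w).map(|y| x * y)).sum();
    let norm = |m: &HashMap<String, f64>| m.values().map(|x| x * x).sum::<f64>().sqrt();
    let denom = norm(a) * norm(b);
    if denom == 0.0 { 0.0 } else { dot / denom }
}

fn main() {
    let report = term_freq("chip demand rising in automotive sector");
    let note = term_freq("automotive clients report chip shortages");
    let memo = term_freq("quarterly budget review");
    // The market report relates to the meeting note, not the budget memo.
    println!("{:.3} vs {:.3}", cosine(&report, &note), cosine(&report, &memo));
}
```

Documents that never mention each other still connect through shared vocabulary, which is how trends invisible in any single file become visible across the vault.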
Creative Synthesis
Overcome writer's block by using your past thoughts as raw material for new ideas. Your vault becomes an active partner in creation.
Workflow
1. Query your vault for related concepts
2. Generate drafts based on your unique perspective
3. Refine ideas with context-aware suggestions
Start building your vault.
Join the beta. Free for personal use. Open source forever.