The Universal Context Compiler

Compile any document into AI-ready knowledge bases with built-in agent memory. One command. Zero friction.

Terminal
$ pip install auralith-aura
$ aura compile ./my_docs/ --output knowledge.aura --pii-mask
🔥 Aura Compiler v0.1.0
Scanning files... found supported files
PII masking: enabled
Compiling: [████████████████████████████] done
✅ Compiled → knowledge.aura
CAPABILITIES

Everything You Need

Universal Ingestion

60+ file formats. PDFs, DOCX, HTML, Markdown, code in 20+ languages, spreadsheets, emails, presentations.

Agent Memory

3-tier memory OS with instant writes. Pad, episodic, and fact tiers give your agents persistent context.

Instant RAG

Query any document by keyword or ID. Native wrappers for LangChain and LlamaIndex. Zero-config retrieval.

PII Masking

Automatically redact emails, phone numbers, and SSNs before compilation. Privacy by default.

Secure by Design

No pickle. No arbitrary code execution. Uses safetensors + msgpack. Safe to share.

Cross-Platform

macOS, Windows, and Linux. Pure Python with zero OS-specific dependencies.

ARCHITECTURE

One File. Two Superpowers.

The same .aura archive works for RAG retrieval and agent memory.

RAG: LangChain / LlamaIndex
from aura.rag import AuraRAGLoader

loader = AuraRAGLoader("knowledge.aura")

# Instant text retrieval
text = loader.get_text_by_id("doc_001")

# Framework wrappers
docs = loader.to_langchain_documents()
docs = loader.to_llama_index_documents()
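
The wrappers hand back framework-native documents, so they drop straight into an existing retrieval pipeline. A minimal sketch, assuming langchain-community and a local embedding model are installed; the FAISS store and HuggingFaceEmbeddings below are illustrative choices, not part of Aura:

from aura.rag import AuraRAGLoader
from langchain_community.vectorstores import FAISS
from langchain_community.embeddings import HuggingFaceEmbeddings

# Index the compiled archive's documents in a LangChain vector store
loader = AuraRAGLoader("knowledge.aura")
index = FAISS.from_documents(loader.to_langchain_documents(), HuggingFaceEmbeddings())

# Standard similarity search from here on
hits = index.similarity_search("how does auth work?", k=3)
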
Memory: 3-Tier Memory OS
from aura.memory import AuraMemoryOS

memory = AuraMemoryOS()

# Write to memory tiers
memory.write("fact", "User prefers dark mode")
memory.write("episodic", "Discussed API design")
memory.write("pad", "TODO: check auth module")

# Search across all memory
results = memory.query("user preferences")
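
Because one archive backs both sides, retrieval and memory compose naturally. A short sketch using only the calls shown above; the doc ID and the wiring are illustrative, not a prescribed workflow:

from aura.rag import AuraRAGLoader
from aura.memory import AuraMemoryOS

# Pull reference material from the compiled archive...
loader = AuraRAGLoader("knowledge.aura")
context = loader.get_text_by_id("doc_001")

# ...and record what the agent did with it
memory = AuraMemoryOS()
memory.write("episodic", "Answered an auth question using doc_001")
results = memory.query("auth")
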
INTEGRATIONS

Works With Your Stack

Native integrations for the AI agent platforms you already use.

Also works with: LangChain, LlamaIndex, and Memory OS
GET STARTED

Up and Running in 60 Seconds

01

Install

$ pip install auralith-aura
02

Compile

$ aura compile ./my_data/ --output knowledge.aura --pii-mask
03

Use

from aura.memory import AuraMemoryOS
memory = AuraMemoryOS()
memory.write("fact", "User prefers dark mode")
results = memory.query("preferences")
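
The Use step above covers the memory side; the same knowledge.aura also serves retrieval through the loader shown in the architecture section:

from aura.rag import AuraRAGLoader

loader = AuraRAGLoader("knowledge.aura")
docs = loader.to_langchain_documents()  # or to_llama_index_documents()
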
LOCAL-FIRST

Your Data Never Leaves Your Machine

Aura compiles entirely on your local hardware — no cloud uploads, no external APIs, no telemetry. Your documents stay yours.

Runs on Your Hardware

Compiles on any modern laptop or desktop. Your machine, your setup — no cloud dependency.

Fully Offline

Zero internet required after install. Compile sensitive documents without any data leaving your network.

Cross-Platform

macOS, Windows, and Linux. Python 3.8+ — runs wherever your team works.

SCALE UP

Need More? Meet OMNI

Aura handles local compilation. For enterprise-scale training pipelines, model fine-tuning, and production-grade agent infrastructure — there's OMNI.

The OMNI Platform

  • Cloud-scale data compilation & training pipelines
  • Supervised model fine-tuning with emphasis weighting
  • Production agent memory infrastructure
  • Team collaboration & enterprise compliance
Explore OMNI →

Ready to Compile Your Context?

Open source. Cross-platform. Zero friction.