
Memory for intelligent systems. Trainable, versioned recall you integrate like reasoning, speech, or vision.

pip install 9dlabs

Memory, as models.

What we ship

One memory plane for intelligent systems that hold up over time.

[1] Persistent coherence

One intelligible past across sessions, failures, upgrades, and long runs. Prefer memory that lasts over a cache acting like history.

[2] One substrate, many runtimes

Same layer under agents, workers, and sim stacks. Integrate once instead of rebuilding retrieval and state plumbing in every project.

[3] Compound intelligence

Recall compounds through promotion, revision, and consolidation. Improve memory as the system runs; do not let retrieval flatten as the pile grows.
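A minimal sketch of what compounding recall can look like, using hypothetical names rather than the 9D API: items that keep winning retrievals get promoted to a durable tier, and near-duplicates get consolidated instead of piling up.

```python
from collections import Counter

class CompoundingStore:
    """Illustrative only: promotion and consolidation over a flat item store."""

    def __init__(self, promote_after=3):
        self.items = {}          # item_id -> text
        self.hits = Counter()    # retrieval wins per item
        self.durable = set()     # promoted tier
        self.promote_after = promote_after

    def record_hit(self, item_id):
        # Promotion: memory improves as the system runs.
        self.hits[item_id] += 1
        if self.hits[item_id] >= self.promote_after:
            self.durable.add(item_id)

    def consolidate(self, keep_id, dup_id):
        # Revision: fold a near-duplicate into the surviving item
        # so retrieval doesn't flatten as the pile grows.
        self.hits[keep_id] += self.hits.pop(dup_id, 0)
        self.items.pop(dup_id, None)
```

The key design choice is that promotion is earned by use, not assigned at write time.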

[4] Governed crossings

What enters context is chosen, bounded, and explainable. Policies and budgets steer autonomy instead of letting it drift.
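A minimal sketch of a governed crossing, with made-up names (this is not the 9D API): every candidate memory passes a policy gate and a token budget, and every exclusion gets a stated reason.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    text: str
    score: float   # relevance
    tokens: int    # cost to include
    policy: str    # access tag

def pack_context(candidates, allowed_policies, token_budget):
    """Choose what crosses into the model: policy-gated, budget-bounded,
    and explainable via per-item receipts."""
    chosen, receipts, spent = [], [], 0
    for c in sorted(candidates, key=lambda c: c.score, reverse=True):
        if c.policy not in allowed_policies:
            receipts.append((c.text, "blocked: policy"))
        elif spent + c.tokens > token_budget:
            receipts.append((c.text, "blocked: budget"))
        else:
            chosen.append(c)
            spent += c.tokens
            receipts.append((c.text, "crossed"))
    return chosen, receipts
```

The hard ceiling forces an honest choice: over-budget items are refused with a reason, never silently dropped.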

[5] Trust at machine speed

Show what crossed into the model and why. Ship lineage and receipts for changes, eligibility, and packing so ops can audit memory like any other production surface.

[6] Prove it, then ship it

Regression tests and comparisons before you bump models or policies. Memory failures are slow to notice and expensive to unwind.
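One way to make that concrete, sketched with hypothetical fixtures rather than any shipped tooling: gate the bump on a recall comparison between the current and candidate retrieval policies.

```python
def recall_at_k(retrieve, fixtures, k=3):
    """Fraction of fixture queries whose expected item appears in the top k."""
    hits = sum(1 for query, expected in fixtures
               if expected in retrieve(query)[:k])
    return hits / len(fixtures)

def safe_to_ship(old_retrieve, new_retrieve, fixtures, tolerance=0.0):
    """Block the bump if the candidate loses recall on the fixture set."""
    return (recall_at_k(new_retrieve, fixtures)
            >= recall_at_k(old_retrieve, fixtures) - tolerance)
```

Because memory regressions surface slowly, the fixture set is the cheap early warning: it fails in CI, not in production weeks later.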

Research Areas

Memory for intelligent systems, not just chatbots.

01 Memory for Agents (Shipping)

A unified memory field built for agents that are meant to run a long time—one deep representation space where what they learn can stack, sharpen, and stay coherent instead of dissolving into the next context window. Governable by design, so memory stays a lever you can shape: bounded, intentional, and legible when it matters.

  • Long-horizon recall that compounds—meaning, phrasing, and time live in the same plane so memory accretes instead of flattening into generic similarity
  • Hard ceilings with visible tradeoffs—token budgets force honest choices about what crosses into the model, not silent loss
  • Scoped isolation at scale—local, shared, and durable tiers with namespaces so teams and products stay separate without splitting the substrate
  • Decision receipts on what enters the prompt—operators can trace crossings end to end: what was considered, what won, and why

02 Memory for Video Generation (Research)

Temporal coherence memory for generative video models. Maintaining character identity, scene continuity, and physical state across hundreds of frames without drift.

  • Persistent entity representations across frame sequences
  • Scene-graph memory for spatial and relational consistency
  • Policy-gated injection into diffusion conditioning
  • Deterministic replay for debugging visual artifacts
  • Designed for Sora-class and open-source video pipelines

03 Memory for World Models (Research)

Persistent state layers for world-simulation agents. Enabling models to maintain and update beliefs about environments over long horizons — from game worlds to robotic planning.

  • Belief-state memory that updates with new observations
  • Multi-scale temporal abstraction: frames, episodes, lifetimes
  • Governed state transitions with audit trails
  • Integration with reinforcement learning and planning loops
  • Targeting embodied agents and open-ended environments
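A minimal sketch of belief-state memory that updates with new observations, here as a plain Bayes update over a discrete set of world states (the states and likelihoods are invented for illustration):

```python
def update_belief(belief, likelihood, observation):
    """belief: {state: prob}; likelihood(obs, state) -> P(obs | state).
    Returns the normalized posterior after conditioning on the observation."""
    posterior = {s: p * likelihood(observation, s) for s, p in belief.items()}
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}
```

Each observation refines the stored belief rather than replacing it, which is what lets state persist over long horizons instead of resetting per frame.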

Use Cases

Memory APIs

  • Store durable context once, retrieve it everywhere an agent works.
  • Keep recall scoped, traceable, and controlled by policy.
See Docs
memory.py
from nined.memory import Memory

# Authenticate and scope all reads and writes to one workspace.
memory = Memory(api_key="your-key", workspace="sre")

# Store durable context once, with a title and an access policy.
memory.add(
    "Restart pods on OOM, then page on-call.",
    title="P0 Runbook",
    policy="prod-read",
)

# Retrieve it anywhere the agent works; recall stays policy-scoped.
ctx = memory.search("How do I handle OOM?")

What teams say

Governed memory in production.

"9D makes memory feel like a primitive, not a pile of glue code. Context just works."

Founding AI Engineer

Autonomous systems team

"The audit trail changes how we ship. Recall is no longer invisible state."

Platform Lead

Enterprise AI group

"Memory as models is the missing interface between product context and model behavior."

Research Partner

Applied intelligence lab
