Overview

Mengram gives your AI three distinct memory types, inspired by how human memory works:
Type         Stores                             Example
Semantic     Facts, knowledge, preferences      "User prefers dark mode and uses Python 3.12"
Episodic     Events, experiences, interactions  "Fixed an OOM bug on Jan 15 by reducing pool size"
Procedural   Workflows, processes, skills       "How to deploy: 1) run tests, 2) build, 3) push to main"
When you call m.add(messages), all three types are extracted automatically from the conversation.
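For context, here is a minimal sketch of what such a call might look like, assuming a Mengram client `m` is already initialized and that add() takes chat-style role/content message dicts (an assumption — check the client docs for the exact schema):

```python
# Hypothetical sketch: chat-style role/content dicts are assumed, not confirmed.
messages = [
    {"role": "user", "content": "I prefer dark mode and I'm on Python 3.12."},
    {"role": "assistant", "content": "Noted. I'll keep that in mind."},
]

# One call; semantic, episodic, and procedural memories are extracted automatically.
# m.add(messages)   # requires an initialized Mengram client `m`
```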

Semantic Memory

The knowledge graph. Entities with facts, types, and relationships. This is the core memory layer.
# Search semantic memory
results = m.search("user preferences")
# Returns entities with facts and scores

# Get a specific entity
entity = m.get("PostgreSQL")
# {"name": "PostgreSQL", "type": "technology", "facts": [...]}

Episodic Memory

Autobiographical events — what happened, when, with whom, and what the outcome was. Each episode has a summary, context, outcome, and participant list.
# Search episodes
events = m.episodes(query="deployment issues")
# [{"summary": "Fixed OOM on Railway", "outcome": "Resolved by reducing pool", ...}]

# List recent episodes
recent = m.episodes(limit=10)

# Time-range filter
jan_events = m.episodes(after="2026-01-01", before="2026-02-01")

Procedural Memory

Learned workflows and processes. Mengram extracts step-by-step procedures from conversations and tracks which ones work and which fail.
# Search procedures
procs = m.procedures(query="deploy")
# [{"name": "Deploy to Railway", "steps": [...], "success_count": 5}]

# Report success/failure — triggers experience-driven evolution
m.procedure_feedback(proc_id, success=True)

# On failure with context, the procedure evolves automatically
m.procedure_feedback(proc_id, success=False,
    context="Step 3 failed: OOM on build",
    failed_at_step=3)

# View how a procedure evolved over time
history = m.procedure_history(proc_id)
# {"versions": [v1, v2, v3], "evolution_log": [...]}

Graph RAG

Memories aren’t flat — they’re connected in a knowledge graph. When you search, Mengram doesn’t just find direct matches. It traverses relationships to surface context that simple vector search misses.
Query: "Python"
  → Direct match: Python (technology)
    → 1 hop: Django (framework) — "used_with" → Python
      → 2 hop: Railway (platform) — "deployed_on" → Django
This 2-hop traversal means searching for “Python” also returns your Django deployment setup on Railway — context that a flat vector search would never find. Control traversal depth with graph_depth:
m.search("Python", graph_depth=0)  # direct matches only
m.search("Python", graph_depth=2)  # default — 2 hops
m.search("Python", graph_depth=4)  # deep traversal
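To make the traversal concrete, here is a toy sketch (not Mengram's internal implementation) of a breadth-first walk over an adjacency list, mirroring the Python → Django → Railway example above. The graph structure and function name are illustrative:

```python
from collections import deque

# Toy knowledge graph: entity -> list of (relation, neighbor) edges.
GRAPH = {
    "Python": [("used_with", "Django")],
    "Django": [("deployed_on", "Railway")],
    "Railway": [],
}

def traverse(start, graph, max_depth=2):
    """Breadth-first expansion: entities reachable within max_depth hops."""
    seen = {start: 0}          # entity -> hop distance from the query match
    queue = deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == max_depth:
            continue           # don't expand beyond the depth budget
        for _relation, neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen[neighbor] = depth + 1
                queue.append((neighbor, depth + 1))
    return seen

print(traverse("Python", GRAPH, max_depth=2))
# {'Python': 0, 'Django': 1, 'Railway': 2}
```

With max_depth=0 only the direct match survives, which is why graph_depth=0 behaves like flat vector search.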

Ebbinghaus Decay

Mengram models memory decay like the human brain. Facts that haven’t been accessed recently fade in importance, while frequently recalled facts get stronger. The formula:

effective_importance = base_importance * e^(-0.03 * days_since_access) + frequency_boost
  • A fact accessed yesterday has full weight
  • A fact untouched for 30 days decays to ~40% weight
  • A fact accessed 100 times gets a frequency boost regardless of recency
This means search results naturally prioritize fresh, relevant knowledge over stale information — without manual cleanup. The curator agent also periodically finds and archives facts that have become outdated or contradicted by newer information.
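The formula above can be sketched directly; the 0.03 decay rate is from the text, and frequency_boost is left as a caller-supplied term since its exact computation isn't documented here:

```python
import math

def effective_importance(base_importance, days_since_access, frequency_boost=0.0):
    """Ebbinghaus-style decay: importance fades exponentially with inactivity."""
    return base_importance * math.exp(-0.03 * days_since_access) + frequency_boost

# A fact accessed yesterday keeps nearly full weight...
print(round(effective_importance(1.0, 1), 3))   # 0.97
# ...while one untouched for 30 days decays to ~40%.
print(round(effective_importance(1.0, 30), 3))  # 0.407
```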

Confidence Scoring

Procedural memory uses confidence scoring to decide when to create new procedures:
Confidence   Action
< 0.4        Skip — not enough evidence
0.4 – 0.6    Create a suggestion trigger — user decides
>= 0.6       Auto-create the procedure
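The decision rule in the table maps to a small branch. A sketch, where the threshold values come from the table and the function name is illustrative:

```python
def procedure_action(confidence):
    """Map a clustering confidence score to the action from the table above."""
    if confidence < 0.4:
        return "skip"         # not enough evidence
    if confidence < 0.6:
        return "suggest"      # create a suggestion trigger; user decides
    return "auto_create"      # enough support to create the procedure

print(procedure_action(0.3))  # skip
print(procedure_action(0.5))  # suggest
print(procedure_action(0.8))  # auto_create
```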
Confidence is calculated from episode clustering — how many similar events support the pattern. This prevents noisy one-off events from becoming permanent workflows.

Unified Search

Search all three memory types at once with a single call:
results = m.search_all("deployment problems")
# {
#     "semantic": [...],    # knowledge graph entities
#     "episodic": [...],    # related events
#     "procedural": [...]   # relevant workflows
# }