Available on npm, VS Code Marketplace & Homebrew

Your AI never
forgets again

Remb is the persistent memory layer for AI coding sessions. Context, decisions, and patterns survive across conversations — so every session picks up right where the last one ended.

Terminal — remb

$ remb init

✔ Project "my-app" registered

✔ GitHub connected, scanning 132 files...

✔ 24 features extracted, 8 core memories saved

$ remb context

→ 8 core memories loaded

→ 3 recent conversations restored

→ Architecture: Next.js + tRPC + Prisma

$ AI now has full project context ✨

The Problem

Every chat starts from zero

AI assistants lose all context between conversations. You repeat yourself, re-explain architecture, and watch your AI make the same mistakes you already corrected.

Repeated Explanations

"This project uses Next.js App Router with server actions..." — typed for the 50th time in a new chat window.

Lost Decisions

You spent 2 hours deciding on a state management approach. New session? AI suggests the pattern you explicitly rejected.

No Project Awareness

AI doesn't know your folder structure, naming conventions, or which libraries you use. Every answer is generic.

Broken Continuity

Yesterday you built the auth system together. Today the AI asks "what authentication approach would you like to use?"

Remb solves all of this. Automatically.

Features

Everything your AI needs to remember

Core

Tiered Persistent Memory

Memories are organized in three layers: Core knowledge loads automatically every session, Active context surfaces on-demand when relevant, and Archive stores historical decisions for long-term recall. Nothing is ever lost.
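As an illustration, a single tiered memory record might look like the following (a hypothetical shape; the field names are our guesses, not Remb's actual schema):

```json
{
  "project": "my-app",
  "feature": "auth",
  "tier": "core",
  "content": "Using PKCE OAuth with refresh token rotation",
  "createdAt": "2025-01-15T10:32:00Z"
}
```

Core entries load on every session start; active and archive entries surface only when relevant.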

core · active · archive

Codebase Scanning

Connects to GitHub, scans your repo, and extracts features, dependencies, and architectural patterns automatically.

Conversation Continuity

Every session is logged — what was discussed, what was built, key decisions. Your next session starts with full history.

CLI

Full-Featured CLI

Written in Go for instant startup. Scan repos, manage memories, save context, link features, start the MCP server — all from your terminal. Ships as a single binary with zero runtime dependencies.

remb init · remb scan · remb context · remb memory · remb save · remb serve

Cross-Project Search

Search memories and patterns across all your projects. Say "do it like in project X" and your AI pulls the relevant context instantly.
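Over MCP, a cross-project search could be issued as a standard `tools/call` request. A sketch (the tool name `remb_searchMemories` and both argument names are hypothetical; only the JSON-RPC envelope is defined by the MCP spec):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "remb_searchMemories",
    "arguments": {
      "query": "rate limiting middleware",
      "scope": "all-projects"
    }
  }
}
```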

Secure by Default

API keys stored with chmod 600. OAuth PKCE for browser login. Scoped tokens per project. Your context is yours — never shared across accounts.
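The chmod 600 pattern means the key file is readable and writable only by its owner. A quick sketch of the same pattern you can try yourself (the actual path Remb writes to is not documented here, so this uses a local demo file):

```shell
# Create a file with owner-only permissions, the same
# mode Remb uses for stored API keys (chmod 600).
touch credentials
chmod 600 credentials

# On Linux, stat -c '%a' prints the octal permission mode.
stat -c '%a' credentials   # prints: 600
```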

Model Context Protocol

Native MCP integration

MCP is the open protocol that lets AI assistants connect to external tools and data. Remb exposes your project memory as a first-class MCP server — meaning Claude, Cursor, Windsurf, and any MCP-compatible client can access your context natively.

20+ MCP Tools Exposed

Load context bundles, search memories, log conversations, trigger scans, manage projects — all as MCP tool calls your AI can invoke autonomously.
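Under the hood, each of these is a standard MCP `tools/call` request. A sketch of invoking the context loader (the tool name `remb_loadProjectContext` appears in Remb's own session flow; the empty `arguments` object is an assumption):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "remb_loadProjectContext",
    "arguments": {}
  }
}
```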

Remote SSE Server

Access via https://mcp.useremb.com/sse — no local binary needed. Just add the URL to your MCP client config and you're connected.

Local stdio Mode

Run `remb serve` for a stdio MCP server — faster, offline-capable, and compatible with any MCP client that supports local processes.

Auto-Session Protocol

On every session start, Remb auto-loads project context and conversation history. Your AI starts every chat already knowing your codebase.

Claude Desktop / Cursor

claude_desktop_config.json
{
  "mcpServers": {
    "remb": {
      "url": "https://mcp.useremb.com/sse"
    }
  }
}

Local stdio (offline)

mcp config
{
  "mcpServers": {
    "remb": {
      "command": "remb",
      "args": ["serve", "--project", "my-app"]
    }
  }
}
Claude Desktop · Cursor · Windsurf · VS Code Copilot · Any MCP Client

How It Works

Three steps to permanent context

01

Connect & Scan

Point Remb at your GitHub repo. It analyzes your entire codebase — folder structure, frameworks, dependencies, feature boundaries, and architectural patterns. The result is a structured knowledge graph of your project.

$ remb init

✔ Connected to github.com/you/my-app

✔ Scanning 247 files across 12 directories...

✔ Extracted 31 features, 5 service boundaries

✔ Identified: Next.js 15, Prisma, tRPC, Tailwind

02

Learn & Remember

As you code with AI, Remb captures decisions, patterns, and context. Memories are tiered by importance — critical architectural decisions in core, feature-specific notes in active, historical context in archive.

$ remb save -f auth -c "Using PKCE OAuth with refresh token rotation"

✔ Saved to auth (core tier)

$ remb memory list --tier core

1. Auth: PKCE OAuth with refresh rotation

2. DB: Prisma with connection pooling via PgBouncer

3. State: Zustand for client, server actions for mutations

03

Auto-Load Every Session

When your AI starts a new conversation, Remb's MCP server automatically injects your project context, recent conversation history, and all relevant memories. No copy-pasting, no re-explaining — your AI already knows.

$ # AI's first action in every new chat:

→ remb_loadProjectContext()

Loading 8 core memories...

Loading 3 recent conversations...

Loading feature map (31 features)...

→ AI is now fully context-aware ✨

Get Started

Install in seconds

Four ways to install. One powerful context layer.

curl

Zero dependencies

curl -fsSL https://useremb.com/install.sh | sh
npm

Node.js CLI

npm install -g remb-cli
Homebrew

macOS & Linux

brew tap useremb/remb && brew install remb
VS Code

Extension

ext install remb.remb

Quick Start Guide

From zero to full AI context in 4 steps.

1

Install the CLI

The Go binary is fastest — single binary, no runtime. Or use npm if you prefer.

curl -fsSL https://useremb.com/install.sh | sh
2

Authenticate

Opens your browser for OAuth login. Creates a scoped API token for CLI and MCP access.

remb login
3

Initialize your project

Connects your GitHub repo, scans the codebase, extracts features and architecture into structured memory.

remb init
4

Connect your AI

Add Remb as an MCP server in Claude Desktop, Cursor, or Windsurf. Context injection is now automatic.

// Add to your MCP client config:
{
  "mcpServers": {
    "remb": {
      "url": "https://mcp.useremb.com/sse"
    }
  }
}

Stop repeating yourself to your AI

Set up Remb once. Every AI conversation from here on starts with full project context.