[ENHANCEMENT] Native Letta Provider Integration #11735

@Jdo300

Description


Problem

While Roo Code is my daily driver for all things development, I have been using Letta for a while and absolutely love its rock-solid memory management capabilities. However, there is no native way to bring Letta's stateful agents directly into Roo Code. Using a generic OpenAI-compatible endpoint doesn't cut it: it loses the context of Letta's advanced, independent conversation threads and memory systems.

Context

As a developer who uses Roo Code as my primary interface for agentic tasks, my own use case is the main driver here: I want the best of both worlds. I far prefer Roo Code's UI and tool integration to Letta Code, but I want to seamlessly connect Letta agents (whether hosted at app.letta.com or self-hosted locally) so their long-term memory carries across my workspaces.

Desired behavior

Add a native "Letta" provider to Roo Code so users can connect directly to any Letta agent.
This integration should include:

  • A dedicated Provider settings UI for Letta (Base URL, API key, Agent selector).
  • Full support for Letta's Conversations (threads/shared memory).
  • A "Conversation Mode" dropdown with three options:
    1. Manual Select: Dropdown of existing conversations + "Create New".
    2. Auto per Workspace: Automatically pick/create a conversation based on the current workspace/folder name and persist it.
    3. New per Roo Code Task: Auto-create a fresh conversation for each new task.
  • Seamless tool execution, streaming where possible, and graceful error handling.
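To make the "Conversation Mode" behavior concrete, here is a minimal sketch of how the three modes could map to a conversation lookup key. All names (`ConversationMode`, `resolveConversationKey`, the key formats) are illustrative, not part of Roo Code or the Letta SDK:

```typescript
// Hypothetical sketch: the three conversation modes and how a provider
// might resolve which Letta conversation a request belongs to.
type ConversationMode = "manual" | "auto-per-workspace" | "new-per-task";

interface ResolveInput {
  mode: ConversationMode;
  manualConversationId?: string; // chosen in the settings dropdown
  workspaceName: string;         // current workspace/folder name
  taskId: string;                // Roo Code task identifier
}

// Returns a stable key used to look up (or create) a Letta conversation.
function resolveConversationKey(input: ResolveInput): string {
  switch (input.mode) {
    case "manual":
      if (!input.manualConversationId) {
        throw new Error("Manual mode requires a selected conversation");
      }
      return input.manualConversationId;
    case "auto-per-workspace":
      // One persistent conversation per workspace folder.
      return `workspace:${input.workspaceName}`;
    case "new-per-task":
      // Fresh conversation for every Roo Code task.
      return `task:${input.taskId}`;
  }
}
```

The point of a single pure resolver is that the settings UI, the handler, and any persistence layer all agree on which conversation a given task should use.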

Constraints / preferences

The integration should feel exactly like using Letta Code inside Roo Code: zero friction, seamless tools, full state/memory, and no confusion. It should use the Letta TypeScript SDK, or the REST API's SSE stream, to parse step and token events effectively.
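If the REST SSE route is taken, the handler needs to split a raw byte stream into complete `data:` frames while tolerating frames cut mid-chunk. A dependency-free sketch (the JSON payload shape inside `data:` is an assumption; the real event format should come from the Letta docs):

```typescript
// Hypothetical sketch of buffering Letta-style SSE frames into events.
interface SseEvent {
  data: string;
}

// Splits a raw SSE buffer into complete events, returning any trailing
// partial frame so the caller can prepend it to the next network chunk.
function parseSseChunk(buffer: string): { events: SseEvent[]; rest: string } {
  const events: SseEvent[] = [];
  const frames = buffer.split("\n\n");
  const rest = frames.pop() ?? ""; // last piece may be an incomplete frame
  for (const frame of frames) {
    const dataLines = frame
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice(5).trim());
    if (dataLines.length > 0) {
      events.push({ data: dataLines.join("\n") });
    }
  }
  return { events, rest };
}
```

Carrying `rest` forward between chunks is what makes token streaming robust when an event straddles two reads.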

Checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear context and impact

Acceptance criteria

Given the user configures the Letta provider with their Base URL and API Key
When they select an existing Letta Agent and set the conversation mode to "Auto per Workspace"
Then Roo Code creates or resumes a Letta conversation specific to that workspace
And the agent successfully executes Roo Code tools (like edit_file or run_command) while maintaining Letta's persistent long-term memory across sessions.

Proposed approach

  1. Add lettaSchema to @roo-code/types provider-settings.ts.
  2. Build a LettaHandler in src/api/providers/letta.ts that implements Roo Code's ApiHandler interface and handles Letta's streaming responses (step format or token streaming).
  3. Create webview-ui/src/components/settings/providers/Letta.tsx with dynamic fetching for agents and conversations.
  4. Integrate the new provider into ApiOptions.tsx and the core ClineProvider.ts message handler.
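For step 1, the settings shape might look like the sketch below. In `@roo-code/types` this would likely be expressed with the same schema library the other providers use; a plain interface plus a small validator keeps the sketch dependency-free. All field names are illustrative:

```typescript
// Hypothetical sketch of the settings behind a lettaSchema.
interface LettaProviderSettings {
  lettaBaseUrl?: string;        // e.g. app.letta.com or a self-hosted URL
  lettaApiKey?: string;
  lettaAgentId?: string;
  lettaConversationMode?: "manual" | "auto-per-workspace" | "new-per-task";
  lettaConversationId?: string; // only meaningful in manual mode
}

// Minimal validation: base URL must be http(s), and manual mode
// must carry an explicit conversation id.
function validateLettaSettings(s: LettaProviderSettings): string[] {
  const errors: string[] = [];
  if (s.lettaBaseUrl && !/^https?:\/\//.test(s.lettaBaseUrl)) {
    errors.push("lettaBaseUrl must be an http(s) URL");
  }
  if (s.lettaConversationMode === "manual" && !s.lettaConversationId) {
    errors.push("manual mode requires lettaConversationId");
  }
  return errors;
}
```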

Trade-offs / risks

Connecting to Letta requires managing stateful conversations on Letta's end, which is slightly different from the stateless behavior of other LLM providers. Care must be taken to ensure Roo Code's task lifecycle maps correctly to Letta's conversation structure.
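One way to contain that statefulness is a small get-or-create cache keyed by the resolved conversation, so repeated tasks in "Auto per Workspace" mode resume rather than fork. This is a sketch under assumptions: `create` stands in for whatever real Letta SDK/REST call creates a conversation, and the in-memory map would need to be backed by persisted workspace state:

```typescript
// Hypothetical sketch of conversation bookkeeping for the provider.
type CreateFn = () => Promise<string>;

class ConversationStore {
  private byKey = new Map<string, string>();

  // Returns the persisted conversation id for this key, creating one
  // via the supplied callback only on first use.
  async getOrCreate(key: string, create: CreateFn): Promise<string> {
    const existing = this.byKey.get(key);
    if (existing) return existing; // resume the existing conversation
    const id = await create();     // e.g. the Letta "create conversation" call
    this.byKey.set(key, id);
    return id;
  }
}
```

Centralizing create-vs-resume in one place keeps the task lifecycle mapping auditable: "New per Roo Code Task" simply passes a key that is never reused.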

Metadata

Labels: Enhancement (New feature or request)