Description
Title: [ENHANCEMENT] Native Letta Provider Integration
Problem
While Roo Code is my daily driver for all things development, I have been using Letta for a while and absolutely love its rock-solid memory management capabilities. However, there is no native way to bring Letta's stateful agents directly into Roo Code. Using generic OpenAI-compatible endpoints doesn't cut it, as it loses the context of Letta's advanced, independent conversation threads and memory systems.
Context
As a developer who uses Roo Code as my primary interface for agentic tasks, my own use case is the main driver here: I want the best of both worlds. I much prefer Roo Code's UI and tool integration over Letta Code, but I want to seamlessly connect Letta agents (whether hosted at app.letta.com or self-hosted locally) to leverage their long-term memory across my workspaces.
Desired behavior
Add a native "Letta" provider to Roo Code so users can connect directly to any Letta agent.
This integration should include:
- A dedicated Provider settings UI for Letta (Base URL, API key, Agent selector).
- Full support for Letta's Conversations (threads/shared memory).
- A "Conversation Mode" dropdown with three options:
  - Manual Select: a dropdown of existing conversations, plus a "Create New" option.
  - Auto per Workspace: automatically pick or create a conversation based on the current workspace/folder name, and persist it.
  - New per Roo Code Task: auto-create a fresh conversation for each new task.
- Seamless tool execution, streaming where possible, and graceful error handling.
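To make the three conversation modes concrete, here is a minimal sketch of how the selection logic could work. All names here (`LettaConversationMode`, `ResolveContext`, `resolveConversationId`) are illustrative, not existing Roo Code or Letta APIs, and the real implementation would call the Letta API to create conversations:

```typescript
// Hypothetical sketch: deciding which Letta conversation a Roo Code task uses.
type LettaConversationMode = "manual" | "autoPerWorkspace" | "newPerTask";

interface ResolveContext {
  mode: LettaConversationMode;
  manualConversationId?: string; // chosen in the settings dropdown
  workspaceName?: string; // current workspace/folder name
  savedWorkspaceConversations: Record<string, string>; // persisted mapping
  createConversation: (label: string) => string; // would call the Letta API
}

function resolveConversationId(ctx: ResolveContext): string {
  switch (ctx.mode) {
    case "manual":
      if (!ctx.manualConversationId) throw new Error("No conversation selected");
      return ctx.manualConversationId;
    case "autoPerWorkspace": {
      const key = ctx.workspaceName ?? "default";
      // Reuse the persisted conversation for this workspace, or create one.
      return (ctx.savedWorkspaceConversations[key] ??=
        ctx.createConversation(`roo-${key}`));
    }
    case "newPerTask":
      // Always start fresh for each new Roo Code task.
      return ctx.createConversation("roo-task");
  }
}
```

The key design point is that "Auto per Workspace" is the only mode that needs persisted state (the workspace-to-conversation mapping), which Roo Code could store in its existing settings storage.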
Constraints / preferences
The integration should feel exactly like using Letta Code inside Roo Code: zero friction, seamless tools, full state/memory, and no confusion. It should leverage the Letta TypeScript SDK or the REST API's SSE stream to parse steps and tokens effectively.
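If the REST SSE route is taken, the handler needs to split the raw `text/event-stream` body into events before interpreting them. Below is a minimal, self-contained sketch of that framing step; the payload shape inside `data:` is an assumption and should be checked against the Letta API reference:

```typescript
// Minimal SSE frame parser: events are separated by a blank line, and each
// event may carry "event:" and one or more "data:" lines.
interface SseEvent {
  event?: string;
  data: string;
}

function parseSseChunk(chunk: string): SseEvent[] {
  const events: SseEvent[] = [];
  for (const frame of chunk.split("\n\n")) {
    let event: string | undefined;
    const dataLines: string[] = [];
    for (const line of frame.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) dataLines.push(line.slice(5).trim());
    }
    // Frames without data (comments, keep-alives) are skipped.
    if (dataLines.length > 0) events.push({ event, data: dataLines.join("\n") });
  }
  return events;
}
```

A production version would also need to buffer partial frames across network chunks, since an SSE event can be split mid-frame.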
Checklist
- I've searched existing Issues and Discussions for duplicates
- This describes a specific problem with clear context and impact
Acceptance criteria
Given the user configures the Letta provider with their Base URL and API Key
When they select an existing Letta Agent and set the conversation mode to "Auto per Workspace"
Then Roo Code creates or resumes a Letta conversation specific to that workspace
And the agent successfully executes Roo Code tools (like `edit_file` or `run_command`) while maintaining Letta's persistent long-term memory across sessions.
Proposed approach
- Add a `lettaSchema` to `provider-settings.ts` in `@roo-code/types`.
- Build a `LettaHandler` in `src/api/providers/letta.ts` that implements Roo Code's `ApiHandler` interface and handles Letta's streaming responses (step format or token streaming).
- Create `webview-ui/src/components/settings/providers/Letta.tsx` with dynamic fetching for agents and conversations.
- Integrate the new provider into `ApiOptions.tsx` and the core `ClineProvider.ts` message handler.
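The core of the `LettaHandler` would be adapting Letta's stream events into whatever chunk format Roo Code's `ApiHandler` consumers expect. The sketch below shows that adapter step only; both type definitions are illustrative stand-ins, since the real `ApiHandler` chunk types and Letta event schema need to be taken from the respective codebases:

```typescript
// Illustrative stand-in for Letta stream events (step narration, assistant
// text, and usage accounting) — not Letta's documented schema.
type LettaStreamEvent =
  | { kind: "reasoning"; text: string }
  | { kind: "assistant"; text: string }
  | { kind: "usage"; inputTokens: number; outputTokens: number };

// Illustrative stand-in for Roo Code's stream chunk type.
type ApiStreamChunk =
  | { type: "reasoning"; text: string }
  | { type: "text"; text: string }
  | { type: "usage"; inputTokens: number; outputTokens: number };

// Pure mapping from a Letta event to a Roo Code chunk; the handler's async
// generator would yield these as SSE events arrive.
function toApiStreamChunk(e: LettaStreamEvent): ApiStreamChunk {
  switch (e.kind) {
    case "reasoning":
      return { type: "reasoning", text: e.text };
    case "assistant":
      return { type: "text", text: e.text };
    case "usage":
      return {
        type: "usage",
        inputTokens: e.inputTokens,
        outputTokens: e.outputTokens,
      };
  }
}
```

Keeping this mapping pure makes it easy to unit-test independently of the network layer, which helps when Letta's step format evolves.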
Trade-offs / risks
Connecting to Letta requires managing stateful conversations on Letta's end, which is slightly different from the stateless behavior of other LLM providers. Care must be taken to ensure Roo Code's task lifecycle maps correctly to Letta's conversation structure.