A self-learning AI runtime that turns conversations into reusable knowledge.
FAKR is a local-first AI runtime that:
- Tracks design sessions with an LLM across multiple conversations
- Automatically extracts skills (code patterns) and reasoning patterns (thinking styles)
- Builds a growing memory layer that improves over time
- Runs entirely locally with AnythingLLM (Ollama multi-agent backend is WIP for faster training)
Think of it as session memory for AI pair programming — instead of starting from scratch every time, FAKR remembers what worked before and starts from there.
# 1. Clone the repo
git clone https://github.com/LogoASeguir/fakr-cli-framework
cd fakr-cli-framework
# 2. Install dependencies
pip install -r requirements.txt

Open runtime/model_client.py and edit lines 11-13 with your AnythingLLM settings:
API_BASE_URL = "http://localhost:3001/api/v1"
API_KEY = "your-api-key-here"
WORKSPACE_SLUG = "your-workspace-name"
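For context, these three settings are what the runtime needs to reach your AnythingLLM workspace. The snippet below is only an illustrative sketch of such a call; the endpoint path, payload, and response field are assumptions based on AnythingLLM's developer API and may not match what runtime/model_client.py actually does:

```python
import requests

API_BASE_URL = "http://localhost:3001/api/v1"
API_KEY = "your-api-key-here"
WORKSPACE_SLUG = "your-workspace-name"

def chat(message: str) -> str:
    """Send one message to the AnythingLLM workspace and return the reply text."""
    resp = requests.post(
        f"{API_BASE_URL}/workspace/{WORKSPACE_SLUG}/chat",  # assumed endpoint
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"message": message, "mode": "chat"},          # assumed payload shape
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json().get("textResponse", "")              # assumed response field

if __name__ == "__main__":
    print(chat("Hello! Help me build a calculator"))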
Run
python main.py
-Hello! Help me build a calculator # Start a new design
-Instructions # Start the conversation
-[... work through the design ...] # Iterate on the problem
-:freeze calculator_v1 # Save session to memory
-:skills # View learned skills
-:new # Start fresh — FAKR recalls past skills
:freeze [label] # Save current session to memory
:new # Start a new design session
:skills # List learned skills
:skill_show <id> # View skill details
:patterns # List reasoning patterns
:pattern_show <id> # View pattern template
:mpm [n] # View last n memory moments
:embryo # Check self-tuning state
:help # Full command list
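Input starting with ":" is treated as a runtime command rather than a prompt. A minimal sketch of such a dispatcher is shown below; the handler names and `session` object are hypothetical and not taken from FAKR's source:

```python
# Hypothetical command dispatcher: lines starting with ":" are runtime
# commands, everything else is forwarded to the model as a prompt.
def handle_input(line: str, session) -> None:
    if not line.startswith(":"):
        session.send_to_model(line)          # normal conversation turn
        return

    parts = line[1:].split()
    if not parts:
        return
    name, *args = parts
    commands = {
        "freeze": lambda: session.freeze(args[0] if args else None),  # save session
        "new":    session.reset,                                      # start fresh
        "skills": session.list_skills,                                # learned skills
        "help":   session.show_help,                                  # full command list
    }
    handler = commands.get(name)
    if handler is None:
        print(f"Unknown command :{name} (try :help)")
    else:
        handler()
```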
FAKR is organized into five layers:
- Runtime (interaction loop + clocks)
- ModelClient (LLM routing layer)
- Memory System (MPM, SkillStore, PatternStore, ContractStore)
- Temporal Control (ClockState)
- Self-Modulation Core (EmbryoCore)

Simplified data flow:

Runtime (interaction loop + clocks)
        ↓
ModelClient (AnythingLLM wrapper)
        ↓
Memory (Skills / Patterns / MPM / Contracts)
        ↓
EmbryoCore (self-modulation)
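To make the flow concrete, here is a rough sketch of how these layers could be composed. Class and method names are illustrative only, not FAKR's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    """Illustrative shape of a learned skill: a labeled, reusable code pattern."""
    label: str
    pattern: str

@dataclass
class Memory:
    """Stands in for the SkillStore / PatternStore / MPM layer."""
    skills: list[Skill] = field(default_factory=list)

    def freeze(self, label: str, transcript: list[str]) -> None:
        # A real extractor would distill the session into skills and
        # reasoning patterns; this stub just stores the raw transcript.
        self.skills.append(Skill(label=label, pattern="\n".join(transcript)))

class Runtime:
    """Top layer: takes user input and routes it down through the stack."""
    def __init__(self, client, memory: Memory, embryo):
        self.client = client      # ModelClient (e.g. an AnythingLLM wrapper)
        self.memory = memory      # memory layer
        self.embryo = embryo      # self-modulation core
        self.transcript: list[str] = []

    def turn(self, user_input: str) -> str:
        reply = self.client.chat(user_input)    # Runtime -> ModelClient
        self.transcript += [user_input, reply]
        self.embryo.observe(user_input, reply)  # feed the self-modulation core
        return reply

    def freeze(self, label: str) -> None:
        self.memory.freeze(label, self.transcript)  # Runtime -> Memory
```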
The system is modular and designed for experimentation rather than production deployment.
FAKR is experimental and under active refinement: a research-oriented runtime for exploring structured AI interaction patterns, not a production system. Directions under development include:
- Multi-model backend (Ollama integration)
- Embryo meta-learning (remember why things worked)
- Automatic skill recall (use learned skills without prompting)
- Structured <think> block parsing for visible reasoning
- MPM-based long-term memory consolidation
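As one illustration of the structured <think> block parsing listed above, a model reply could be split into visible reasoning and the final answer roughly like this. This is a hypothetical sketch of the idea, not shipped FAKR behavior:

```python
import re

THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(response: str) -> tuple[str, str]:
    """Separate <think>...</think> reasoning from the rest of a model reply."""
    reasoning = "\n".join(m.strip() for m in THINK_RE.findall(response))
    answer = THINK_RE.sub("", response).strip()
    return reasoning, answer

reasoning, answer = split_reasoning(
    "<think>User wants a calculator; start with parsing.</think>Here is a plan..."
)
```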
This project was built with the assistance of AI tools, used as a development accelerator and as a way to better understand processes and reasoning. The goal was not model supremacy but architectural exploration: understanding how structured runtime layers can augment LLM interaction in a transparent, controllable way. Along the way it also served as an introduction to script separation and project architecture.
Built by Renato Pedrosa
Part of a growing ecosystem of personal tools.