- Core Architecture and Licensing Clarity
- Getting started
- Advanced configuration
- Documentation
- Contributing
- License
- Contacts
Fred is both:
- An innovation lab — to help developers rapidly explore agentic patterns, domain-specific logic, and custom tools.
- A production-ready platform — already integrated with real enterprise constraints: auth, security, document lifecycle, and deployment best practices.
It is composed of:
- a Python agentic backend (FastAPI + LangGraph)
- a Python knowledge flow backend (FastAPI) for document ingestion and vector search
- a React frontend
Fred is not a framework, but a full reference implementation that shows how to build practical multi-agent applications with LangChain and LangGraph. Agents cooperate to answer technical, context-aware questions.
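Fred's actual graphs live in the agentic backend, but the cooperation pattern itself is small. As a minimal, library-free Python sketch of agents passing shared state (agent names and the state shape are invented here for illustration — this is not Fred's API):

```python
from dataclasses import dataclass, field

@dataclass
class State:
    """Shared state handed from agent to agent (shape invented for this sketch)."""
    question: str
    notes: list = field(default_factory=list)
    answer: str = ""

def retriever_agent(state: State) -> State:
    # Stand-in for a real retrieval step (e.g. vector search in Knowledge Flow).
    state.notes.append(f"context for: {state.question}")
    return state

def responder_agent(state: State) -> State:
    # Composes the final answer from whatever earlier agents collected.
    state.answer = "; ".join(state.notes)
    return state

def run_graph(question: str) -> str:
    """Run the two agents in sequence, like a tiny linear graph."""
    state = State(question=question)
    for agent in (retriever_agent, responder_agent):
        state = agent(state)
    return state.answer
```

In LangGraph the same idea becomes nodes and edges on a `StateGraph`, but the contract — each agent reads and enriches a shared state — is the same.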
See the project site: https://fredk8.dev
The three components just described form the entirety of the Fred platform. They are self-contained and do not require any external dependencies such as MinIO, OpenSearch, or Weaviate.
Instead, Fred is designed with a modular architecture that allows optional integration with these technologies. By default, a minimal Fred deployment can use just the local filesystem for all storage needs.
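That modularity usually boils down to a small storage interface with a filesystem default, so a MinIO- or OpenSearch-backed implementation can be swapped in without touching callers. A hedged sketch of the pattern (class names invented, not Fred's actual code):

```python
from abc import ABC, abstractmethod
from pathlib import Path

class BlobStore(ABC):
    """Minimal storage interface: callers never know which backend is in use."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class LocalFileStore(BlobStore):
    """Default backend: plain files on the local filesystem."""

    def __init__(self, root: str):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        (self.root / key).write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()

# An S3/MinIO-backed BlobStore would implement the same two methods,
# selected by configuration rather than by code changes.
```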
Fred is released under the Apache License 2.0. It does *not* embed or depend on any LGPLv3 or copyleft-licensed components. Optional integrations (like OpenSearch or Weaviate) are configured externally and do not contaminate Fred's licensing. This ensures maximum freedom and clarity for commercial and internal use.
In short: Fred is 100% Apache 2.0, and you stay in full control of any additional components.
Fred works out of the box when you provide one secret — your OpenAI API key.
Defaults:
- Keycloak is bypassed by a mock `admin/admin` user
- All data (metrics, conversations, uploads) is stored on the local filesystem
- No external services are required
Production services and databases can be added later or via the deployment factory repository.
| Tool | Version | Install hint |
|---|---|---|
| Python | 3.12.8 | `pyenv install 3.12.8` |
| Node | 22.13.0 | `nvm install 22.13.0` |
| Make | any | install from your OS |
```sh
git clone https://github.com/ThalesGroup/fred.git
cd fred
echo "OPENAI_API_KEY=sk-..." | tee agentic_backend/config/.env knowledge_flow_backend/config/.env
```

(Bash does not brace-expand redirection targets, so a single `>` redirect cannot write both files at once; `tee` writes the same line to each backend's `config/.env`.)
```sh
# Terminal 1 – agentic backend
cd agentic_backend && make run

# Terminal 2 – knowledge flow backend
cd knowledge_flow_backend && make run

# Terminal 3 – frontend
cd frontend && make run
```
Open http://localhost:5173 in your browser.
To get full VS Code Python support (linting, IntelliSense, debugging, etc.) across our repo, we provide:
- A VS Code workspace file, `fred.code-workspace`, that loads all sub‑projects.
- Per‑folder `.vscode/settings.json` files in each Python backend to pin the interpreter.
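The workspace file itself is not reproduced here; a minimal sketch of what such a `fred.code-workspace` could contain (the folder paths are assumptions based on the repository layout, not the committed file) is:

```json
{
  "folders": [
    { "path": "." },
    { "path": "agentic_backend" },
    { "path": "knowledge_flow_backend" },
    { "path": "frontend" }
  ]
}
```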
- Visual Studio Code
- VS Code extensions:
- Python (ms-python.python)
- Pylance (ms-python.vscode-pylance)
- Open the workspace
After cloning the repo, you can open Fred's VS Code workspace with `code fred.code-workspace`.
When you open Fred's VS Code workspace, VS Code will load four folders:
- fred – for any repo‑wide scripts
- agentic_backend – first Python backend
- knowledge_flow_backend – second Python backend
- frontend – UI
- Per‑folder Python interpreters
Each backend ships its own virtual environment under `.venv`. We've added a per‑folder VS Code setting (see for instance `agentic_backend/.vscode/settings.json`) to pick it automatically.
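As a sketch, such a per-folder `settings.json` is roughly of this shape (assuming each backend's venv lives in `.venv/`; check the committed file for the exact keys):

```json
{
  "python.defaultInterpreterPath": "${workspaceFolder}/.venv/bin/python"
}
```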
This ensures that as soon as you open a Python file under agentic_backend/ (or knowledge_flow_backend/), VS Code will:
- Activate that folder’s virtual environment
- Provide linting, IntelliSense, formatting, and debugging using the correct Python
If you prefer a fully containerised IDE with all dependencies running:
- Install Docker, VS Code (or an equivalent IDE that supports Dev Containers), and the Dev Containers extension.
- Create `~/.fred/openai-api-key.env` containing `OPENAI_API_KEY=sk-…`.
- In VS Code, press F1 → Dev Containers: Reopen in Container.

The Dev Container starts the `devcontainer` service plus Postgres, OpenSearch, and MinIO. Ports 8000 (backend) and 5173 (frontend) are forwarded automatically.
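A Dev Container definition matching that description would look roughly like the sketch below (this is not the repository's actual file; the compose file name is an assumption, only `service` and the forwarded ports come from the text above):

```json
{
  "name": "fred",
  "dockerComposeFile": "docker-compose.yml",
  "service": "devcontainer",
  "forwardPorts": [8000, 5173]
}
```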
Inside the container, start the servers:
```sh
# Terminal 1 – agentic backend
cd agentic_backend && make run

# Terminal 2 – knowledge flow backend
cd knowledge_flow_backend && make run

# Terminal 3 – frontend
cd frontend && make run
```
| Provider | How to enable |
|---|---|
| OpenAI (default) | Add `OPENAI_API_KEY` to `config/.env` |
| Azure OpenAI | Add `AZURE_OPENAI_API_KEY` and endpoint variables; adjust `configuration.yaml` |
| Ollama (local models) | Set `OLLAMA_BASE_URL` and the model name in `configuration.yaml` |
See `agentic_backend/config/configuration.yaml` (section `ai:`) for concrete examples.
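Purely as an illustration of where provider settings live — the key names below are invented, the real schema is in `configuration.yaml` itself — the section might look like:

```yaml
# Hypothetical sketch; key names are illustrative, not Fred's actual schema.
ai:
  provider: openai   # or azure-openai / ollama
  model: gpt-4o      # model name passed to the provider
```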
| File | Purpose | Tip |
|---|---|---|
| `agentic_backend/config/.env` | Secrets (API keys, passwords). Not committed to Git. | Copy `.env.template` to `.env`, then fill in any missing values. |
| `knowledge_flow_backend/config/.env` | Same as above | Same as above |
| `agentic_backend/config/configuration.yaml` | Functional settings (providers, agents, feature flags). | – |
| `knowledge_flow_backend/config/configuration.yaml` | Same as above | – |
| Component | Location | Role |
|---|---|---|
| Frontend UI | `./frontend` | React-based chatbot |
| Agentic backend | `./agentic_backend` | Multi-agent API server |
| Knowledge Flow backend | `./knowledge_flow_backend` | Optional knowledge management component (document ingestion & co.) |
- Enable Keycloak or another OIDC provider for authentication
- Persist metrics and files in OpenSearch and MinIO
- Main docs: https://fredk8.dev/docs
- Agentic backend README
- Frontend README
- Knowledge Flow backend README
- Code of Conduct
- License
- Security
- Python Coding Guide
- Contributing
We welcome pull requests and issues. Start with the Contributing guide.
Apache 2.0 — see LICENSE