A Framework for AI Agents by Akrasia Labs
Liz is a lightweight framework for building AI agents, reimagined for developers who demand power and simplicity. Inspired by Eliza from AI16Z, but rebuilt from the ground up with a focus on developer experience and control.
- Direct LLM Control: Full access to prompts and model interactions
- Zero Magic: Minimal abstractions for maximum understanding
- Ultimate Flexibility: Build exactly what you need, how you need it
1. **Clone and Install**

   ```bash
   git clone <your-repo>
   cd liz
   pnpm install
   ```

2. **Configure Environment**

   ```bash
   cp .env.example .env
   ```

   Required environment variables:

   ```bash
   # Database - Choose one:
   DATABASE_URL="postgresql://user:password@localhost:5432/dbname"
   # or for SQLite:
   DATABASE_URL="file:./prisma/dev.db"

   # LLM APIs
   OPENAI_API_KEY="your-openai-key"
   OPENROUTER_API_KEY="your-openrouter-key"

   # Application
   APP_URL="http://localhost:3000"
   ```

3. **Initialize Database**

   ```bash
   npm run init-db
   ```

4. **Start Development**

   ```bash
   npm run dev
   ```
We use Express-style middleware for a clear, linear processing flow:
```typescript
// Example middleware setup
const framework = new AgentFramework();

// Add standard middleware
framework.use(validateInput);
framework.use(loadMemories);
framework.use(wrapContext);
framework.use(createMemoryFromInput);
framework.use(router);
```
```typescript
const myAgent: Character = {
  name: "Assistant",
  agentId: "unique_id",
  system: "You are a helpful assistant.",
  bio: ["Your agent's backstory"],
  lore: ["Additional background"],
  messageExamples: [], // Example conversations
  postExamples: [],    // Example social posts
  topics: ["expertise1", "expertise2"],
  style: {
    all: ["consistent", "helpful"],
    chat: ["conversational"],
    post: ["engaging"],
  },
  adjectives: ["friendly", "knowledgeable"],
};
```
```typescript
const agent = new BaseAgent(myAgent);

agent.addRoute({
  name: "conversation",
  description: "Handle natural conversation",
  handler: async (context, req, res) => {
    const response = await llmUtils.getTextFromLLM(
      context,
      "anthropic/claude-3-sonnet"
    );
    await res.send(response);
  },
});
```
```typescript
const twitter = new TwitterClient(agent, {
  username: process.env.TWITTER_USERNAME,
  password: process.env.TWITTER_PASSWORD,
  email: process.env.TWITTER_EMAIL,
  retryLimit: 3,
  postIntervalHours: 4,
  pollingInterval: 5, // minutes
  dryRun: false,
});

await twitter.start();
```
The memory system uses Prisma with SQLite (or PostgreSQL) to maintain conversation context:
```typescript
interface Memory {
  id: string;
  userId: string;
  agentId: string;
  roomId: string;
  content: any;
  type: string;
  generator: string; // "external" or "llm"
  createdAt: Date;
}
```
Key features:
- Automatic context loading for each request
- Memory creation for both user inputs and agent responses
- Indexed by room, user, and agent IDs
- Configurable memory limits and types
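The indexed lookup and configurable limit described above can be sketched with a simple in-memory store. This is an illustrative sketch only (the real framework persists memories through Prisma); the `MemoryStore` class and `recall` method are hypothetical names, not framework APIs.

```typescript
// Hypothetical in-memory sketch of the recall pattern; the actual
// framework queries these records via Prisma.
interface Memory {
  id: string;
  userId: string;
  agentId: string;
  roomId: string;
  content: unknown;
  type: string;
  generator: string; // "external" or "llm"
  createdAt: Date;
}

class MemoryStore {
  private memories: Memory[] = [];

  add(memory: Memory): void {
    this.memories.push(memory);
  }

  // Recall the most recent memories for a room, newest first,
  // capped by a configurable limit.
  recall(roomId: string, limit = 20): Memory[] {
    return this.memories
      .filter((m) => m.roomId === roomId)
      .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime())
      .slice(0, limit);
  }
}
```

In the real system the same filter and limit would be expressed as a database query over the indexed `roomId`, `userId`, and `agentId` columns.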
Supports multiple LLM providers through a unified interface:
```typescript
const llmUtils = new LLMUtils();

// Text generation
const response = await llmUtils.getTextFromLLM(
  prompt,
  "anthropic/claude-3-sonnet"
);

// Structured output
const result = await llmUtils.getObjectFromLLM(prompt, schema, LLMSize.LARGE);

// Image analysis
const description = await llmUtils.getImageDescriptions(imageUrls);
```
- Automated Posting: Configurable intervals for regular content
- Mention Monitoring: Real-time interaction handling
- Thread Management: Automatic thread building and response chaining
- Rate Limiting: Built-in rate limiting and retry mechanisms
- Memory Integration: Conversational context across interactions
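The built-in retry behavior can be pictured as exponential backoff wrapped around each API call. The `withRetry` helper below is an illustrative sketch under that assumption, not the client's actual API; only the `retryLimit` concept comes from the configuration shown above.

```typescript
// Hypothetical sketch of retry-with-backoff around a Twitter API call;
// withRetry and its parameters are illustrative, not framework APIs.
async function withRetry<T>(
  fn: () => Promise<T>,
  retryLimit = 3,
  baseDelayMs = 1000
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < retryLimit; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < retryLimit - 1) {
        // Exponential backoff: baseDelayMs, 2x, 4x, ...
        const delay = baseDelayMs * 2 ** attempt;
        await new Promise((resolve) => setTimeout(resolve, delay));
      }
    }
  }
  throw lastError;
}
```

Backing off exponentially keeps a flaky endpoint from being hammered while still recovering quickly from transient failures.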
```bash
# Build and run with Docker Compose
docker-compose up --build
```
Environment configuration in docker-compose.yml:
```yaml
services:
  app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - TEE_MODE=DOCKER
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}
      - TWITTER_USERNAME=${TWITTER_USERNAME}
      - TWITTER_PASSWORD=${TWITTER_PASSWORD}
      - TWITTER_EMAIL=${TWITTER_EMAIL}
    volumes:
      - ./prisma/dev.db:/app/prisma/dev.db
```
```
src/
├── middleware/              # Pipeline steps
│   ├── validate-input.ts    # Input validation
│   ├── load-memories.ts     # Context loading
│   ├── wrap-context.ts      # Request wrapping
│   ├── create-memory.ts     # Memory creation
│   └── router.ts            # Route handling
├── agent/                   # Core agent logic
├── framework/               # Express-style system
├── types/                   # TypeScript definitions
├── utils/                   # Helper functions
│   ├── llm.ts               # LLM interactions
│   ├── memory.ts            # Memory management
│   ├── db.ts                # Database utilities
│   └── initDb.ts            # DB initialization
└── example/                 # Implementation examples

clients/
└── twitter/                 # Twitter integration
    ├── client.js            # Main client class
    ├── base.js              # Core functionality
    └── utils.js             # Helper functions
```
```bash
# Build the project
npm run build

# Start production
npm start

# Development with auto-reload
npm run dev

# Test Twitter integration
npm run twitter

# Database management
npm run db:init        # Initialize database
npm run db:reset       # Reset database
npm run prisma:studio  # Database UI
```
We believe the best way to build AI agents is to work closely with the prompts and build a set of composable units that can be strung together to make powerful agentic loops. Our approach is informed by Anthropic's research on constructing reliable AI systems.
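The idea of stringing composable units into an agentic loop can be sketched as a chain of steps run until a stop condition holds. Everything here (`Step`, `runLoop`, the stop predicate) is a hypothetical illustration of the pattern, not a framework API.

```typescript
// Hypothetical sketch of an agentic loop built from composable units.
type Step = (state: string) => Promise<string>;

// Run a chain of steps repeatedly until the stop condition holds
// or an iteration cap is reached (to guard against runaway loops).
async function runLoop(
  steps: Step[],
  input: string,
  done: (state: string) => boolean,
  maxIterations = 5
): Promise<string> {
  let state = input;
  for (let i = 0; i < maxIterations && !done(state); i++) {
    for (const step of steps) {
      state = await step(state);
    }
  }
  return state;
}
```

In practice each `Step` would be a prompt-driven unit (plan, call a tool, reflect), and the stop condition would inspect the accumulated state rather than a string.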
While Liz is meant to be forked and modified, we welcome contributions to the base template:
1. Fork the repository
2. Create your feature branch
3. Commit your changes
4. Push to the branch
5. Open a Pull Request
MIT
Visit akrasia.ai/liz to learn more.