This project aims to provide a fully open-source, LlamaCloud-backed alternative to NotebookLM.
Get the GitHub repository:

```bash
git clone https://github.com/run-llama/notebookllama
```

Install dependencies:
```bash
cd notebookllama/
uv sync
```

Modify the `.env.example` file with your API keys:
- `OPENAI_API_KEY`: find it on the OpenAI Platform
- `ELEVENLABS_API_KEY`: find it in your ElevenLabs Settings
- `LLAMACLOUD_API_KEY`: find it on the LlamaCloud Dashboard
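As a rough sketch, the filled-in file should end up looking something like this (placeholder values only; `.env.example` may define additional variables, which you should keep as they are):

```bash
# Example .env contents — replace the placeholders with your real keys
OPENAI_API_KEY="<your-openai-api-key>"
ELEVENLABS_API_KEY="<your-elevenlabs-api-key>"
LLAMACLOUD_API_KEY="<your-llamacloud-api-key>"
```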
Rename the file to `.env`:

```bash
mv .env.example .env
```

Now, you will have to execute the following scripts:
```bash
uv run tools/create_llama_extract_agent.py
uv run tools/create_llama_cloud_index.py
```

And you're ready to set up the app!
Launch Postgres and Jaeger:

```bash
docker compose up -d
```
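If you want to confirm that the containers came up correctly, the standard Docker Compose commands work here (service names and health details depend on the compose file shipped with the repository):

```bash
# Show the status of the services started above
docker compose ps

# Follow the logs if a service looks unhealthy
docker compose logs -f
```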
Run the MCP server:

```bash
uv run src/notebookllama/server.py
```

Now, launch the Streamlit app:
```bash
streamlit run src/notebookllama/Home.py
```

**Important**: you might need to install `ffmpeg` if you do not have it installed already.
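If `ffmpeg` is missing, installing it through your system package manager is usually enough; for example:

```bash
# macOS (Homebrew)
brew install ffmpeg

# Debian/Ubuntu
sudo apt-get update && sudo apt-get install -y ffmpeg
```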
And start exploring the app at http://localhost:8751/.
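When you are done, stop the Streamlit app and the MCP server with `Ctrl+C`, and shut down the local services started earlier with a standard Docker Compose teardown (whether any data persists depends on the volumes defined in the repository's compose file):

```bash
docker compose down
```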
To contribute to this project, please follow the contribution guidelines.

This project is provided under the MIT License.