A proxy worker for using Ollama in Cursor: a simple server that forwards requests to the Ollama server and returns the responses.
When you use LLM prediction in the Cursor editor, the editor sends the data to the official Cursor server, and that server sends the data on to the model endpoint. Because of this indirection, even if the endpoint is set to localhost in the Cursor editor configuration, the Cursor server cannot reach your local machine. A proxy worker with a publicly reachable address is therefore needed to forward the data to the local Ollama server.
Requirements:

- Rust 1.75+ (install via `rustup`)
- Ollama server
- Cloudflared (optional, for a public access tunnel)
Installation:

```bash
cargo install --git https://github.com/lekoOwO/curxy-rs
```
Or build from source:
```bash
git clone https://github.com/lekoOwO/curxy-rs
cd curxy-rs
cargo build --release
```
Usage:

- Launch the Ollama server.
- Launch curxy:

  ```bash
  curxy
  ```

  If you want to limit access to the Ollama server, set the `OPENAI_API_KEY` environment variable:

  ```bash
  OPENAI_API_KEY=your_openai_api_key curxy
  ```

  If you want to use cloudflared for public access:

  ```bash
  curxy --cloudflared-path /path/to/cloudflared
  ```

  You will see output like this:

  ```
  Starting server on 127.0.0.1:62192
  https://remaining-chen-composition-dressed.trycloudflare.com
  ```

- Enter the URL provided by curxy (with `/v1` appended) into the "Override OpenAI Base URL" section of the Cursor editor configuration; you can verify the endpoint with the curl sketch after these steps.
- Add the model names you want to use to the "Model Names" section of the Cursor editor configuration.
- (Optional) If you want to restrict access to this proxy server for security reasons, set the `OPENAI_API_KEY` environment variable, which enables access restrictions based on the key.
- Enjoy!
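To confirm the proxy is reachable, a quick curl check can help. This is a minimal sketch, assuming the proxy exposes the OpenAI-compatible `/v1/chat/completions` route behind the URL it prints and that a configured `OPENAI_API_KEY` is checked as a Bearer token; the tunnel URL and model name below are placeholders to replace with your own values.

```bash
# Minimal verification sketch (assumptions: OpenAI-compatible route, Bearer
# auth when OPENAI_API_KEY is set). Replace the URL with the one curxy
# printed and the model with one pulled into your local Ollama instance.
curl https://remaining-chen-composition-dressed.trycloudflare.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your_openai_api_key" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```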
You can also see the help message by running `curxy --help`.
Options:

```
-e, --endpoint <URL>           Ollama server endpoint [default: http://localhost:11434]
-o, --openai-endpoint <URL>    OpenAI server endpoint [default: https://api.openai.com]
-p, --port <PORT>              Server port [default: random]
    --hostname <HOSTNAME>      Server hostname [default: 127.0.0.1]
    --cloudflared-path <PATH>  Path to cloudflared executable
-h, --help                     Show help message
```
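For example, combining the documented flags, a fixed port and an explicit Ollama endpoint could be set like this (the values are illustrative, not recommendations):

```bash
# Illustrative invocation using only the flags documented above.
curxy --endpoint http://localhost:11434 --port 8800 --hostname 127.0.0.1
```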
Features:

- 🚀 Written in Rust for excellent performance
- 🔒 Optional API key authentication
- 🌐 Support for cloudflared tunneling
- 🔄 Smart request routing (see the sketch after this list)
- 🛠 Comprehensive error handling
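"Smart request routing" presumably means deciding, per request, whether to forward to the local Ollama endpoint or to the real OpenAI endpoint (both are configurable via the options above). The sketch below is not curxy-rs's actual logic; it only illustrates one plausible policy, routing by model-name prefix, with the hypothetical `gpt-*`/`o1*` match chosen purely for illustration.

```bash
# Illustrative routing policy only, NOT the project's implementation:
# OpenAI-style model names go to the OpenAI endpoint, everything else
# goes to the local Ollama endpoint (defaults from the options table).
route_for_model() {
  case "$1" in
    gpt-*|o1*) echo "https://api.openai.com" ;;
    *)         echo "http://localhost:11434" ;;
  esac
}

route_for_model "gpt-4o"   # prints https://api.openai.com
route_for_model "llama3.2" # prints http://localhost:11434
```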
Development:

```bash
# Run tests
cargo test

# Run the development version
cargo run

# Build a release version
cargo build --release
```
License: MIT
Acknowledgements: This project is inspired by curxy.