chore(deps): update dependency litellm to v1.61.15 [security] #146
Conversation
Reviewer's Guide by Sourcery
This pull request updates the litellm dependency to version 1.61.15 to address a security vulnerability (CVE-2024-10188). The update includes several fixes and improvements to the LiteLLM proxy, such as enforcing unique key aliases, Datadog integration enhancements, and bug fixes.
Sequence diagram for key generation with unique alias enforcement
```mermaid
sequenceDiagram
    participant User
    participant LiteLLM Proxy
    participant Database
    User->>LiteLLM Proxy: /key/generate request with alias
    LiteLLM Proxy->>Database: Check if alias already exists
    alt Alias exists
        Database-->>LiteLLM Proxy: Alias already in use
        LiteLLM Proxy-->>User: Error: Alias already exists
    else Alias is unique
        Database-->>LiteLLM Proxy: Alias is unique
        LiteLLM Proxy->>Database: Store new key with alias
        Database-->>LiteLLM Proxy: Key stored successfully
        LiteLLM Proxy-->>User: Key generated successfully
    end
```
Sequence diagram for key update with unique alias enforcement
```mermaid
sequenceDiagram
    participant User
    participant LiteLLM Proxy
    participant Database
    User->>LiteLLM Proxy: /key/update request with alias
    LiteLLM Proxy->>Database: Check if alias already exists
    alt Alias exists and is not owned by the user
        Database-->>LiteLLM Proxy: Alias already in use by another user
        LiteLLM Proxy-->>User: Error: Alias already exists
    else Alias is unique or owned by the user
        Database-->>LiteLLM Proxy: Alias is unique or owned by the user
        LiteLLM Proxy->>Database: Update key with alias
        Database-->>LiteLLM Proxy: Key updated successfully
        LiteLLM Proxy-->>User: Key updated successfully
    end
```
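A hedged sketch of what the two flows above look like from a client, assuming a proxy at localhost:4000 and an illustrative master key; the /key/generate endpoint and the key_alias field come from the diagrams, everything else is a placeholder:

```python
import requests

BASE = "http://localhost:4000"                 # illustrative proxy address
HEADERS = {"Authorization": "Bearer sk-1234"}  # illustrative master key

# First key with this alias is created normally.
r1 = requests.post(f"{BASE}/key/generate", headers=HEADERS,
                   json={"key_alias": "prod-app"})
print(r1.status_code)           # expect 200

# A second key reusing the alias should now be rejected by the proxy.
r2 = requests.post(f"{BASE}/key/generate", headers=HEADERS,
                   json={"key_alias": "prod-app"})
print(r2.status_code, r2.text)  # expect a 4xx "alias already exists" error
```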
Sequence diagram for Datadog logging with StandardLoggingPayload
```mermaid
sequenceDiagram
    participant LiteLLM Proxy
    participant LLM Provider
    participant Datadog
    LiteLLM Proxy->>LLM Provider: LLM Request
    LLM Provider-->>LiteLLM Proxy: LLM Response
    LiteLLM Proxy->>LiteLLM Proxy: Create StandardLoggingPayload
    LiteLLM Proxy->>Datadog: Log StandardLoggingPayload
    alt LLM Failure
        LiteLLM Proxy->>Datadog: Log LLM Failure
    end
```
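A hedged sketch of wiring up the Datadog logging shown above from the litellm SDK side; the credentials and model name are placeholders:

```python
import os
import litellm

# Placeholder credentials; substitute real Datadog values.
os.environ["DD_API_KEY"] = "<datadog-api-key>"
os.environ["DD_SITE"] = "datadoghq.com"

# Route success and failure logs (the StandardLoggingPayload from the
# diagram) to Datadog.
litellm.success_callback = ["datadog"]
litellm.failure_callback = ["datadog"]

response = litellm.completion(
    model="gpt-4o",  # illustrative model
    messages=[{"role": "user", "content": "hello"}],
)
```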
We have skipped reviewing this pull request. Here's why:
- It seems to have been created by a bot (hey, renovate[bot]!). We assume it knows what it's doing!
- We don't review packaging changes - Let us know if you'd like us to change this.
CI Feedback 🧐
(Feedback updated until commit b3422ca)
A test triggered by this PR failed. Here is an AI-generated analysis of the failure:
Force-pushed from cf296ff to 7a10b95
Force-pushed from 7a10b95 to b3422ca
Force-pushed from b3422ca to 577c360
This PR contains the following updates:

| Package | Change |
| --- | --- |
| litellm | `==1.49.0` -> `==1.61.15` |

GitHub Vulnerability Alerts
CVE-2024-10188
A vulnerability in BerriAI/litellm, as of commit 26c03c9, allows unauthenticated users to cause a Denial of Service (DoS) by exploiting the use of ast.literal_eval to parse user input. This function is not safe and is prone to DoS attacks, which can crash the litellm Python server.
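For context on why this is exploitable, the sketch below contrasts the risky pattern with a safer parse; the function names are illustrative, not litellm's code:

```python
import ast
import json

def parse_unsafe(user_input: str):
    # ast.literal_eval recurses over the parse tree, so input such as
    # "[" * 100_000 + "]" * 100_000 can exhaust the stack and crash the
    # whole process -- the DoS described in CVE-2024-10188.
    return ast.literal_eval(user_input)

def parse_safe(user_input: str):
    # json.loads fails with a catchable exception instead of taking the
    # server down.
    try:
        return json.loads(user_input)
    except (json.JSONDecodeError, RecursionError) as exc:
        raise ValueError("rejected malformed payload") from exc
```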
CVE-2025-0628
An improper authorization vulnerability exists in the main-latest version of BerriAI/litellm. When a user with the role 'internal_user_viewer' logs into the application, they are provided with an overly privileged API key. This key can be used to access all the admin functionality of the application, including endpoints such as '/users/list' and '/users/get_users'. This vulnerability allows for privilege escalation within the application, enabling any account to become a PROXY ADMIN.
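As a rough illustration of the missing control (hypothetical code, not litellm's implementation): admin-only routes must verify the caller's role, not merely that the API key is valid.

```python
# Hypothetical sketch of the absent server-side check.
ADMIN_ROUTES = {"/users/list", "/users/get_users"}

def authorize(route: str, role: str) -> None:
    """Refuse admin endpoints to any role below proxy admin."""
    if route in ADMIN_ROUTES and role != "proxy_admin":
        raise PermissionError(f"role {role!r} may not call {route}")

authorize("/users/list", "proxy_admin")            # passes
# authorize("/users/list", "internal_user_viewer") # raises PermissionError
```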
CVE-2024-8984
A Denial of Service (DoS) vulnerability exists in berriai/litellm version v1.44.5. This vulnerability can be exploited by appending characters, such as dashes (-), to the end of a multipart boundary in an HTTP request. The server continuously processes each character, leading to excessive resource consumption and rendering the service unavailable. The issue is unauthenticated and does not require any user interaction, impacting all users of the service.
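A minimal hardening sketch, assuming nothing about litellm's actual fix: RFC 2046 caps multipart boundaries at 70 characters, so oversized boundaries can be rejected before the body is ever parsed.

```python
from email.message import Message

MAX_BOUNDARY_LEN = 70  # RFC 2046 upper bound on boundary length

def boundary_is_sane(content_type: str) -> bool:
    """Check the multipart boundary before handing the body to a parser."""
    msg = Message()
    msg["Content-Type"] = content_type
    boundary = msg.get_param("boundary")
    return boundary is not None and len(str(boundary)) <= MAX_BOUNDARY_LEN

# A request padded with trailing dashes, as in the advisory, fails fast:
assert not boundary_is_sane("multipart/form-data; boundary=x" + "-" * 10000)
assert boundary_is_sane("multipart/form-data; boundary=abc123")
```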
Release Notes
BerriAI/litellm (litellm)
v1.61.7
What's Changed
- `return_citations` documentation by @miraclebakelaser in #8527
- `/bedrock/meta.llama3-3-70b-instruct-v1:0` tool calling support + cost tracking + base llm unit test for tool calling by @ishaan-jaff in #8545
- `/completions` route by @ishaan-jaff in #8551
- `x-litellm-attempted-fallbacks` in responses from litellm proxy by @ishaan-jaff in #8558
New Contributors
Full Changelog: BerriAI/litellm@v1.61.3...v1.61.7
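The `x-litellm-attempted-fallbacks` response header mentioned above can be read from a raw response; a hedged sketch assuming a local proxy, a test key, and an illustrative model name:

```python
import openai

client = openai.OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

# with_raw_response exposes HTTP headers alongside the parsed completion.
raw = client.chat.completions.with_raw_response.create(
    model="gpt-4o",  # illustrative
    messages=[{"role": "user", "content": "hello"}],
)
print(raw.headers.get("x-litellm-attempted-fallbacks"))
completion = raw.parse()  # the usual ChatCompletion object
```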
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.3
What's Changed
- `/models` and `/model_group/info` by @krrishdholakia in #8473
- `include_usage` for /completions requests + unit testing by @ishaan-jaff in #8484
- `PerplexityChatConfig` - track correct OpenAI compatible params by @ishaan-jaff in #8496
- `-nightly` by @krrishdholakia in #8499
- `gemini-2.0-pro-exp-02-05` vertex ai model to cost map + New `bedrock/deepseek_r1/*` route by @krrishdholakia in #8525
Full Changelog: BerriAI/litellm@v1.61.1...v1.61.3
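For the `include_usage` change on /completions, a hedged sketch via the OpenAI client pointed at the proxy (URL, key, and model are illustrative):

```python
import openai

client = openai.OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

stream = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # illustrative
    prompt="Say hi",
    stream=True,
    stream_options={"include_usage": True},  # request usage on final chunk
)
for chunk in stream:
    if chunk.usage is not None:  # populated only on the final chunk
        print(chunk.usage)
```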
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.1
Compare Source
What's Changed
Full Changelog: BerriAI/litellm@v1.61.0...v1.61.1
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.61.0
What's Changed
- `/bedrock/invoke/` by @ishaan-jaff in #8397
- `/team/update`s in multi-instance deployments with Redis by @ishaan-jaff in #8440
New Contributors
Full Changelog: BerriAI/litellm@v1.60.8...v1.61.0
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.8
What's Changed
- `/cache/ping` + add timeout value and elapsed time on azure + http calls by @krrishdholakia in #8377
- `/bedrock/invoke` support for all Anthropic models by @ishaan-jaff in #8383
Full Changelog: BerriAI/litellm@v1.60.6...v1.60.8
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.6
Compare Source
What's Changed
- `choices=[]` by @ishaan-jaff in #8339
- `choices=[]` on llm responses by @ishaan-jaff in #8342
New Contributors
Full Changelog: BerriAI/litellm@v1.60.5...v1.60.6
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.5
Compare Source
What's Changed
- `BaseLLMHTTPHandler` class by @ishaan-jaff in #8290
New Contributors
Full Changelog: BerriAI/litellm@v1.60.4...v1.60.5
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.4
Compare Source
What's Changed
- `bedrock/nova` models + add util `litellm.supports_tool_choice` by @ishaan-jaff in #8264
- `role` based access to proxy by @krrishdholakia in #8260
Full Changelog: BerriAI/litellm@v1.60.2...v1.60.4
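A hedged usage sketch for the new `litellm.supports_tool_choice` util named in the list above; the model string is illustrative:

```python
import litellm

# Consult the model capability map before sending a tool_choice parameter.
if litellm.supports_tool_choice(model="bedrock/us.amazon.nova-pro-v1:0"):
    print("model accepts tool_choice")
```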
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.2
Compare Source
What's Changed
- `sso_user_id` to LiteLLM_UserTable by @krrishdholakia in #8167
- `/vertex_ai/` was not detected as llm_api_route on pass through but `vertex-ai` was by @ishaan-jaff in #8186
- `mode` as list, fix valid keys error in pydantic, add more testing by @krrishdholakia in #8224
New Contributors
Full Changelog: BerriAI/litellm@v1.60.0...v1.60.2
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.60.0
What's Changed
Important Changes between v1.50.xx and v1.60.0
- `def async_log_stream_event` and `def log_stream_event` are no longer supported for `CustomLogger`s (https://docs.litellm.ai/docs/observability/custom_callback). If you want to log stream events, use `def async_log_success_event` and `def log_success_event` for logging success stream events.
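For anyone migrating, a minimal sketch of the replacement hooks following the CustomLogger pattern from the linked docs; the class name is illustrative:

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger

class MyLogger(CustomLogger):  # illustrative name
    # Fires for successful calls, including streamed ones, replacing the
    # removed log_stream_event / async_log_stream_event hooks.
    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        print("success:", response_obj)

    async def async_log_success_event(self, kwargs, response_obj, start_time, end_time):
        print("async success:", response_obj)

litellm.callbacks = [MyLogger()]
```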
Known Issues
🚨 Detected issue with Langfuse Logging when Langfuse credentials are stored in DB
- `bedrock` models + show `end_user` by @ishaan-jaff in #8118
- key `Team.team_alias === "Default Team"` by @ishaan-jaff in #8122
- `LoggingCallbackManager` to append callbacks and ensure no duplicate callbacks are added by @ishaan-jaff in #8112
- `litellm.disable_no_log_param` param by @krrishdholakia in #8134
- `litellm.turn_off_message_logging=True` by @ishaan-jaff in #8156
New Contributors
Full Changelog: BerriAI/litellm@v1.59.10...v1.60.0
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.59.10
Compare Source
What's Changed
- `model` param by @ishaan-jaff in #8105
- `bedrock/converse_like/<model>` route by @krrishdholakia in #8102
Full Changelog: BerriAI/litellm@v1.59.9...v1.59.10
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Load Test LiteLLM Proxy Results
v1.59.9
Compare Source
What's Changed
- `metadata` param preview support + new `x-litellm-timeout` request header by @krrishdholakia in #8047
New Contributors
Full Changelog: BerriAI/litellm@v1.59.8...v1.59.9
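The new `x-litellm-timeout` request header from the list above can be sent per call; a hedged sketch with illustrative proxy URL, key, and model:

```python
import openai

client = openai.OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative
    messages=[{"role": "user", "content": "hello"}],
    extra_headers={"x-litellm-timeout": "30"},  # timeout in seconds
)
```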
Docker Run LiteLLM Proxy
Don't want to maintain your internal proxy? get in touch 🎉
Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
Configuration
📅 Schedule: Branch creation - "" (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Enabled.
♻ Rebasing: Whenever PR is behind base branch, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.