
chore(deps): bump litellm[proxy] from 1.35.28 to 1.37.16 in /backend #2403

Closed

Conversation


@dependabot dependabot bot commented on behalf of github May 20, 2024

Bumps litellm[proxy] from 1.35.28 to 1.37.16.
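For context, the commit message below classifies this bump as `version-update:semver-minor`. A minimal stdlib sketch of that classification (this is illustrative only, not Dependabot's actual logic; real tooling would use `packaging.version`):

```python
# Classify the kind of version jump a dependency bump represents.
# Sketch only: assumes plain dotted numeric versions like "1.35.28".

def parse(version: str) -> tuple[int, ...]:
    """Split a dotted version string into integer components."""
    return tuple(int(part) for part in version.split("."))

def update_type(old: str, new: str) -> str:
    """Return 'major', 'minor', or 'patch' for the bump old -> new."""
    o, n = parse(old), parse(new)
    if n[0] != o[0]:
        return "major"
    if n[1] != o[1]:
        return "minor"
    return "patch"

print(update_type("1.35.28", "1.37.16"))  # minor
```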

Release notes

Sourced from litellm[proxy]'s releases.

v1.37.16

What's Changed

Full Changelog: BerriAI/litellm@v1.37.14...v1.37.16

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.37.16
```

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 9 | 10.2880556709407 | 1.5629325106711098 | 1.5629325106711098 | 468 | 468 | 7.436624999968444 | 83.99098699999286 |
| /health/liveliness | Failed ❌ | 8 | 10.80103857402248 | 15.632664706092875 | 15.632664706092875 | 4681 | 4681 | 6.298579000031168 | 1272.475381999982 |
| /health/readiness | Failed ❌ | 8 | 10.780497224867714 | 15.712815091255495 | 15.712815091255495 | 4705 | 4705 | 6.286180000017794 | 650.4576310000232 |
| Aggregated | Failed ❌ | 8 | 10.766867369799249 | 32.90841230801948 | 32.90841230801948 | 9854 | 9854 | 6.286180000017794 | 1272.475381999982 |
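As a quick arithmetic cross-check of the table above, the per-endpoint request and failure counts should sum to the "Aggregated" row (figures copied verbatim from the table; nothing here is new data):

```python
# Cross-check the load-test table: per-endpoint request and failure
# counts should sum to the "Aggregated" row.

rows = {
    "/chat/completions": (468, 468),
    "/health/liveliness": (4681, 4681),
    "/health/readiness": (4705, 4705),
}
aggregated = (9854, 9854)

total_requests = sum(requests for requests, _ in rows.values())
total_failures = sum(failures for _, failures in rows.values())
print(total_requests, total_failures)  # 9854 9854
assert (total_requests, total_failures) == aggregated
```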

v1.37.14

What's Changed

Full Changelog: BerriAI/litellm@v1.37.13...v1.37.14

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
```

... (truncated)

Commits
  • 8d25a7b Merge pull request #3715 from BerriAI/litellm_model_id_fix
  • 5e5179e Merge branch 'main' into litellm_model_id_fix
  • 5d3fe52 test: fix test
  • 1cecdc4 fix(utils.py): fix replicate completion cost calculation
  • c8a1cf6 (ci/cd) run again
  • 7af7610 fix - test num callbacks
  • a75b865 test(test_config.py): fix test
  • 25920a7 bump: version 1.37.15 → 1.37.16
  • 6708a1a ui - new build
  • 60b9bc2 Merge pull request #3714 from BerriAI/litellm_show_max_input_tokens_ui
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
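The `@dependabot ignore …` comment commands above have a persistent equivalent: an ignore condition in the repository's Dependabot config. A hedged sketch of what that might look like for this PR (the `directory` and `schedule` values are assumptions about this repo's setup; verify against the Dependabot configuration docs before committing):

```yaml
# .github/dependabot.yml -- sketch of "@dependabot ignore this minor version"
# expressed as a persistent ignore condition.
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/backend"       # assumed: the path in this PR's title
    schedule:
      interval: "weekly"        # assumed cadence
    ignore:
      - dependency-name: "litellm"
        update-types: ["version-update:semver-minor"]
```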

Bumps [litellm[proxy]](https://github.com/BerriAI/litellm) from 1.35.28 to 1.37.16.
- [Release notes](https://github.com/BerriAI/litellm/releases)
- [Commits](BerriAI/litellm@v1.35.28...v1.37.16)

---
updated-dependencies:
- dependency-name: litellm[proxy]
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels May 20, 2024
@tjbck tjbck closed this May 22, 2024

dependabot bot commented on behalf of github May 22, 2024

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

@dependabot dependabot bot deleted the dependabot/pip/backend/litellm-proxy--1.37.16 branch May 22, 2024 04:29