LiteLLM
ghcr.io/berriai/litellm:main-latest
https://github.com/BerriAI/litellm/pkgs/container/litellm
bridge
bash
false
https://github.com/BerriAI/litellm/issues
https://github.com/BerriAI/litellm
LiteLLM provides a proxy server to manage authentication, load balancing, and spend tracking across 100+ LLMs, all in the OpenAI format.
AI: Tools: Network:Web
http://[IP]:[PORT:4000]/ui
https://raw.githubusercontent.com/Joly0/docker-templates/main/templates/litellm.xml
https://raw.githubusercontent.com/BerriAI/litellm/refs/heads/main/ui/litellm-dashboard/src/app/favicon.ico
--config /app/config.yaml --detailed_debug
Requires a config.yaml to be placed manually at the volume location before the container will start.
An example can be found here:
https://github.com/BerriAI/litellm/blob/main/proxy_server_config.yaml
/mnt/user/appdata/litellm/litellm_config.yaml
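A minimal sketch of what that litellm_config.yaml might contain, following the proxy config format from the LiteLLM repository; the model name and environment-variable reference here are illustrative placeholders, not values required by this template:

```yaml
# Minimal LiteLLM proxy config sketch (placeholder model/key; adapt to your providers)
model_list:
  - model_name: gpt-4o                      # alias clients will request
    litellm_params:
      model: openai/gpt-4o                  # provider/model routed to
      api_key: os.environ/OPENAI_API_KEY    # read key from the container environment
```

Save this as litellm_config.yaml at the volume path above; the container's `--config /app/config.yaml` argument points at it inside the container.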
4000