CVE Alert

CVE-2026-42203


LiteLLM: Server-Side Template Injection in /prompts/test endpoint

CVSS Score: 0.0 (severity unknown)
EPSS Score: 0.0%
EPSS Percentile: 0th

LiteLLM is a proxy server (AI Gateway) to call LLM APIs in OpenAI (or native) format. From version 1.80.5 to before version 1.83.7, the POST /prompts/test endpoint accepted user-supplied prompt templates and rendered them without sandboxing. A crafted template could run arbitrary code inside the LiteLLM Proxy process. The endpoint only checks that the caller presents a valid proxy API key, so any authenticated user could reach it. Depending on how the proxy is deployed, this could expose secrets in the process environment (such as provider API keys or database credentials) and allow commands to be run on the host. This issue has been patched in version 1.83.7.
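To make the class of bug concrete, here is a minimal, self-contained sketch of CWE-1336 (template injection). It does not reproduce LiteLLM's actual code; `ProxyContext`, `render_unsafe`, and `render_safe` are hypothetical names. It uses Python's built-in `str.format` attribute traversal as a stand-in for an unsandboxed template engine, showing how a "prompt template" supplied by an authenticated caller can reach into the process environment, and how a renderer restricted to plain substitution leaves the same payload inert.

```python
import os
import string

os.environ["OPENAI_API_KEY"] = "sk-demo-secret"  # pretend provider secret

class ProxyContext:
    """Hypothetical object a naive renderer exposes to templates."""
    env = os.environ

def render_unsafe(template: str) -> str:
    # str.format walks attribute/index chains on the objects it is given,
    # so a crafted template can reach data the caller never meant to expose.
    return template.format(ctx=ProxyContext())

def render_safe(template: str, **values: str) -> str:
    # string.Template performs plain $name substitution only: no attribute
    # access and no item lookup, so traversal payloads are left untouched.
    return string.Template(template).safe_substitute(values)

# A "prompt template" that exfiltrates a secret instead of formatting text:
payload = "{ctx.env[OPENAI_API_KEY]}"
leaked = render_unsafe(payload)            # -> "sk-demo-secret"

# The same payload is harmless under the restricted renderer:
inert = render_safe(payload, prompt="hi")  # returned unchanged
```

Real template engines (e.g. Jinja2) offer sandboxed modes for exactly this reason; the general fix is to never feed user-controlled strings to an engine that permits attribute traversal or code execution.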

CWE: CWE-1336 (Improper Neutralization of Special Elements Used in a Template Engine)
Vendor: berriai
Product: litellm
Published: May 8, 2026
Last Updated: May 8, 2026

Affected Versions

BerriAI / litellm
>= 1.80.5, < 1.83.7

References

NVD · CVE.org · EPSS Data
GitHub advisory: https://github.com/BerriAI/litellm/security/advisories/GHSA-xqmj-j6mv-4862
GitHub release: https://github.com/BerriAI/litellm/releases/tag/v1.83.7-stable