🔍 CVE Alert

CVE-2026-34756

MEDIUM 6.5

vLLM Affected by Unauthenticated OOM Denial of Service via Unbounded `n` Parameter in OpenAI API Server

CVSS Score
6.5
EPSS Score
0.0%
EPSS Percentile
8th

vLLM is an inference and serving engine for large language models (LLMs). In versions from 0.1.0 up to (but not including) 0.19.0, a Denial of Service vulnerability exists in the vLLM OpenAI-compatible API server. Because the `n` parameter in the `ChatCompletionRequest` and `CompletionRequest` Pydantic models has no upper-bound validation, an unauthenticated attacker can send a single HTTP request with an astronomically large `n` value. This blocks the Python asyncio event loop and causes an immediate Out-Of-Memory crash by allocating millions of request-object copies on the heap before the request even reaches the scheduling queue. This vulnerability is fixed in 0.19.0.
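The fix described by the advisory amounts to bounding `n` at the Pydantic validation layer, so oversized requests are rejected before any per-completion state is allocated. The following is a minimal sketch of that pattern; the field names mirror the advisory, but the cap value of 128 is an illustrative assumption, not the limit vLLM 0.19.0 actually chose.

```python
from pydantic import BaseModel, Field, ValidationError


class CompletionRequest(BaseModel):
    """Sketch of an OpenAI-style completion request with a bounded `n`."""

    prompt: str
    # ge/le constraints make Pydantic reject out-of-range values at parse
    # time, before any memory is allocated for the completions themselves.
    # The cap of 128 is a hypothetical value for illustration.
    n: int = Field(default=1, ge=1, le=128)


# A normal request validates fine.
req = CompletionRequest(prompt="hello", n=4)
print(req.n)  # 4

# An astronomically large `n` is rejected up front instead of triggering
# millions of heap allocations.
try:
    CompletionRequest(prompt="hello", n=10**9)
except ValidationError:
    print("rejected")
```

Because the check runs inside model validation, the event loop never sees the oversized request: it fails with a 422-style validation error rather than an OOM crash.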

CWE CWE-770
Vendor vllm-project
Product vllm
Published Apr 6, 2026
Last Updated Apr 7, 2026

CVSS v3 Breakdown

CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H
Attack Vector
Network
Attack Complexity
Low
Privileges Required
Low
User Interaction
None
Scope
Unchanged
Confidentiality
None
Integrity
None
Availability
High

Affected Versions

vllm-project / vllm
>= 0.1.0, < 0.19.0
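A quick way to check whether an installed vLLM version falls inside the affected range `>= 0.1.0, < 0.19.0` is to compare it with the third-party `packaging` library. This is a sketch for triage only; the authoritative range is the one published in the advisory.

```python
from packaging.version import Version

AFFECTED_MIN = Version("0.1.0")   # inclusive lower bound from the advisory
FIXED = Version("0.19.0")         # first fixed release

def is_affected(version: str) -> bool:
    """Return True if `version` is inside [0.1.0, 0.19.0)."""
    v = Version(version)
    return AFFECTED_MIN <= v < FIXED

print(is_affected("0.18.2"))  # True
print(is_affected("0.19.0"))  # False
```

In practice the installed version can be obtained from `importlib.metadata.version("vllm")` and fed to this check.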

References

NVD ↗ · CVE.org ↗ · EPSS Data ↗
github.com: https://github.com/vllm-project/vllm/security/advisories/GHSA-3mwp-wvh9-7528
github.com: https://github.com/vllm-project/vllm/pull/37952
github.com: https://github.com/vllm-project/vllm/commit/b111f8a61f100fdca08706f41f29ef3548de7380