CVE-2026-22778

Severity: Critical (CVSS 9.8)


vLLM is an inference and serving engine for large language models (LLMs). From version 0.8.3 up to (but not including) 0.14.1, when an invalid image is sent to vLLM's multimodal endpoint, PIL throws an error, and vLLM returns that error message to the client, leaking a heap address. With this leak, the ASLR search space drops from roughly 4 billion guesses to about 8. This vulnerability can be chained with a heap overflow in the JPEG2000 decoder in OpenCV/FFmpeg to achieve remote code execution. It is fixed in 0.14.1.
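The information-leak pattern described above can be illustrated with a minimal sketch. This is not vLLM's actual code; the handler, decoder class, and error message below are hypothetical stand-ins. The point is that CPython's default object repr embeds `id(obj)`, a heap address, so echoing an internal exception message verbatim to the client can disclose a pointer.

```python
# Minimal sketch (hypothetical code, not vLLM's implementation) of how
# returning a raw exception message to the client can leak a heap address:
# Python's default repr for objects includes id(obj) as a hex pointer.

class FakeDecoder:
    """Stand-in for an internal image object; its default repr
    looks like '<... object at 0x7f3a2c1b9d60>'."""
    pass

def handle_upload(data: bytes) -> str:
    """Hypothetical multimodal endpoint handler."""
    try:
        if not data.startswith(b"\xff\xd8"):  # pretend-validate a JPEG header
            obj = FakeDecoder()
            # Simulates an error message that interpolates an object repr;
            # the embedded '0x...' value is a heap pointer that should
            # never reach the client.
            raise ValueError(f"cannot identify image file {obj!r}")
        return "ok"
    except ValueError as e:
        # Anti-pattern: echoing the internal error verbatim to the client.
        return f"error: {e}"

print(handle_upload(b"not an image"))  # response contains a '0x...' address
```

A safer design returns a generic message ("invalid image") to the client and logs the detailed exception server-side only, which is why this class of bug maps to sensitive-information-exposure CWEs.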
Vendor
vLLM
Product
vLLM
CWE
CWE-532
Publication Date
2026-02-02 23:16:06
Last Updated
2026-02-23 18:19:12
Source Identifier
security-advisories@github.com
KEV Date Added
-
