CVE-2024-32878

low-risk
Published 2024-04-26

llama.cpp is an LLM inference engine in C/C++. There is a use-of-uninitialized-heap-variable vulnerability in gguf_init_from_file: the code later frees this uninitialized value. In a simple PoC, this directly causes a crash; if the file is carefully constructed, it may be possible to control the uninitialized value and trigger an arbitrary-address free, which could be exploited further. The impact is a llama.cpp crash (DoS) and potentially arbitrary code execution (RCE). This vulnerability has been patched in commit b2740.

Do I need to act?

- 0.21% chance of exploitation (EPSS score: low exploit probability)
- Not on CISA KEV list (no confirmed active exploitation reported to CISA)
- Patch status unknown (check vendor advisories for fix availability and mitigation guidance)
- CVSS 7.1/10 High (Network attack vector, High attack complexity)

Affected Products (1)

llama.cpp

Affected Vendors

Risk score: 27/100 (low-risk)
Severity: 21/34 · High
Exploitability: 1/34 · Minimal
Exposure: 5/34 · Minimal