CVE-2024-34359

Scores

EPSS

Score: 0.626 (medium)

Percentile: 62.6%

CVSS

Score: 9.6 (critical, CVSS 3.x)

All CVSS Scores

CVSS 3.x: 9.6

Vector: CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:H/I:H/A:H

Description

llama-cpp-python provides Python bindings for llama.cpp. It relies on the class Llama in llama.py to load .gguf llama.cpp models. The __init__ constructor of Llama takes several parameters that configure loading and running the model. Besides NUMA, LoRA settings, tokenizer loading, and hardware settings, __init__ also reads the chat template from the targeted .gguf file's metadata and passes it to llama_chat_format.Jinja2ChatFormatter.to_chat_handler() to construct self.chat_handler for the model. However, Jinja2ChatFormatter parses the chat template from that metadata with a sandbox-less jinja2.Environment, which is then rendered in __call__ to construct the interaction prompt. This allows jinja2 server-side template injection (SSTI), leading to remote code execution via a carefully constructed payload embedded in a model file.
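The vulnerability class boils down to rendering attacker-controlled template text in a default jinja2.Environment. The sketch below (a generic illustration, not the advisory's actual payload) contrasts an unsandboxed environment, where a well-known gadget chain reaches the os module, with jinja2's ImmutableSandboxedEnvironment, which refuses access to dunder attributes:

```python
from jinja2 import Environment
from jinja2.exceptions import SecurityError
from jinja2.sandbox import ImmutableSandboxedEnvironment

# A classic Jinja2 SSTI gadget: walk from the built-in `cycler` global
# to the `os` module and execute a shell command. An attacker could
# embed a chain like this in the chat template stored in .gguf metadata.
payload = "{{ cycler.__init__.__globals__.os.popen('echo pwned').read() }}"

# Unsandboxed environment (as in the vulnerable Jinja2ChatFormatter):
# the attribute chain resolves and the command runs on the host.
unsafe = Environment().from_string(payload).render()

# Sandboxed environment (the usual mitigation): underscore-prefixed
# attribute access is blocked and rendering raises SecurityError.
blocked = False
try:
    ImmutableSandboxedEnvironment().from_string(payload).render()
except SecurityError:
    blocked = True
```

Rendering untrusted templates only inside a sandboxed environment closes this injection path, at the cost of restricting the template features available to legitimate chat templates.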


CWEs

CWE-76: Improper Neutralization of Equivalent Special Elements