Subject
When receiving a large brotli-compressed response with chunked transfer encoding, urllib3 2.6.x raises a DecodeError with the message:
brotli: decoder process called with data when 'can_accept_more_data()' is False
This appears to be a regression introduced in urllib3 2.6.0, likely related to the security changes for handling compressed content (GHSA-2xpw-w6gg-jr37). The issue occurs specifically when:
- The response uses Content-Encoding: br (brotli)
- The response uses Transfer-Encoding: chunked
- The compressed data is moderately large (~500KB+ compressed)
- Data arrives in small TCP segments
The issue does not occur with urllib3 2.5.0.
Environment
OS: Linux 6.x (Debian Bookworm in Docker)
Python: 3.12.12
OpenSSL: OpenSSL 3.0.x
urllib3: 2.6.1
brotli: 1.2.0
requests: 2.31.0
Works correctly with:
urllib3: 2.5.0
brotli: 1.1.0
Steps to Reproduce
Minimal reproduction script:
#!/usr/bin/env python3
"""Minimal reproduction: brotli decode bug with urllib3 2.6.x + brotli 1.2.0"""
import hashlib
import socket
import threading

import brotli
import requests


def main() -> int:
    from importlib.metadata import version

    print(f"urllib3: {version('urllib3')}, brotli: {version('brotli')}")

    # Generate ~15MB of data with moderate compressibility (~27x ratio)
    data = b"".join(
        f"{hashlib.sha256(str(i).encode()).hexdigest()}{'a' * 900}{i:06d}\n".encode()
        for i in range(15000)
    )
    compressed = brotli.compress(data)
    print(f"Data: {len(data):,} -> {len(compressed):,} bytes ({len(data) // len(compressed)}x)")

    # Build a chunked HTTP response
    resp = b"HTTP/1.1 200 OK\r\nContent-Encoding: br\r\nTransfer-Encoding: chunked\r\n\r\n"
    for i in range(0, len(compressed), 32768):
        chunk = compressed[i : i + 32768]
        resp += f"{len(chunk):x}\r\n".encode() + chunk + b"\r\n"
    resp += b"0\r\n\r\n"

    # Start a mock server
    ready = threading.Event()

    def serve(port: int) -> None:
        s = socket.socket()
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("127.0.0.1", port))
        s.listen(1)
        ready.set()
        c, _ = s.accept()
        c.recv(4096)
        for i in range(0, len(resp), 128):  # Small TCP segments trigger the bug
            c.sendall(resp[i : i + 128])  # sendall: avoid silent short writes
        c.close()
        s.close()

    threading.Thread(target=serve, args=(18765,), daemon=True).start()
    ready.wait()
    try:
        r = requests.get("http://127.0.0.1:18765/", timeout=60)
        print(f"SUCCESS: {len(r.content):,} bytes")
        return 0
    except requests.exceptions.ContentDecodingError as e:
        print(f"FAILED: {e}")
        return 1


if __name__ == "__main__":
    exit(main())
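For reference, the chunked framing the script builds inline can be factored into a small standalone helper (a sketch for illustration; `chunked_encode` is not a library function). Each chunk is framed as a hex length line, the data, and a CRLF, with a zero-length chunk terminating the body:

```python
def chunked_encode(payload: bytes, chunk_size: int = 32768) -> bytes:
    """Frame payload using HTTP/1.1 chunked transfer encoding:
    each chunk is b"<hex size>\\r\\n<data>\\r\\n", and the body is
    terminated by the zero-length chunk b"0\\r\\n\\r\\n"."""
    out = bytearray()
    for i in range(0, len(payload), chunk_size):
        chunk = payload[i : i + chunk_size]
        out += f"{len(chunk):x}\r\n".encode() + chunk + b"\r\n"
    out += b"0\r\n\r\n"
    return bytes(out)
```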
Expected Behavior
The response should be successfully decompressed and returned:
urllib3: 2.5.0, brotli: 1.1.0
Data: 14,565,000 -> 549,724 bytes (26x)
SUCCESS: 14,565,000 bytes
Actual Behavior
With urllib3 2.6.x, the request fails with a ContentDecodingError:
urllib3: 2.6.1, brotli: 1.2.0
Data: 14,565,000 -> 549,724 bytes (26x)
FAILED: ('Received response with content-encoding: br, but failed to decode it.',
error("brotli: decoder process called with data when 'can_accept_more_data()' is False"))
Full traceback:
urllib3.exceptions.DecodeError: ('Received response with content-encoding: br, but
failed to decode it.', error("brotli: decoder process called with data when
'can_accept_more_data()' is False"))
The error suggests that the brotli decoder reports it cannot take further input (can_accept_more_data() returns False) -- whether because decoding has completed or because buffered output has not yet been drained -- but urllib3 keeps feeding it compressed data from the chunked stream.
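The pattern a bounded streaming decoder is expected to follow is "drain pending output before offering new input". That pattern can be sketched with zlib from the standard library, whose decompressobj exposes the same output-capping idea via the max_length argument and unconsumed_tail (this is an analogy only, not brotli's API or urllib3's actual code):

```python
import zlib

def decompress_bounded(chunks, max_step=65536):
    """Incrementally decompress, capping output per call. Input that
    could not be consumed because the output cap was hit is held in
    unconsumed_tail and must be re-fed before any new chunk is offered."""
    d = zlib.decompressobj()
    out = bytearray()
    for chunk in chunks:
        data = chunk
        while data:
            out += d.decompress(data, max_step)
            data = d.unconsumed_tail  # drain before accepting new input
    out += d.flush()
    return bytes(out)
```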