Duplicated values in Sentry Baggage header cause eventual 431 "Request header fields too large" error on HTTP fetch #3709
@jmo1012 thanks for the report.
Never mind, I see you linked it already earlier in the issue, sorry!
@jmo1012 Would you be able to provide a full minimal reproduction? I tried with the following code (based on the snippet you provided), but the baggage did not grow over time:

```python
from httpx import AsyncClient
import asyncio
import sentry_sdk
from fastapi import FastAPI
import multiprocessing
import uvicorn

MY_URL = "http://localhost:8000/status"
MY_BODY = '{"job_id": "123"}'


def is_successful_response(response) -> bool:
    # Always return True to indicate it's still processing
    return True


def run_server():
    app = FastAPI()

    @app.get("/status")
    async def status():
        return {"status": "processing"}

    uvicorn.run(app, host="127.0.0.1", port=8000)


async def poll_until_complete():
    sentry_sdk.init(dsn=..., traces_sample_rate=1.0)  # replace with your DSN
    await asyncio.sleep(1.0)
    is_still_processing = True
    with sentry_sdk.start_transaction(name="poll_until_complete"):
        while is_still_processing:
            async with AsyncClient() as http_client:
                http_request = http_client.build_request(
                    method="GET",
                    url=MY_URL,
                    content=MY_BODY,
                    headers={"content-type": "application/json"},
                )
                http_response = await http_client.send(request=http_request)
                http_response.raise_for_status()
                is_still_processing = is_successful_response(http_response)
                if not is_still_processing:
                    return http_response.content
                await asyncio.sleep(0.1)


if __name__ == "__main__":
    # Start mock server in a child process
    server_process = multiprocessing.Process(target=run_server)
    server_process.start()
    asyncio.run(poll_until_complete())
```
@szokeasaurusrex - I am working on this for you. I will get back to you as soon as I build one.
@szokeasaurusrex - Thank you for your patience. I was able to create a working reproduction script. I didn't realize that the key to this issue was the placement of the `build_request` call outside the `while` loop:

```python
from httpx import AsyncClient
import asyncio
import sentry_sdk
from fastapi import FastAPI
import multiprocessing
import uvicorn

MY_URL = "https://catfact.ninja/fact?max_length=140"


def is_successful_response(response) -> bool:
    # Always return True to indicate it's still processing
    return True


def run_server():
    app = FastAPI()

    @app.get("/status")
    async def status():
        return {"status": "processing"}

    uvicorn.run(app, host="127.0.0.1", port=8000)


async def poll_until_complete():
    sentry_sdk.init(dsn=..., traces_sample_rate=1.0)  # replace with your DSN
    await asyncio.sleep(1.0)
    is_still_processing = True
    with sentry_sdk.start_transaction(name="poll_until_complete"):
        async with AsyncClient() as http_client:
            http_request = http_client.build_request(
                method="GET",
                url=MY_URL,
                headers={"content-type": "application/json"},
            )
            while is_still_processing:
                http_response = await http_client.send(request=http_request)
                http_response.raise_for_status()
                is_still_processing = is_successful_response(http_response)
                if not is_still_processing:
                    return http_response.content
                await asyncio.sleep(0.1)


if __name__ == "__main__":
    # Start mock server in a child process
    server_process = multiprocessing.Process(target=run_server)
    server_process.start()
    asyncio.run(poll_until_complete())
```

For reference (if it is helpful), these are the requirements I used.
Thanks @jmo1012! I will look at this next week (tomorrow is a holiday in Austria).
Happy All Saints' Day!
@jmo1012 Thank you for the reproduction; I also observed the baggage growing with each request when using your code. The problem is not actually with the SDK mishandling a single request; it arises because the same request object is sent over and over:

```python
http_request = http_client.build_request(
    method="GET",
    url=MY_URL,
    headers={"content-type": "application/json"},
)
while is_still_processing:
    http_response = await http_client.send(request=http_request)  # same request each iteration
    ...
```

If you put the `build_request` call inside the loop, the baggage does not grow:

```python
while is_still_processing:
    http_request = http_client.build_request(
        method="GET",
        url=MY_URL,
        headers={"content-type": "application/json"},
    )
    http_response = await http_client.send(request=http_request)  # new request each iteration
```

You are observing this behavior because we always append the Sentry baggage to any existing baggage on the request, so that we don't overwrite any existing baggage set by the user. Likely, we should instead parse the baggage and overwrite the Sentry baggage while keeping any other baggage on the request.
Understood. Thank you!
Sentry baggage will get added to an HTTPX request multiple times if the same request is repeated. To prevent this from occurring, we can strip any existing Sentry baggage before adding Sentry baggage to the request. Fixes #3709

Co-authored-by: Ivana Kellyer <ivana.kellyer@sentry.io>
Co-authored-by: Anton Pirker <anton.pirker@sentry.io>
Issue:
While polling an external API endpoint, the `baggage` header grows each iteration of the poll, with `sentry-trace_id`, `sentry-environment`, and `sentry-release` repeating in each loop. After a while (if the poll does not resolve), the header eventually grows so big that the request fails with a 431 "Request header fields too large" error on HTTP fetch.

Similar Issues

SDK Version:
`sentry-sdk==2.13.0`

Example:
In the above example, the `baggage` header in the request grows with each iteration.

Note:
The `trace_propagation_targets` option is available to bypass the issue, but it doesn't fix the root cause. The JavaScript lib issue was fixed by using `.set()` instead of `.append()`, so the Python lib might be resolved in a similar manner?
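As a toy illustration of the append-vs-set difference (plain strings only; this is not httpx or Sentry SDK internals), appending the same Sentry members on every poll iteration grows the header without bound, while overwriting them keeps it a constant size:

```python
sentry_part = "sentry-trace_id=123,sentry-release=1.0"

# Append-style: each iteration concatenates the same members again,
# so the header keeps growing until the server rejects it (431).
baggage = ""
for _ in range(3):
    baggage = f"{baggage},{sentry_part}" if baggage else sentry_part
print(baggage.count("sentry-trace_id"))  # 3

# Set-style: the Sentry members are overwritten each iteration,
# so the header length stays bounded.
baggage = ""
for _ in range(3):
    baggage = sentry_part
print(baggage.count("sentry-trace_id"))  # 1
```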