Hi there, we experimented a bit with the bulk_items endpoint and found that we ran into timeouts (more than 30 s) even for fairly small batches (e.g. 100 items of roughly 30 kB each).
Do you @bitner @vincentsarago or anyone else have experience with its performance, or any benchmarks you could share?
Basically, we ran a snippet like this test against our service:
stac-fastapi-pgstac/tests/clients/test_postgres.py, line 328 in 21aae32
What do you recommend for bulk inserts: the endpoint, or a tool like pypgstac load items?
Thanks in advance!
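For reference, a simplified sketch of the kind of request we send (the base URL, collection id, and item contents here are placeholders rather than our real 30 kB payloads, and the payload shape assumes stac-fastapi's bulk transaction extension, which maps item id to item):

```python
# Sketch of a bulk insert against the bulk_items endpoint.
# STAC_API, COLLECTION_ID, and the item contents are placeholders.
import httpx

STAC_API = "http://localhost:8080"   # placeholder base URL
COLLECTION_ID = "my-collection"      # placeholder collection

def make_item(i: int) -> dict:
    """Build a minimal STAC item; our real items are ~30 kB each."""
    return {
        "type": "Feature",
        "stac_version": "1.0.0",
        "id": f"item-{i}",
        "collection": COLLECTION_ID,
        "geometry": {"type": "Point", "coordinates": [0.0, 0.0]},
        "bbox": [0.0, 0.0, 0.0, 0.0],
        "properties": {"datetime": "2024-01-01T00:00:00Z"},
        "assets": {},
        "links": [],
    }

# The bulk_items body maps item id -> item.
payload = {"items": {f"item-{i}": make_item(i) for i in range(100)}}

resp = httpx.post(
    f"{STAC_API}/collections/{COLLECTION_ID}/bulk_items",
    json=payload,
    timeout=60.0,
)
resp.raise_for_status()
```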
pypgstac load items will definitely be faster for bulk loads.
It has logic that reduces the amount of time any locks are held in the database.
It offloads to the client some of the processing required to format and dehydrate (a form of compression) the data into the format stored in the database.
And there is no double hop over the network: all data is sent directly to the database rather than being brokered through an API server.
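If it helps, here is a minimal sketch of such a load using pypgstac's Python API (the DSN and file path are placeholders, and it assumes a reasonably recent pypgstac with the Loader API and newline-delimited JSON input, one item per line):

```python
# Minimal sketch of a pypgstac bulk load.
# The DSN and file path are placeholders for your own setup.
from pypgstac.db import PgstacDB
from pypgstac.load import Loader, Methods

dsn = "postgresql://user:password@localhost:5432/postgis"  # placeholder DSN

with PgstacDB(dsn=dsn) as db:
    loader = Loader(db=db)
    # insert_ignore skips items whose ids already exist instead of failing
    loader.load_items("items.ndjson", insert_mode=Methods.insert_ignore)
```

The CLI equivalent should be roughly `pypgstac load items items.ndjson --dsn <dsn> --method insert_ignore`.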