-
Hey! Assuming you're using e.g. something like this:

```js
const { pipeline } = require('node:stream')
const { createWriteStream } = require('node:fs')
const { request } = require('undici')

;(async () => {
  const file = createWriteStream('response.json')
  const response = await request('https://httpbin.org/get')
  pipeline(response.body, file, err => {
    if (err) console.log('pipe errored')
    else console.log('done')
  })
})()
```

If you need something more customizable because of the large file sizes, something that lets you apply backpressure or handle the data chunk by chunk, you can use [Reference].
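For what it's worth, here is a minimal sketch of that chunk-by-chunk approach, assuming `response.body` is consumed as an async iterable and backpressure is handled manually via the `'drain'` event (the URL and file name are the same placeholders as above):

```js
const { once } = require('node:events')
const { createWriteStream } = require('node:fs')
const { request } = require('undici')

;(async () => {
  const file = createWriteStream('response.json')
  const response = await request('https://httpbin.org/get')

  for await (const chunk of response.body) {
    // write() returns false once the internal buffer is full;
    // waiting for 'drain' before reading more applies backpressure
    if (!file.write(chunk)) {
      await once(file, 'drain')
    }
  }

  file.end()
  console.log('done')
})()
```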
-
Thanks for the example, I will try it. What is the difference between the …? Also, how would I limit the number of parallel requests so I don't overload the server or get rate-limited? I thought of using a custom HTTP agent and setting the …. I have 10k+ files I need to download. I'm currently using the ….
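In case it helps, here is a rough sketch of the custom-agent idea with undici, assuming the `connections` option is the knob you had in mind: it caps how many sockets are opened per origin (the value 10 is just a placeholder), and further requests queue inside the dispatcher instead of opening new connections. Note that this limits sockets, not how many downloads your own code kicks off at once.

```js
import { Agent, setGlobalDispatcher } from 'undici'

// Cap concurrent connections per origin; extra requests queue
// inside undici rather than hitting the server all at once.
// 10 is an assumed placeholder, not a recommendation.
const agent = new Agent({ connections: 10 })

// Either make it the global dispatcher...
setGlobalDispatcher(agent)

// ...or pass it per request: request(url, { dispatcher: agent })
```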
-
I have it working now. I just don't know how I can set up a ….

```js
import { request } from 'undici'
import { createWriteStream } from 'node:fs'
import { pipeline } from 'node:stream/promises'

async function download (file) {
  const response = await request(file.url)
  await pipeline(response.body, createWriteStream(file.destination))
}

// about 10k+ files
const files = [
  { url: 'https://example.com/file.txt', destination: 'files/file.txt' },
  { url: 'https://example.com/image.png', destination: 'files/img/image.png' },
  // ... many more files
]

await Promise.allSettled(
  files.map((file) => download(file))
)

console.log('done')
```
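Not sure if this is the missing piece, but one possible sketch of a concurrency limit on top of the `download` function and `files` array above: a fixed number of workers drain a shared queue, so at most that many requests are in flight at once (the limit of 5 is just an assumed placeholder).

```js
const CONCURRENCY = 5 // assumed placeholder, tune for the target server
const queue = [...files]

async function worker () {
  // each worker pulls the next file until the queue is empty
  while (queue.length > 0) {
    const file = queue.shift()
    try {
      await download(file)
    } catch (err) {
      console.error(`failed to download ${file.url}`, err)
    }
  }
}

// start CONCURRENCY workers and wait until all of them finish
await Promise.all(Array.from({ length: CONCURRENCY }, () => worker()))
console.log('all downloads finished')
```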
-
I need to download large files to my disk. How could I stream the response directly to a file?