Add support for backpressure indication when using streams #734
Comments
Any updates?
This was really frustrating to find out after several debugging sessions. Are there any plans to support backpressure? Even just returning true/false from a row commit would help.
For anyone that stumbles upon this, we do the following. Use a streaming writer, e.g.:

```ts
const writeStream = fs.createWriteStream(filePath)

const workbook = new stream.xlsx.WorkbookWriter({
  stream: writeStream,
})
```

After each 'batch' of rows, we manually poll to see if the stream needs to be drained:

```ts
const shouldPause = () => {
  if (writeStream.writableNeedDrain) {
    return new Promise<void>((resolve) => {
      writeStream.once('drain', resolve)
    })
  }
  return null
}
```
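To see the workaround in action end to end, here is a minimal, self-contained sketch of the same pattern against a plain Node `Writable` instead of exceljs. The slow sink, the `makeSlowSink`/`waitForDrain`/`writeRows` names, and the tiny `highWaterMark` are all illustrative choices to force backpressure quickly, not part of any library API:

```typescript
import { Writable } from 'node:stream'

// Hypothetical slow sink: a tiny buffer plus a delayed write callback,
// standing in for a user downloading slower than we produce rows.
const makeSlowSink = () =>
  new Writable({
    highWaterMark: 16, // small buffer so backpressure appears quickly
    write(_chunk, _enc, cb) {
      setTimeout(cb, 1) // simulate a slow consumer
    },
  })

// Same idea as the workaround above: after writing, check
// writableNeedDrain and wait for 'drain' before pushing more.
const waitForDrain = (s: Writable): Promise<void> | null =>
  s.writableNeedDrain
    ? new Promise((resolve) => s.once('drain', resolve))
    : null

async function writeRows(rowCount: number): Promise<number> {
  const sink = makeSlowSink()
  let pauses = 0
  for (let i = 0; i < rowCount; i++) {
    sink.write(`row ${i}\n`)
    const pending = waitForDrain(sink)
    if (pending) {
      pauses++ // memory stays bounded because we stop pushing here
      await pending
    }
  }
  sink.end()
  return pauses
}

writeRows(50).then((pauses) => console.log(`paused ${pauses} times`))
```

With a real `WorkbookWriter` you would call `waitForDrain(writeStream)` after committing each batch of rows, exactly as in the snippet above.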
I have a use case where my server fetches paginated data from a remote server and turns it into a streaming XLSX file download for the user. I might have to fetch data 50+ times to get it all, and I use a queue to manage concurrency.
The problem comes when we fetch data faster than the user downloads it: we keep pushing more and more data onto the stream, which causes memory issues.
Node streams solve this with a backpressure indicator, where the stream signals whether more data may be pushed.
Since I implemented my CSV stream by hand, I can pause and wait as needed, but with this library I don't see any way to know whether I should push more data to the stream.
Would it be possible to return the backpressure boolean indicator when a row is committed?
Node Reference: https://nodejs.org/en/docs/guides/backpressuring-in-streams/
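The pattern described in that guide can be sketched as follows: `write()` returns `false` when the internal buffer is full, and `'drain'` fires when it is safe to resume — the same boolean this issue asks to be surfaced from a row commit. The sink, `makeSink`, `pushRows`, and the small `highWaterMark` are illustrative stand-ins, not exceljs API:

```typescript
import { once } from 'node:events'
import { Writable } from 'node:stream'

// Stand-in for the response stream the user downloads from:
// a small buffer and an async write callback, so backpressure kicks in.
const makeSink = () =>
  new Writable({
    highWaterMark: 8,
    write(_chunk, _enc, cb) {
      setImmediate(cb)
    },
  })

// Canonical Node backpressure handling: honor write()'s boolean
// return value and await 'drain' before pushing the next row.
async function pushRows(rows: string[]): Promise<number> {
  const out = makeSink()
  let waits = 0
  for (const row of rows) {
    if (!out.write(row)) {
      waits++ // buffer full: stop producing until the consumer catches up
      await once(out, 'drain')
    }
  }
  out.end()
  await once(out, 'finish')
  return waits
}

pushRows(Array.from({ length: 20 }, (_, i) => `row ${i}\n`)).then((waits) =>
  console.log(`waited for drain ${waits} times`)
)
```

If `row.commit()` (or the worksheet/workbook `commit()`) returned this boolean, callers could apply exactly this loop instead of polling `writableNeedDrain`.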