
Add support for backpressure indication when using streams #734

Open
paustint opened this issue Jan 24, 2019 · 3 comments
@paustint

I have a use-case where my server fetches paginated data from a remote server and streams it to the user as an XLSX file download. Getting all of the data can take 50+ fetches, and a queue manages concurrency.

The problem comes when we fetch data faster than the user can download it: we keep pushing more and more data onto the stream, which causes memory issues.

Streams normally solve this with a backpressure indicator: the stream signals whether more data may be pushed.

Since I implemented my CSV stream by hand, I can pause and wait as needed, but with this library I don't see any way to know whether I should push more data to the stream or not.

Would it be possible to return the backpressure boolean indicator when a row is committed?

Node Reference: https://nodejs.org/en/docs/guides/backpressuring-in-streams/
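For reference, Node's built-in `Writable` streams already expose exactly this indicator: `write()` returns `false` once the internal buffer is full, and the `'drain'` event signals when it is safe to resume. A minimal sketch of that contract (the slow sink and row strings here are made up for illustration, not part of this library):

```typescript
import { Writable } from 'node:stream'

// A deliberately slow sink with a tiny buffer, so write() reports
// backpressure almost immediately.
const sink = new Writable({
  highWaterMark: 16, // bytes
  write(_chunk, _encoding, callback) {
    setTimeout(callback, 1) // simulate a slow consumer
  },
})

async function pushAll(rows: string[]): Promise<void> {
  for (const row of rows) {
    // write() returning false is the backpressure indicator:
    // stop pushing and wait for 'drain' before continuing.
    if (!sink.write(row)) {
      await new Promise<void>((resolve) => sink.once('drain', resolve))
    }
  }
  sink.end()
}
```

This is the boolean I'd like `commit()` to surface.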

@dreamdevil00

Any updates?

@alesmenzel

This was really frustrating to find out after several debugging sessions. Are there any plans to support backpressure? Even just returning true/false from row .commit(...) would fix these issues. As it stands, the streaming feature is not usable in production, since it will eat up all of your memory whenever the readable side is faster than the destination. I am really sad about this, to be honest.

@Mike-Dax

Mike-Dax commented Nov 6, 2023

For anyone that stumbles upon this, we do the following:

Use a streaming writer, e.g.:

const writeStream = fs.createWriteStream(filePath)

const workbook = new stream.xlsx.WorkbookWriter({
  stream: writeStream,
})

After each 'batch' of rows, we manually poll to see if the stream needs to be drained:

const shouldPause = () => {
  if (writeStream.writableNeedDrain) {
    return new Promise<void>((resolve) => {
      writeStream.once('drain', resolve)
    })
  }
  return null
}
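Putting it together, a self-contained sketch of the batch loop. Note that `fetchBatch` is a stand-in for the real paginated fetch, and this writes rows to a plain file stream rather than through the workbook so the example runs on its own; the polling pattern is the same either way:

```typescript
import fs from 'node:fs'
import os from 'node:os'
import path from 'node:path'

const filePath = path.join(os.tmpdir(), 'backpressure-demo.txt')
const writeStream = fs.createWriteStream(filePath)

// Same polling helper as above: resolves once the stream has drained,
// or returns null when no pause is needed.
const shouldPause = (): Promise<void> | null => {
  if (writeStream.writableNeedDrain) {
    return new Promise<void>((resolve) => {
      writeStream.once('drain', resolve)
    })
  }
  return null
}

// Stand-in for the real remote fetch.
const fetchBatch = async (page: number): Promise<string[]> =>
  Array.from({ length: 1000 }, (_, i) => `row-${page}-${i}\n`)

async function run(): Promise<void> {
  for (let page = 0; page < 10; page++) {
    const rows = await fetchBatch(page)
    for (const row of rows) {
      writeStream.write(row)
    }
    // After each batch, back off until the destination drains.
    const pause = shouldPause()
    if (pause) await pause
  }
  writeStream.end()
}
```

This keeps memory bounded because no new batch is fetched while the destination's buffer is above its high-water mark.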
