Streaming compatibility with archiver #671

Closed
danhaller opened this issue Jan 6, 2017 · 14 comments

@danhaller

Hi,

I'm trying to use sharp's streaming capability in tandem with archiver to resize and zip multiple images and stream the result out to a consumer. I can get each library to work on its own, but together they fail on anything more than a single image.

Here's some example code:

const fs = require('fs');
const archiver = require('archiver');
const sharp = require('sharp');
//const imagemagick = require('imagemagick-stream');
const request = require('request');

function resizeAndZip() {
  // Download the source images as streams
  const imageStreams = [
    request('http://i.stack.imgur.com/tKsDb.png'),
    request('http://i.stack.imgur.com/EdUwb.png'),
    request('http://i.stack.imgur.com/5d55j.png')
  ];

  const zip = archiver('zip');

  zip.on('entry', (entry) => console.log(`${entry.name} successfully appended to zip`));

  // Pipe each download through a sharp resizer, then append the resized
  // stream to the archive
  imageStreams.map((stream) => {
    const resize = sharp().resize(100, 100);
    // const resize = imagemagick().resize('100x100');

    return stream.pipe(resize);
  })
  .forEach((resizedImageStream, i) => {
    zip.append(resizedImageStream, { name: `${i}.png` });
  });

  zip.finalize();

  return zip;
}

resizeAndZip().pipe(fs.createWriteStream(__dirname + '/output.zip'));

In the example you can see that I've also imported imagemagick-stream. If that library is used as the resizer instead of sharp, everything works fine. I'm not sure whether this is a compatibility problem or if I'm using sharp incorrectly.

@lovell
Owner

lovell commented Jan 6, 2017

Hello, I think the resizedImageStream instances have yet to enter flowing mode. This happens automagically when something else pipes data out, but archiver doesn't appear to do this for you.

Something like the following should work:

    zip.append(resizedImageStream.resume(), { name: `${i}.png` });

Perhaps sharp's constructor jsdoc should include this information? I'd be happy to accept a PR if you'd like to make the docs clearer.
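
For background, a Node.js Readable stream (including the readable side of a sharp instance) starts out paused and only emits data once something consumes it, by piping it somewhere, attaching a 'data' listener, or calling resume(). A minimal sketch of the two modes, with hypothetical file paths:

const fs = require('fs');
const sharp = require('sharp');

const resized = fs.createReadStream('/path/to/input.png').pipe(sharp().resize(100, 100));

// Nothing is consuming `resized` yet, so it stays in paused mode and emits
// no 'data' events.

// Piping it somewhere (or calling resume()) switches it into flowing mode.
resized.pipe(fs.createWriteStream('/path/to/output.png'));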

@lovell lovell added the question label Jan 6, 2017
@danhaller
Author

Thanks for the reply. Adding resume is definitely an improvement; the zip file now completes with all the files. However, most of the files turn out to be corrupt when the archive is unzipped. I'm trying not to get too Stack Overflow here, but is this likely to be a sharp issue or an archiver issue?

I'll definitely add a PR if you think it would be useful. Perhaps it would be more useful in archiver though, as I imagine most people using sharp will be using pipe?

@lovell
Owner

lovell commented Jan 9, 2017

I can partially recreate this - the first two images in the zip file are OK but the third/last image has zero length.

To investigate further we'd need to look at the internals of the archiver module to see what custom logic it uses to handle a ReadableStream instance vs what Node gives you for "free" when using pipe.
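
For anyone following along, the two consumption paths being compared look roughly like this (a sketch only; input.png is a placeholder local file):

const fs = require('fs');
const archiver = require('archiver');

// What Node gives you for "free": pipe starts the flow and handles
// back-pressure automatically.
fs.createReadStream(__dirname + '/input.png')
  .pipe(fs.createWriteStream(__dirname + '/copy.png'));

// With archiver, the Readable is handed over and archiver decides when and
// how to consume it.
const zip = archiver('zip');
zip.pipe(fs.createWriteStream(__dirname + '/output.zip'));
zip.append(fs.createReadStream(__dirname + '/input.png'), { name: 'input.png' });
zip.finalize();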

@lovell
Owner

lovell commented Jan 17, 2017

I think I might have identified a race condition where the Writable side of a sharp instance can emit the "finish" event before the Readable side has started flowing. My next step is to isolate it to a repeatable test case.

(I had a quick look at the archiver code and it does call pipe internally in a sequential, queued manner. My advice to resume a Readable stream before calling zip.append is probably wrong.)
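
A rough sketch of the suspected race, in case anyone wants to reproduce it (the file paths and the one-second delay are arbitrary placeholders):

const fs = require('fs');
const sharp = require('sharp');

const resize = sharp().resize(100, 100);
resize.on('finish', () => console.log('Writable side emitted "finish"'));

// Feed the whole input in before anything consumes the Readable side.
fs.createReadStream('/path/to/input.png').pipe(resize);

// Start consuming only later, mimicking archiver piping queued entries one
// at a time. Before the fix, a stream whose Writable side had already
// finished could produce no output here.
setTimeout(() => {
  resize
    .pipe(fs.createWriteStream('/path/to/output.png'))
    .on('finish', () => console.log('Readable side fully consumed'));
}, 1000);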

@lovell
Owner

lovell commented Jan 22, 2017

Commit d8df503 adds a test and the fix for the case where the Readable side of the Duplex Stream starts flowing only after the Writable side of the Duplex Stream has emitted the "finish" event. Thanks for reporting this.

@lovell lovell added this to the v0.17.2 milestone Jan 22, 2017
@lovell lovell added bug and removed triage labels Jan 22, 2017
@sinegovsky-ivan

I'm seeing the same issue with the fix from commit d8df503 - 90% of the images have zero size :(

@lovell
Owner

lovell commented Jan 27, 2017

@danhaller Have you been able to verify the proposed fix?

@danhaller
Author

@lovell Thanks for investigating this, and sorry I haven't got back to you sooner - busy week. I've just tried the latest commit with the code example above and unfortunately I'm still seeing the same problem - the first two images are zipped correctly but any others are not.

@lovell
Owner

lovell commented Jan 27, 2017

Do you still see the problem if you remove the finalize call from resizeAndZip and instead call it after you've started piping data to the filesystem?

...
const zip = resizeAndZip();
zip.pipe(fs.createWriteStream(__dirname + '/output.zip'));
zip.finalize();

@danhaller
Author

Ah, I forgot to remove the resume call. After removing that, it works correctly. I've also tried it out in the actual service that I'm writing and it works there too.

Thanks again for fixing.
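
For reference, the working arrangement described in this thread (no resume call, finalize invoked only after the archive is piped to its destination) looks roughly like this, assuming a sharp build that includes the d8df503 fix:

const fs = require('fs');
const archiver = require('archiver');
const sharp = require('sharp');
const request = require('request');

function resizeAndZip() {
  const imageStreams = [
    request('http://i.stack.imgur.com/tKsDb.png'),
    request('http://i.stack.imgur.com/EdUwb.png'),
    request('http://i.stack.imgur.com/5d55j.png')
  ];

  const zip = archiver('zip');
  zip.on('entry', (entry) => console.log(`${entry.name} appended to zip`));

  // Resize each download and append the resized stream to the archive.
  imageStreams
    .map((stream) => stream.pipe(sharp().resize(100, 100)))
    .forEach((resizedImageStream, i) => {
      zip.append(resizedImageStream, { name: `${i}.png` });
    });

  return zip;
}

const zip = resizeAndZip();
zip.pipe(fs.createWriteStream(__dirname + '/output.zip'));
zip.finalize();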

@lovell
Owner

lovell commented Jan 27, 2017

Thanks @danhaller.

@sinegovsky-ivan if you're still having problems, please open a new issue with a code sample that fails, preferably with error handling/logging on the streams.

@sinegovsky-ivan

I also removed "resume" and everything is OK now.

Thank you for your work!

@rn4391

rn4391 commented Feb 7, 2017

If this is complete and working fine, can v0.17.2 be released?

@lovell
Owner

lovell commented Feb 11, 2017

v0.17.2 now available via npm, thanks for reporting this!

@lovell lovell closed this as completed Feb 11, 2017