
I have a list of HTTP URLs: http://host.com/url1 http://host.com/url2 http://host.com/url3

Each URL points to a file.

Is there a way to compress all of those files into a single zip archive without downloading them locally first? I don't see a way to do this with the Linux zip command: it accepts a list of files, but not a list of URLs.

Any ideas?

1 Answer


Using a shell loop (ksh, for example):

(
  for URL in http://host.com/url1 http://host.com/url2 http://host.com/url3
  do
    wget -O - -q "$URL"
  done
) | gzip -c > mysingle.zip

You now have all the data compressed into one file, but it is not really an archive: you cannot extract the individual files. For that you would need to add a separator in the loop.
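For illustration, a minimal sketch of that separator idea (assuming the files are plain text and the marker string never occurs inside them):

(
  for URL in http://host.com/url1 http://host.com/url2 http://host.com/url3
  do
    # Marker line so the file boundaries can be found again after
    # decompressing; the ===== string is an arbitrary choice.
    printf '\n===== %s =====\n' "$URL"
    wget -O - -q "$URL"
  done
) | gzip -c > mysingle.gz

The .gz extension is deliberate here: the result is a gzip-compressed stream, not ZIP format.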

  • Interesting. But even if I added a separator, what would the file names inside the ZIP archive be? Commented Jan 19, 2022 at 12:50
  • @Marcel: it's NOT an archive, there are no filenames inside the zip, it is a compressed data stream. Commented Jan 19, 2022 at 13:02
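As the comments point out, the gzip stream carries no file names at all. If you need a genuine ZIP archive with one named entry per URL, the following sketch uses Info-ZIP's zip (which can compress stdin into an entry) together with zipnote (which can rename an entry afterwards); it assumes each URL ends in a usable file name, with no query string:

for URL in http://host.com/url1 http://host.com/url2 http://host.com/url3
do
  NAME=$(basename "$URL")
  # "zip archive.zip -" stores stdin as an entry literally named "-",
  # so the download never exists as a separate local file.
  wget -O - -q "$URL" | zip -q archive.zip -
  # Rename the "-" entry to the URL's last path component.
  printf '@ -\n@=%s\n' "$NAME" | zipnote -w archive.zip
done

Afterwards, unzip -l archive.zip should list url1, url2 and url3 as individually extractable entries.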
