I have a big .gz file. I would like to split it into 100 smaller gzip files, that can each be decompressed by itself. In other words: I am not looking for a way of chopping up the .gz file into chunks that would have to be put back together to be able to decompress it. I want to be able to decompress each of the smaller files independently.
Can it be done without recompressing the whole file?
Can it be done if the original file is compressed with --rsyncable? ("Cater better to the rsync program by periodically resetting the internal structure of the compressed data stream." It sounds like these reset points might be good places to split at, probably with a gzip header prepended to each piece.)
Can it be done for any of the other compressed formats? I would imagine bzip2 would be doable, as it is compressed in blocks.
I don't think it can be done with `gzip --rsyncable`, given that "gunzip cannot tell the difference" (if you could find a place to split, you could tell that there is a place to split). It might be doable with bzip2 because of its peculiar block feature.

Otherwise you have to recompress: decompress with `gzip -d -c bigfile.gz`, split the output, and gzip each piece. Or change the workflow so that the individual files are gzipped first and then tarred together (`big-gzs.tar` instead of `big.tar.gz`). Then all, or only a few, of the files can be extracted and decompressed. I have not tried extracting only the last file from a tar-ball, but I guess tar can "fast forward" through the archive much as a tape drive can.
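The decompress-split-recompress route can be sketched as below. This is a sketch under assumptions: the file names and the chunk size are illustrative, and the `--filter` option requires GNU split (coreutils).

```shell
# Stand-in for the big gzip file (illustrative data only).
seq 1 100000 | gzip > bigfile.gz

# Decompress the stream, cut it into fixed-size chunks, and recompress
# each chunk on the fly; GNU split's --filter runs the given command once
# per chunk, with $FILE set to that chunk's output name.
gzip -d -c bigfile.gz | split -b 200k -d --filter='gzip > $FILE.gz' - part-

# Every part-NN.gz is a complete, independently decompressible gzip file.
gzip -d -c part-00.gz | head -n 3
```

Note the single quotes around the filter command: `$FILE` must be expanded by split for each chunk, not by the invoking shell.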
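The gzip-first-then-tar workflow can be sketched like this; the file and directory names are made up for the example:

```shell
# Stand-ins for the individual files that would normally go into big.tar.gz.
mkdir -p data
seq 1 1000 > data/a.txt
seq 1001 2000 > data/b.txt

# Compress each file first, then bundle the .gz files into one tar
# (big-gzs.tar), instead of tarring first and gzipping the whole archive.
gzip data/a.txt data/b.txt
tar -cf big-gzs.tar data/a.txt.gz data/b.txt.gz

# A single member can be extracted and decompressed on its own.
tar -xf big-gzs.tar data/b.txt.gz
gzip -d -c data/b.txt.gz | head -n 1
```

The trade-off is a worse compression ratio, since each file is compressed separately instead of the whole archive sharing one dictionary.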
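For bzip2 the block structure really is exploitable without recompressing: `bzip2recover`, which ships with bzip2, writes each compressed block out as its own `.bz2` file, and each of those decompresses independently. A sketch with made-up file names (`-1` forces small 100 kB blocks so the demo file contains several):

```shell
# Stand-in data, compressed with 100 kB blocks so several blocks exist.
seq 1 100000 | bzip2 -1 > sample.bz2

# bzip2recover splits the file into one .bz2 per compressed block
# (rec00001sample.bz2, rec00002sample.bz2, ...).
bzip2recover sample.bz2

# Each piece decompresses on its own; together they reproduce the data.
bzip2 -d -c rec00001sample.bz2 | head -n 3
```

You don't control the split points (they fall on block boundaries, up to 900 kB of uncompressed data apart at the default block size), but no recompression is needed.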