Unless I'm mistaken, I don't think this is possible without altering your files: you would lose the ability to reassemble and decompress the original big compressed file, because its metadata (header and trailer) would be lost, and that metadata does not exist in each of your small files.
But you could create a wrapper that does the following:
- (optional) compress the big file
- split the big file into 100 small chunks
- compress each small chunk with gzip
- later, decompress each gzip chunk
- concatenate the chunks back into the big file
- (optional) decompress the big file
Note: I'm not sure what your goal is... saving storage? reducing network transmission time? working around a filesystem size limit? What is your underlying need?
Best Regards