A file that has already been compressed to its practical maximum (as many PDFs already have been: encoded, compacted, compressed, and encrypted) cannot be made significantly smaller unless some component is removed, destroying part of the original quality or function in the process.
As an example, take a file of similar size to the original (1.5 to 2 MB).
This one is 1.82 MB (1,916,023 bytes). The PostScript source was only 1.21 KB, so surely it should be possible to reduce it towards that smaller size?
Well, it soon becomes clear on opening the PDF that it has 4096 pages, and removing any single page would fail to maintain its function.
We CAN compress it some more via, say, an online compressor.
"Your PDF is now 17% smaller!" 1.83 MB >> 1.53 MB.
That was achieved by optimisation (the number of components, /Size, reduced from 12293 to 8413), NOT by compression; the compression method is the same (a mix of Zip & Deflate).
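For reference only (this is not the online service's own tooling), the declared object count and page count can be checked with a library such as pikepdf; a minimal sketch, with a hypothetical file name:

    # Sketch: read the declared object count (/Size) and page count of a PDF
    import pikepdf

    with pikepdf.open("example-4096-pages.pdf") as pdf:      # hypothetical file name
        print("Trailer /Size:", int(pdf.trailer["/Size"]))   # e.g. 12293 before optimisation
        print("Pages:", len(pdf.pages))                       # e.g. 4096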
I also know that can be bettered. The optimiser's report ("info: optimized 4096 streams, kept 4096 #orig") means there is now only one stream per page, but by adding one more wrapper object (/Size 8414) the file can be reduced further, to 1.03 MB (1,081,466 bytes).
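One common form of such a wrapper is an object stream (ObjStm), which tools built on QPDF can generate. A minimal sketch of the same idea with pikepdf, assuming hypothetical input and output names:

    # Sketch: pack top-level objects into compressed object streams
    import pikepdf

    with pikepdf.open("example-4096-pages.pdf") as pdf:
        pdf.save(
            "example-optimised.pdf",
            compress_streams=True,                                 # Flate-compress any uncompressed streams
            object_stream_mode=pikepdf.ObjectStreamMode.generate,  # gather objects into object streams
        )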
Decompressed, the total objects (/Size 8412) come to 3.13 MB (3,286,658 bytes).
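A comparable decompressed figure can be approximated by writing the file back out with its general-purpose filters decoded and object streams disabled; again a sketch with the same hypothetical names:

    # Sketch: decode Flate-type filters and unpack object streams to see the raw size
    import os
    import pikepdf

    with pikepdf.open("example-optimised.pdf") as pdf:
        pdf.save(
            "example-decompressed.pdf",
            compress_streams=False,                                     # do not re-compress streams
            stream_decode_level=pikepdf.StreamDecodeLevel.generalized,  # decode general-purpose filters
            object_stream_mode=pikepdf.ObjectStreamMode.disable,        # unpack object streams
        )
    print(os.path.getsize("example-decompressed.pdf"), "bytes")         # roughly the decompressed total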
Still functional at about 32.9% of the decompressed size; however, it will never get under the OP's desired 1.00 MB without some loss of function.