The ASCII code for the digit zero is 0x30 (hexadecimal). Hence, printf '\x30' prints a zero.
If you put this into a shell script called myScript.sh and then execute ./myScript.sh, it will also print a zero.
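For concreteness, here is roughly what I see in an interactive bash session (assuming myScript.sh contains just the printf line; printf emits no trailing newline, so I have put the output on its own line for readability):

    $ printf '\x30'
    0
    $ ./myScript.sh
    0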
But if you execute sh -c "printf '\x30'" or, alternatively, sh myScript.sh, you instead get the literal characters "\x30", rather than having the escape interpreted as a single byte.
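Again, roughly what I see (same caveat about the missing trailing newline):

    $ sh -c "printf '\x30'"
    \x30
    $ sh myScript.sh
    \x30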
Why is that?
(I have observed this behavior on multiple machines, all of which I believe are running bash.)