I am trying to create a random 1G test file with the dd command.
dd status=progress if=/dev/zero of=/tmp/testfile.zer bs=100M count=10
dd status=progress if=/dev/urandom of=/tmp/testfile1.ran bs=100M count=10
dd status=progress if=/dev/urandom of=/tmp/testfile2.ran bs=100M count=20
The resulting files are:
-rw-rw-r-- 1 dorinand dorinand 320M dub 21 12:37 testfile1.ran
-rw-rw-r-- 1 dorinand dorinand 640M dub 21 12:37 testfile2.ran
-rw-rw-r-- 1 dorinand dorinand 1000M dub 21 12:37 testfile.zer
Why are the test files generated from /dev/urandom about three times smaller than expected? I would expect testfile1.ran to be 1000M and testfile2.ran to be 2000M. Could anybody explain why this is happening? How should I generate a random test file?
`dd` does not reliably copy data from its input to its output: a single read() can return fewer bytes than requested, and by default `dd` just writes out the short block and moves on. Your question is basically a duplicate of [Why does dd from /dev/random give different file sizes?](unix.stackexchange.com/questions/32988/…), with the twist that in the past you could get away with `dd` on Linux's `/dev/urandom`, but this is no longer true: current kernels cap a single read from `/dev/urandom` at 32 MiB, so each of your 100M blocks only contains 32M of data (10 × 32M ≈ 320M, 20 × 32M ≈ 640M).
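
As a sketch of the fix, assuming GNU dd and GNU head (both standard on Linux): iflag=fullblock tells dd to retry short reads until each block is actually full, and head -c sidesteps dd entirely. Reusing the paths from your commands:

dd status=progress iflag=fullblock if=/dev/urandom of=/tmp/testfile1.ran bs=100M count=10
head -c 1G /dev/urandom > /tmp/testfile1.ran

Either way the result should be the full 1000M, because short reads are either retried (fullblock) or never show up as short output (head).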