36

I am looking for a command to create multiple (thousands of) files containing at least 1KB of random data.

For example,

Name            Size
file1.01        2K
file2.02        3K
file3.03        5K
etc.

How can I create many files like this?

3
  • What did you mean by “exclusive”? It doesn't make sense in context, so you probably used the wrong word. Commented May 1, 2015 at 21:54
  • I meant that files can not have the same content. Commented May 2, 2015 at 10:59
  • So, exclusive would have meant unique. Commented Feb 28, 2019 at 22:09

6 Answers

49

Since you don't have any other requirements, something like this should work:

#! /bin/bash
# create 1000 files sized between 1024 and 33791 bytes ($RANDOM is 0..32767)
for n in {1..1000}; do
    dd if=/dev/urandom of=file$( printf %03d "$n" ).bin bs=1 count=$(( RANDOM + 1024 ))
done

(this needs bash at least for {1..1000}).
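For shells without {1..1000} or $RANDOM, a rough POSIX-sh sketch might look like the following; it still relies on the non-POSIX /dev/urandom, and uses od to derive the random size, both of which are assumptions about the platform:

#!/bin/sh
n=1
while [ "$n" -le 1000 ]; do
    # read two bytes from /dev/urandom as an unsigned 16-bit number (0..65535)
    r=$(od -An -N2 -tu2 /dev/urandom | tr -d ' ')
    dd if=/dev/urandom of="file$(printf %04d "$n").bin" bs=1 count=$(( r + 1024 )) 2>/dev/null
    n=$(( n + 1 ))
done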

4
  • 2
    This needs bash for numerous reasons, including $((…)) and $RANDOM.  Even $(…) might not exist in every shell. Commented May 7, 2015 at 21:30
  • 2
    @G-Man, in any case, none of those features are specific to bash, nor did they originate in bash ({1..1000} comes from zsh; for n in ...; done and variable expansion come from the Bourne shell; $(...), $((...)) and $RANDOM come from ksh). The features that are not POSIX are {1..1000}, $RANDOM and /dev/urandom. Commented Mar 3, 2017 at 13:46
  • 1
    If you wanted 1..1000 to be constant width, you need "%04d", in which case bash or zsh can do {0001..1000} with no printf. Commented Mar 3, 2017 at 16:32
  • 1
    Additional feature: I need a way to spread these files over lots of randomly named nested subdirectories. Commented Feb 14, 2018 at 18:09
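A minimal sketch responding to that last comment, assuming two levels of directories with hypothetical names dirN (the depth and fan-out here are arbitrary choices, not anything from the answer above):

#! /bin/bash
for n in {1..1000}; do
    # two random directory levels, ten possible names each (hypothetical layout)
    d="dir$(( RANDOM % 10 ))/dir$(( RANDOM % 10 ))"
    mkdir -p "$d"
    dd if=/dev/urandom of="$d/file$( printf %03d "$n" ).bin" bs=1 count=$(( RANDOM + 1024 ))
done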
16

A variation with seq, xargs, dd and shuf:

seq -w 1 10 | xargs -n1 -I% sh -c 'dd if=/dev/urandom of=file.% bs=$(shuf -i1-10 -n1) count=1024'

Explanation, as requested in the comments:

seq -w 1 10 prints a sequence of numbers from 01 to 10

xargs -n1 -I% executes the command sh -c 'dd ... % ...' once per number, replacing the % with it

dd if=/dev/urandom of=file.% bs=$(shuf ...) count=1024 creates each file, fed from /dev/urandom, as 1024 blocks whose block size is set by

shuf -i1-10 -n1, which picks a random value from 1 to 10, so the files end up between 1 KB and 10 KB
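To check that the resulting sizes land in that 1 KB to 10 KB range, something like this works (stat -c is a GNU coreutils assumption; on BSD/macOS the flag differs):

stat -c '%n %s' file.* | sort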

3
  • What exactly does this do? Commented Mar 3, 2017 at 13:00
  • 1
    @saru95 explanation added. Commented Mar 3, 2017 at 13:35
  • This creates 10 files with random size. Change "10" to the number of files wanted. Commented Jan 15, 2019 at 22:38
7

This uses a single pipeline and seems fairly fast, but has the limitation that all of the files are the same size:

dd if=/dev/urandom bs=1024 count=10240 | split -a 4 -b 1k - file.

Explanation: Use dd to create 10240*1024 bytes of data; split that into 10240 separate files of 1k each (names are assigned alphabetically, starting from 'file.aaaa'; with -a 4 the suffix space allows up to 26^4 = 456976 files)
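If alphabetic suffixes are inconvenient, GNU split can produce numeric ones instead; -d is a GNU extension, so this variant is an assumption about the platform (-a 5 keeps enough digits for all 10240 files):

dd if=/dev/urandom bs=1024 count=10240 | split -a 5 -d -b 1k - file.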

5

This will create 15 files, each containing 1 MB of random data:

for i in {001..015}; do head -c 1M </dev/urandom >randfile$i; done
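To vary the sizes, as the question asks, the fixed 1M can be replaced with a per-file random byte count; shuf here is an assumption that GNU coreutils is available:

for i in {001..015}; do head -c "$(shuf -i 1024-10240 -n 1)" </dev/urandom >randfile$i; done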
2

You can do something like this:

#!/bin/bash
filecount=0
while [ "$filecount" -lt 10000 ] ; do
    # pick a size between 1024 and 33791 bytes ($RANDOM is 0..32767)
    filesize=$(( RANDOM + 1024 ))
    # base64-encode the random stream and keep the first $filesize bytes;
    # the file name gets a random suffix as well
    base64 /dev/urandom |
    head -c "$filesize" > /tmp/file${filecount}.$RANDOM
    ((filecount++))
done
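A quick sanity check after the loop finishes (stat -c is a GNU coreutils assumption; the names carry a random suffix, hence the globs, and the first glob assumes no unrelated /tmp/file* entries exist):

ls /tmp/file* | wc -l        # expect 10000
stat -c '%n %s' /tmp/file0.* # each size should be at least 1024 bytes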
2
  • I tried doing this but it did not work. You might want to explain the parameters. :) Commented May 2, 2015 at 10:54
  • very fast thx @rahul Commented Apr 21, 2019 at 9:07
1

Similar to lcd047's answer (it works with bash, even Git Bash on Windows), with the addition that the names are also randomized and the files are 1 MB in size (which might be more efficient disk-wise):

#! /bin/bash
# salt the names with host, working directory and a random number
s=$(hostname).$(pwd).${RANDOM}
for n in {1..1000}; do
    x=$( printf %04d "$n" )
    # hash the counter, a timestamp and the salt into a random-looking name
    name=$( echo "${x}.$(date +%s).${s}" | sha256sum | awk '{print $1}' )
    dd if=/dev/urandom of="${name}${x}.bin" bs=1 count=1048576
done
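Since bs=1 makes dd issue one write() per byte, each file here costs over a million system calls. A sketch of a faster alternative replaces the dd line inside the loop above with head -c, which also guarantees the full byte count (this is a suggested substitution, not part of the original answer):

head -c 1048576 </dev/urandom > "${name}${x}.bin"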
