
How can I create 1000 files that I can use to test a script?

I would like to create 1000+ text files, each containing some text, to test a script. How can I create that many text files in one go, using a shell script or Perl? Could anyone help me?


for i in {0001..1000}   # zero-padded brace ranges need Bash 4+
do
  echo "some text" > "file_${i}.txt"
done

or, if you want to use Python (this style works even before 2.6, which introduced the with statement):

for x in range(1000):
    open("file%03d.txt" % x,"w").write("some text")


#!/bin/bash
seq 1 1000 | split -l 1 -a 3 -d - file

The above will create 1000 files, each containing one number from 1 to 1000. The files will be named file000 through file999.
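To sanity-check the result, here is a quick sketch that runs the same command in a scratch directory and inspects the output (the mktemp scratch directory is my addition, not part of the original answer):

```shell
#!/bin/bash
# run in a scratch directory so the 1000 files don't clutter anything
cd "$(mktemp -d)"
seq 1 1000 | split -l 1 -a 3 -d - file
ls file* | wc -l     # prints 1000
cat file000          # prints 1  (first input line)
cat file999          # prints 1000 (last input line)
```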


In Perl:

use strict;
use warnings;

for my $i (1..1000) {
    open(my $out, ">", sprintf("file%04d", $i)) or die "Cannot open: $!";
    print $out "some text\n";
    close $out;
}

Why the first two lines? Because they are good practice, so I use them even in one-shot programs like this.

Regards, Offer


For variety:

#!/usr/bin/perl

use strict; use warnings;
use File::Slurp;

write_file $_, "$_\n" for map sprintf('file%04d.txt', $_), 1 .. 1000;


#!/bin/bash

for suf in $(seq -w 1000)
do
    cat << EOF > "myfile.$suf"
this is my text file
there are many like it
but this one is mine.
EOF
done


I don't know shell or Perl, but in Python it would be:

#!/usr/bin/python

for i in xrange(1000):
    with open('file%03d' % i, 'w') as fd:
        fd.write('some text')

I think what it does is pretty straightforward.


You can use only Bash with no externals and still be able to pad the numbers so the filenames sort properly (if needed):

read -r -d '' text << 'EOF'
Some text for
my files
EOF

for i in {1..1000}
do
    printf -v filename "file%04d" "$i"
    echo "$text" > "$filename"
done

Bash 4 can do it like this:

for filename in file{0001..1000}; do echo "$text" > "$filename"; done

Both versions produce filenames like "file0001" and "file1000".
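As a quick check of the Bash 4 version (a sketch; the scratch directory and the ls checks are my additions):

```shell
#!/bin/bash
# Bash 4+ zero-pads the range because the endpoints are padded
cd "$(mktemp -d)"
text="some text"
for filename in file{0001..1000}; do echo "$text" > "$filename"; done
ls | wc -l      # prints 1000
ls | head -n 1  # prints file0001
```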


Just take any big file that has more than 1000 bytes (to get 1000 files with content); there are plenty on your computer. Then do, for example:

split -n 1000 /usr/bin/firefox

This is nearly instantaneous (note that -n requires GNU split).

Or, for more files:

split -n 10000 /usr/bin/cat

This took only 0.253 seconds to create 10000 files.

For 100k files:

split -n 100000 /usr/bin/gcc

Only 1.974 seconds for 100k files with about 5 bytes each.

If you want files that contain text, look in your /etc directory. For example, create one million text files with almost-random text:

split -n 1000000 /etc/gconf/schemas/gnome-terminal.schemas

20.203 seconds for 1M files of about 2 bytes each. If you split the same file into only 10k parts, it takes just 0.220 seconds, and each file gets 256 bytes of text.


Here is a short command-line Perl program (golfed; it relies on symbolic filehandles, so it only works without strict):

perl -E'say $_ $_ for grep {open $_, ">f$_"} 1..1000'

