
How to speed up/optimize file writes in my program

OK. I am supposed to write a program that takes a 20 GB file with 1,000,000,000 records as input and creates some kind of index for faster access. I have basically decided to split the 1 billion records into 10 buckets, with 10 sub-buckets within each. I calculate two hash values for each record to locate its appropriate bucket. I create 10*10 files, one for each sub-bucket, and as I hash each record from the input file, I decide which of the 100 files it goes to and append the record's offset to that file.

I have tested this with a sample file of 10,000 records, repeating the process 10 times to effectively emulate a 100,000-record file. That takes around 18 seconds, which means it is going to take forever to do the same for a 1 billion record file. Is there any way I can speed up/optimize my writing? I am going through all this because I can't store all the records in main memory.

import java.io.*;

// PROGRAM DOES THE FOLLOWING

// 1. READS RECORDS FROM A FILE.
// 2. CALCULATES TWO HASH VALUES N, M FOR EACH RECORD.
// 3. APPENDS THE OFFSET OF THAT RECORD IN THE ORIGINAL FILE TO THE FILE "NM.TXT",
//    WHERE N AND M ARE REPLACED BY THE TWO HASH VALUES.

class storage
{

    public static int siz = 10;
    public static FileWriter[][] f;
}

class proxy
{

    static String[][] virtual_buffer;
    public static void main(String[] args) throws Exception
    {
        virtual_buffer = new String[storage.siz][storage.siz]; // TEMPORARY STRING BUFFER TO REDUCE WRITES


        String s,tes;


        for (int y = 0; y < storage.siz; y++)
        {
            for (int z = 0; z < storage.siz; z++)
            {
                virtual_buffer[y][z] = "";   // INITIALISING ALL ELEMENTS TO EMPTY STRINGS
            }
        }




        int offset_in_file = 0;          

        long start = System.currentTimeMillis();


        // READING THE SAME INPUT FILE 20 TIMES TO EMULATE A SINGLE BIGGER FILE OF 20 * ITS SIZE
        for (int h = 0; h < 20; h++)
        {
            BufferedReader in = new BufferedReader(new FileReader("outTest.txt"));
            while ((s = in.readLine()) != null)
            {
                tes = s.split(";")[0];
                int n = calcHash(tes);             // FIRST HASH VALUE
                int m = calcHash2(tes);            // SECOND HASH VALUE
                index_up(n, m, offset_in_file);    // APPEND THE OFFSET TO THE APPROPRIATE FILE, I.E. NM.TXT
                offset_in_file++;
            }
            in.close();
        }



         System.out.println(offset_in_file);
         long end = System.currentTimeMillis();
         System.out.println((end-start));
    }




    static int calcHash(String s)
    {
        char[] charr = s.toCharArray();
        int i, tot = 0;
        for (i = 0; i < charr.length; i++)
        {
            if (i % 2 == 0) tot += (int) charr[i];   // SUM THE CHARACTERS AT EVEN POSITIONS
        }
        tot = tot % storage.siz;
        return tot;
    }




    static int calcHash2(String s)
    {
        char[] charr = s.toCharArray();
        int i, tot = 1;
        for (i = 0; i < charr.length; i++)
        {
            if (i % 2 == 1) tot += (int) charr[i];   // SUM THE CHARACTERS AT ODD POSITIONS
        }
        tot = tot % storage.siz;
        if (tot < 0)
            tot = -tot;
        return tot;
    }


    static void index_up(int a, int b, int off) throws Exception
    {
        virtual_buffer[a][b] += Integer.toString(off) + "'";   // BUFFER THE OFFSETS IN MEMORY AND ONLY
        if (virtual_buffer[a][b].length() > 2000)              // WRITE WHEN > 2000 CHARS, TO REDUCE THE NO. OF WRITES
        {
            String file = "c:\\adsproj\\" + a + b + ".txt";
            new writethreader(file, virtual_buffer[a][b]);     // DOING THE ACTUAL WRITE IN A THREAD
            virtual_buffer[a][b] = "";
        }
    }
}



class writethreader implements Runnable
{
    Thread t;
    String name, data;
    writethreader(String name, String data)
    {
        this.name = name;
        this.data = data;

        t = new Thread(this);
        t.start();
    }

    public void run()
    {
        try
        {
            FileWriter fstream = new FileWriter(name, true);   // APPEND MODE; CREATES THE FILE IF IT DOESN'T EXIST
            fstream.write(data);
            fstream.flush();
            fstream.close();
        }
        catch (Exception e) { e.printStackTrace(); }           // DON'T SWALLOW ERRORS SILENTLY
    }
}


Consider using VisualVM to pinpoint the bottlenecks. Everything else below is based on guesswork - and performance guesswork is often really, really wrong.

I think you have two issues with your write strategy.

The first is that you're starting a new thread on each write; the second is that you're re-opening the file on each write.

The thread problem is especially bad, I think, because I don't see anything preventing one thread's write to a file from overlapping with another's. What happens then? Frankly, I don't know - but I doubt it's good.

Consider, instead, creating an array of open files for all 100. Your OS may have a problem with this - but I think probably not. Then create a queue of work for each file. Create a set of worker threads (100 is too many - think 10 or so) where each "owns" a set of files that it loops through, outputting and emptying the queue for each file. Pay attention to the interthread interaction between queue reader and writer - use an appropriate queue class.
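For illustration, here is a rough sketch of that design - all the names (BucketWriterPool, enqueue, shutdown) are mine, not from the question: all 100 writers stay open, each file gets its own LinkedBlockingQueue, and 10 worker threads each own one row of 10 files, so no file is ever written by two threads at once.

import java.io.*;
import java.util.concurrent.*;

class BucketWriterPool
{
    static final int SIZ = 10;

    private final BufferedWriter[][] writers = new BufferedWriter[SIZ][SIZ];
    @SuppressWarnings("unchecked")
    private final BlockingQueue<String>[][] queues = new BlockingQueue[SIZ][SIZ];
    private final Thread[] workers = new Thread[SIZ];
    private volatile boolean done = false;

    BucketWriterPool(String dir) throws IOException
    {
        // Open all 100 files once, up front, instead of once per write.
        for (int a = 0; a < SIZ; a++)
            for (int b = 0; b < SIZ; b++)
            {
                writers[a][b] = new BufferedWriter(new FileWriter(dir + a + b + ".txt", true));
                queues[a][b] = new LinkedBlockingQueue<String>();
            }

        // One worker per row: each thread owns 10 files, so no locking
        // is needed around the writes themselves.
        for (int a = 0; a < SIZ; a++)
        {
            final int row = a;
            workers[a] = new Thread(new Runnable() {
                public void run() { drainRow(row); }
            });
            workers[a].start();
        }
    }

    // Producer side: called from the hashing loop in place of index_up's write.
    void enqueue(int a, int b, int off) throws InterruptedException
    {
        queues[a][b].put(off + "'");
    }

    // Worker side: loop over the row's 10 queues, writing whatever has arrived.
    private void drainRow(int row)
    {
        try
        {
            while (!done)
                if (drainOnce(row) == 0)
                    Thread.sleep(1);      // back off briefly when all queues are empty
            drainOnce(row);               // final pass for items queued just before shutdown
            for (int b = 0; b < SIZ; b++)
                writers[row][b].close();
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    private int drainOnce(int row) throws IOException
    {
        int written = 0;
        for (int b = 0; b < SIZ; b++)
        {
            String item;
            while ((item = queues[row][b].poll()) != null)
            {
                writers[row][b].write(item);
                written++;
            }
        }
        return written;
    }

    // Call after the input file has been fully read.
    void shutdown() throws InterruptedException
    {
        done = true;
        for (Thread t : workers) t.join();
    }
}

With something like this in place, index_up would just call pool.enqueue(n, m, offset_in_file), pool.shutdown() would run after the read loop, and the BufferedWriter takes over the batching that virtual_buffer currently does by hand.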


I would throw away the entire requirement and use a database.
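For what it's worth, a minimal sketch of that approach, assuming an embedded database such as H2 is on the classpath (the table and column names here are made up): bulk-load the (key, offset) pairs with batched inserts, then let the database build the index once at the end.

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.*;

class DbIndexer
{
    public static void main(String[] args) throws Exception
    {
        // Embedded H2 database stored in the working directory.
        try (Connection con = DriverManager.getConnection("jdbc:h2:./recindex"))
        {
            con.createStatement().execute(
                "CREATE TABLE IF NOT EXISTS rec (k VARCHAR(255), off BIGINT)");
            con.setAutoCommit(false);   // commit in batches, not once per row

            PreparedStatement ins = con.prepareStatement("INSERT INTO rec VALUES (?, ?)");
            try (BufferedReader in = new BufferedReader(new FileReader("outTest.txt")))
            {
                String s;
                long offset = 0;
                while ((s = in.readLine()) != null)
                {
                    ins.setString(1, s.split(";")[0]);   // same key field as the question
                    ins.setLong(2, offset++);
                    ins.addBatch();
                    if (offset % 10000 == 0)
                    {
                        ins.executeBatch();
                        con.commit();
                    }
                }
                ins.executeBatch();
                con.commit();
            }

            // Build the index once, after the bulk load - far cheaper than
            // maintaining it during a billion individual inserts.
            con.createStatement().execute("CREATE INDEX IF NOT EXISTS rec_k ON rec(k)");
            con.commit();
        }
    }
}

Lookups then become a plain SELECT off FROM rec WHERE k = ?, and the database handles the paging that the 100 bucket files were approximating by hand.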

