Sorting a file containing a huge amount of data

Consider a file containing N words, one word per line. The file is too big, so the whole of it cannot be read into memory at one time.

My answer: Divide the file into k chunks, so the size of each chunk is x = N/k.

Read one chunk into memory at a time, sort it, and write it back to the file. Do this for all k chunks.

Now do a k-way merge of the sorted chunks.
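A minimal Python sketch of the whole scheme, assuming one word per line and a chunk budget of x = N/k lines (the function and variable names are mine, nothing standard):

    import heapq
    import itertools
    import os
    import tempfile

    def external_sort(input_path, output_path, chunk_size):
        # Phase 1: read x = N/k lines at a time, sort in memory, write a run.
        run_paths = []
        with open(input_path) as f:
            while True:
                chunk = list(itertools.islice(f, chunk_size))
                if not chunk:
                    break
                chunk.sort()                              # O(x log x) per chunk
                run = tempfile.NamedTemporaryFile(mode="w", delete=False)
                run.writelines(chunk)
                run.close()
                run_paths.append(run.name)

        # Phase 2: k-way merge. heapq.merge keeps a heap of the k run heads,
        # so producing each of the N output lines costs O(log k).
        runs = [open(p) for p in run_paths]
        with open(output_path, "w") as out:
            out.writelines(heapq.merge(*runs))
        for r in runs:
            r.close()
        for p in run_paths:
            os.remove(p)

Memory stays at roughly one chunk plus k open file handles, since heapq.merge streams the runs lazily.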

How do I analyze the total time complexity?

Time for sorting each chunk = x log x (assuming I use quicksort)

Time for merging the k chunks = k log k (is it?)

So what is the total time complexity?

I am weak at analyzing time complexity.
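In case it matters, here is roughly the k-way merge I have in mind (a sketch with an explicit min-heap over the chunk heads; the names are mine):

    import heapq

    def k_way_merge(chunk_iters, out):
        # Seed a min-heap with the first word of each sorted chunk;
        # the heap never holds more than k entries.
        heap = []
        for i, it in enumerate(chunk_iters):
            word = next(it, None)
            if word is not None:
                heap.append((word, i))
        heapq.heapify(heap)
        while heap:
            word, i = heapq.heappop(heap)          # O(log k)
            out.write(word)
            nxt = next(chunk_iters[i], None)
            if nxt is not None:
                heapq.heappush(heap, (nxt, i))     # O(log k)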


Each chunk is of size N/k and the number of chunks is k.

So, the total time complexity would be

reading the k chunks (of N/k words each) + sorting each of those chunks, i.e., k · O((N/k) log(N/k)) = O(N log(N/k)), + merging those k sorted chunks with the k-way merge, O(N log k) (not O(k log k): the heap holds k entries, but you pop and push once per word, N times in total), + writing the file back.

So, it would be IO time + O(N log(N/k) + N log k) = IO time + O(N log N).
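Spelled out (a worked version of the same bound, with x = N/k):

    % Sort k chunks of x = N/k words each, then k-way merge all N words:
    \begin{align*}
      T_{\text{sort}}  &= k \cdot O(x \log x) = O\!\big(N \log \tfrac{N}{k}\big)\\
      T_{\text{merge}} &= O(N \log k)\\
      T_{\text{total}} &= O\!\big(N \log \tfrac{N}{k} + N \log k\big) = O(N \log N)
    \end{align*}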

I am learning these things too, so it would be great if someone could analyze this and correct me if I'm wrong.

-Pavan.
