
Algorithm - minimizing total tardiness

I am trying to implement an exact algorithm for minimizing total tardiness on a single machine. I searched the web to get an idea of how to implement it using dynamic programming. I read the paper by Lawler, who proposed a pseudopolynomial algorithm back in '77. However, I still have not been able to map it to Java or C# code.

Could you please help me by providing some references on how to implement this exact algorithm effectively?

Edit-1: @bcat: not really. I need to implement it for our software. :( I am still not able to find any guidance on how to implement it. The greedy one is easy to implement, but the resulting schedule is not that impressive.

Kind Regards,

Xiaon


You could have specified the exact characteristics of your particular problem, along with the limits on the different variables and the overall characteristics of the input.

Without this, I assume you use this definition:

You want to minimize the total tardiness in a single machine scheduling problem with n independent jobs, each having its own processing time and due date.

What you want to do is pick a subset of the jobs so that they do not overlap (only a single machine is available); you can also pick the order in which you do them while respecting the due dates.

I guess the first step is to sort the jobs by due date; there seems to be no benefit in sorting them any other way.
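
For illustration, here is a minimal Java sketch of that sorting step. The Job class and its field names are my own assumptions, not something given in the question:

import java.util.Arrays;
import java.util.Comparator;

// Hypothetical holder for one job; processing time and due date are
// the only two attributes the approach described here needs.
class Job {
    final int processingTime;
    final int dueDate;

    Job(int processingTime, int dueDate) {
        this.processingTime = processingTime;
        this.dueDate = dueDate;
    }
}

class EddSort {
    // Earliest-due-date-first order, as suggested above.
    static void sortByDueDate(Job[] jobs) {
        Arrays.sort(jobs, Comparator.comparingInt((Job j) -> j.dueDate));
    }
}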

What is left is to pick the subset, and this is where dynamic programming comes in. I'll try to describe the state and the recurrence relation.

State:

[current time][current job] = maximum number of jobs done

Relation:

You either process the current job and call

f(current time + processing_time_of[current job], current job + 1)

or you skip the current job and call

f(current time, current job + 1)

Finally, you take the better of the two values returned by those calls (the maximum, since the state counts jobs done) and store it in state

[current time][current job]

And you start the recursion at time 0 and job 0.
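
Here is a memoized Java sketch of that recursion, under the assumptions above: the jobs are already sorted by due date, the time dimension is bounded by the total processing time, and a job is only processed if it can still finish by its due date (so the state counts jobs completed on time). Class, method, and array names are illustrative, not from the original post:

import java.util.Arrays;

class TardinessDp {
    private final int[] p;      // processing times, in due-date order
    private final int[] d;      // due dates, sorted ascending
    private final int[][] memo; // memo[time][job], -1 means "not computed yet"

    // Assumes the caller already sorted the jobs by due date (see above).
    TardinessDp(int[] processingTimes, int[] dueDates) {
        p = processingTimes;
        d = dueDates;
        int horizon = Arrays.stream(p).sum(); // pseudopolynomial time dimension
        memo = new int[horizon + 1][p.length];
        for (int[] row : memo) {
            Arrays.fill(row, -1);
        }
    }

    // f(time, job) = maximum number of jobs from index `job` onwards that can
    // still be completed by their due dates, given the machine is free at `time`.
    int f(int time, int job) {
        if (job == p.length) {
            return 0;
        }
        if (memo[time][job] != -1) {
            return memo[time][job];
        }
        // Option 1: skip the current job.
        int best = f(time, job + 1);
        // Option 2: process it, if it can still meet its due date.
        if (time + p[job] <= d[job]) {
            best = Math.max(best, 1 + f(time + p[job], job + 1));
        }
        return memo[time][job] = best;
    }
}

Calling f(0, 0), as described above, gives the best value over all jobs. Note that this sketch stores the count of on-time jobs in the [time][job] state; if every job must be scheduled and the objective is total tardiness itself, the same state space can store the minimum tardiness instead. Either way, the time dimension being bounded by the sum of processing times is what makes the approach pseudopolynomial.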

By the way, a greedy approach seems to do pretty well; check this out:

http://www.springerlink.com/content/f780526553631475/


For a single machine, the Longest Processing Time schedule (also known as the Decreasing-Time Algorithm) is optimal if you know all the jobs ahead of time and can sort them from longest to shortest (so all you need to do is sort).

As has been mentioned already, you've specified no information about your scheduling problem, so it is hard to help beyond this (as no universally optimal and efficient scheduler exists).


Maybe the Job Scheduling Section of the Stony Brook Algorithm Repository can help you.

