One of my favourite courses I have taken so far at the University of Waterloo is Scheduling (CO 454). The course had interesting and applicable material, a great prof, and did not require a lot of background knowledge. This post will be a crash course on scheduling! This introduction will not assume any background knowledge in optimization or scheduling, but it will use some basic discrete math and mathematical notation. If you hate math, stop reading now!
The study of scheduling is exactly what it sounds like. Given a set of jobs that need to be completed, you want to create a schedule for those jobs that optimizes an objective function.
An example of a scheduling problem is your homework assignments. You have four assignments due next week, and they all have different due dates. You might want to create a schedule for you to complete all assignments as soon as possible. Or it could be that you want to finish the more important assignments first, and work on the less important assignments if you have time. The order in which you tackle your assignments will depend on what you are optimizing for.
A General Scheduling Problem
A general scheduling problem consists of three items: jobs, machines, and an objective function.
A job j can have various attributes. It may have some (or all) of the following properties:
p_j: processing time
r_j: release time (the job can't be processed before this)
d_j: due date (the job should be completed by this time)
w_j: weight (importance of this job relative to other jobs)
… and more
The machines process the jobs. There may be one or several machines, and the machines need not be identical (e.g. machine A might be twice as fast at processing jobs as machine B). We will only be dealing with one machine in this post.
The objective function is the mathematical function we are trying to optimize.
With the above definitions for a specific scheduling problem, we can create feasible schedules. For example, let’s say we have one machine, and three jobs, with processing times 1, 2, and 3. Then this is a Gantt chart of one feasible schedule:
Let us make the assumption that there is no pre-emption and no unnecessary idle time. This means that a job, once started, runs to completion, and a machine will never be idle (free) if there are any jobs available to process. With this assumption, there are only finitely many feasible schedules to the above problem. Specifically, there are 3! = 6 feasible schedules, as the only thing that characterizes a feasible schedule is the order in which the jobs are placed. These are the 6 feasible schedules for the above problem:
This means that for any objective function, we can simply produce all feasible schedules (since there are only finitely many), and find the one with the best objective value. However, since there are n! feasible schedules for n jobs, this process will take longer than we would like (what we want is polynomial time!!). For many objective functions, there exist very efficient algorithms to find optimal or near-optimal solutions.
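For tiny instances, this brute-force idea is easy to sketch in code. Here is a minimal Python sketch (the function names are my own) that enumerates every order of the three jobs above and keeps the best one for a given objective; total completion time is used as an example objective:

```python
from itertools import permutations

def completion_times(order):
    """Given processing times in schedule order, return each job's completion time."""
    times, t = [], 0
    for p in order:
        t += p
        times.append(t)
    return times

def brute_force(processing_times, objective):
    """Try every permutation of the jobs and return (best order, best objective value)."""
    best_order, best_value = None, float("inf")
    for order in permutations(processing_times):
        value = objective(completion_times(order))
        if value < best_value:
            best_order, best_value = order, value
    return best_order, best_value

# Example: minimize total completion time for jobs with processing times 1, 2, 3.
order, value = brute_force([1, 2, 3], objective=sum)
print(order, value)  # (1, 2, 3) 10
```

Enumerating all n! orders works here, but it is exactly the exponential-time approach the efficient algorithms below avoid.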
Lateness and Tardiness
To do a simple example, we must first define the notions of lateness and tardiness.
Given a specific schedule, denote C_j as the completion time of job j.
Then, we can define the lateness of job j, L_j, as
L_j = C_j − d_j
where d_j is the due date of job j.
So a job’s lateness is 2 if the job was completed 2 time units after its deadline.
Note that with this definition, a job can have a negative lateness, which implies the job was early. This notion of a negative lateness may seem unintuitive; thus we create another definition called tardiness.
Tardiness, T_j, is never negative. Hence it is equal to the lateness if the job is late, but is 0 if the job is early or on time. Thus, T_j = max(0, L_j) for any specific job in a schedule.
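In code, the two definitions are one-liners. A minimal sketch (the example numbers are made up for illustration):

```python
def lateness(completion, due):
    """L_j = C_j - d_j; negative means the job finished early."""
    return completion - due

def tardiness(completion, due):
    """T_j = max(0, L_j); never negative."""
    return max(0, completion - due)

# A job that finishes at time 7 with due date 5 is 2 units late:
print(lateness(7, 5))   # 2
print(tardiness(7, 5))  # 2
# A job that finishes at time 3 with due date 5 is early:
print(lateness(3, 5))   # -2
print(tardiness(3, 5))  # 0
```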
Examples of Objective Functions
With the above definitions set, let's try to minimize the following objective functions:
L_max = max_j L_j and T_max = max_j T_j
Given a specific schedule, L_max is the value of the lateness of the job that is the most late. (Same for T_max with tardiness.)
We denote L*_max and T*_max to be the values of the optimal solutions for L_max and T_max respectively. (We know that the optimal values exist since there are only finitely many feasible schedules.)
Optimal L_max implies optimal T_max
Note that a schedule that is optimal for L_max is also optimal for T_max given the same problem.
- If L*_max ≤ 0, then T_max = 0 for that schedule, which is optimal since T_max ≥ 0 for every schedule.
- If L*_max > 0, then T_max = L_max = L*_max for that schedule, and T_max ≥ L_max (this inequality always holds). Since every schedule has L_max ≥ L*_max, no schedule can have T_max below L*_max, so this schedule is optimal.
However, the converse is not true. Finding an optimal solution for T_max does not correspond to an optimal solution for L_max given the same problem. Here is a counterexample (the numbers are illustrative): two jobs, where job 1 has processing time 1 and due date 3, and job 2 has processing time 2 and due date 4.
With the above numbers, it turns out that any schedule gives an optimal T_max. No matter how the jobs are placed, all jobs always finish by time 3. Since all the due dates are at least 3, all jobs will always be early or on time. So T_max = 0 for all feasible schedules.
However, since L_max cares about how early a job is, there is only one optimal solution for L_max: schedule job 1 first, then job 2. This gives L_max = -1, whereas the other order gives L_max = 0.
So it is not true that a schedule that optimizes T_max also optimizes L_max.
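A counterexample of this flavour can be verified by enumeration. The sketch below uses a two-job instance I chose for illustration (processing times 1 and 2, due dates 3 and 4): both orders achieve T_max = 0, but only one achieves the best L_max:

```python
from itertools import permutations

jobs = [(1, 3), (2, 4)]  # (processing time, due date); illustrative instance

for order in permutations(jobs):
    t, lateness_values = 0, []
    for p, d in order:
        t += p
        lateness_values.append(t - d)  # L_j = C_j - d_j
    L_max = max(lateness_values)
    T_max = max(0, L_max)  # T_max = max(0, L_max) on this instance
    print(order, "L_max =", L_max, "T_max =", T_max)
# Both orders give T_max = 0, but only job 1 first achieves L_max = -1.
```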
Optimal Algorithm for L_max and T_max
When given a set of jobs and due dates, the most intuitive way to schedule them would be to put the job with the earliest due date first. It turns out that this earliest-due-date (EDD) rule gives an optimal schedule for both L_max and T_max.
Concretely, if d_i is the due date of the i-th job in a schedule, then a schedule such that d_1 ≤ d_2 ≤ … ≤ d_n yields an optimal schedule for L_max and T_max. This can be proven using contradiction: swap two adjacent jobs that are out of order and show the objective does not get worse.
This is great! Now we have a very efficient algorithm to optimize L_max and T_max for any given problem.
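The EDD rule is just a sort on due dates. A minimal Python sketch (the job data is invented for illustration):

```python
def edd_schedule(jobs):
    """Sort jobs by due date (EDD); optimal for L_max and T_max on one machine.
    Each job is a (processing_time, due_date) pair."""
    return sorted(jobs, key=lambda job: job[1])

def max_lateness(schedule):
    """Compute L_max for a schedule of (processing_time, due_date) pairs."""
    t, worst = 0, float("-inf")
    for p, d in schedule:
        t += p
        worst = max(worst, t - d)
    return worst

jobs = [(2, 6), (1, 4), (3, 5)]  # hypothetical instance
schedule = edd_schedule(jobs)
print(schedule)                # [(1, 4), (3, 5), (2, 6)]
print(max_lateness(schedule))  # 0 -- the last job finishes at time 6, so L_max >= 0 always
```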
Unfortunately, those objective functions are not terribly useful. With the above objective functions, having one job late by 3 time units and the rest of the jobs early is worse than having all the jobs late by 2 time units. This does not seem entirely intuitive. Hence we can define a new pair of objective functions: the sum of the lateness, and the sum of the tardiness, over every job.
New Objective Functions: ∑ L_j, ∑ T_j
To optimize ∑ L_j, one simply has to order the jobs in non-decreasing processing time. (i.e. Do the shortest jobs first!) If p_i is the processing time of the i-th job in a schedule, a schedule is optimal if and only if p_1 ≤ p_2 ≤ … ≤ p_n. (Note that this is an if and only if, whereas the last algorithm was not.) This works because ∑ L_j = ∑ C_j − ∑ d_j, and ∑ d_j is a constant, so minimizing the total lateness is the same as minimizing the total completion time.
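The shortest-processing-time (SPT) rule is likewise a single sort. A minimal sketch (the instance is invented for illustration):

```python
def spt_schedule(jobs):
    """Sort jobs by processing time (SPT); minimizes the sum of lateness
    (equivalently, total completion time) on a single machine.
    Each job is a (processing_time, due_date) pair."""
    return sorted(jobs, key=lambda job: job[0])

def total_lateness(schedule):
    """Compute the sum of L_j over a schedule of (processing_time, due_date) pairs."""
    t, total = 0, 0
    for p, d in schedule:
        t += p
        total += t - d
    return total

jobs = [(3, 4), (1, 2), (2, 6)]  # hypothetical instance
schedule = spt_schedule(jobs)
print(schedule)                 # [(1, 2), (2, 6), (3, 4)]
print(total_lateness(schedule)) # -2 (a negative sum just means jobs are early on average)
```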
So optimizing ∑ L_j turned out to be another very simple algorithm. What about ∑ T_j?
Finding the optimal solution for ∑ T_j turns out to be NP-hard. Although lateness, L_j, and tardiness, T_j, are very similar functions, it turns out that tardiness is much harder to deal with. With a dynamic programming approach, the best we can get for ∑ T_j is pseudo-polynomial time. I won't go into the details of this algorithm, but it is definitely much harder than solving for ∑ L_j.
The above is probably what you would learn in the first two classes if you were to take Scheduling (CO 454). In the course, you learn about many more scheduling problems and the corresponding algorithms and proofs. In addition, you learn about techniques such as dynamic programming and approximation algorithms to tackle the problems that have no known efficient exact algorithm. The course had a very good mix of rigorous content, applicable material, and algorithms. I highly recommend it!