Chapter 3 Greedy Algorithms

Content in this chapter was originally prepared and edited by Prof Ian Sanders and Prof Hima Vadapalli.

In this section, we will consider the class of algorithms known as greedy algorithms. Greedy algorithms are so named because they search for a global solution by making the best local decision at each point in time. As an example, imagine we are trying to find the maximum value attained by a function, and we begin at some point on the function curve. A greedy approach would simply be to head “uphill” according to the gradient. As you can imagine, this hill-climbing approach will find a maximum, but depending on the shape of the function it may only be a local maximum. As we will see, greedy algorithms do not always produce the optimal result, but they are often fast to run and frequently produce good-enough results. Furthermore, you have already encountered greedy algorithms, such as finding a minimum-weight spanning tree and Huffman coding!

3.1 Summary Lecture

The video below summarises the content in this chapter. You may either watch it first, or read through the notes and watch it as a recap.


3.2 Optimal Service

We will discuss greedy algorithms in the context of the Optimal Service problem. The optimal service problem asks: “Find the minimum total waiting time required to service a group of people who can be served in any order and where each person has a known fixed service time.”

Consider for example the following situation.

Person    Service Time
Kabza     3
Zinaro    2
Zakwe     4
Bucy      1
If we deal with these people in the order given in the table then the total waiting time is derived as follows. While Kabza is being served, Zinaro, Zakwe and Bucy have to wait. This results in a waiting time of \(3 \times 3 = 9\) (three people waiting for 3 units of time – the time required to serve Kabza). Then while Zinaro is being served Zakwe and Bucy have to wait. This results in a further waiting time of \(2 \times 2 = 4\) for a total waiting time so far of 13 units. Then while Zakwe is served Bucy has to wait for 4 units of time. The total waiting time if the people are served in this order is thus \(9 + 4 + 4 = 17\) units of time.

Note that in general the total waiting time for a particular ordering of \(n\) customers is

\[ \sum\limits_{i=1}^{n-1} (n-i) \times t(i) \]

where \(t(i)\) is the time taken to serve the \(i\)th person in that ordering. (The last person's service time does not appear in the sum, since nobody is left waiting while they are served.)
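
To make the formula concrete, here is a small Python sketch (the function name total_waiting_time is our own) that computes the total waiting time for a given serving order; it reproduces the 17 units worked out above.

```python
def total_waiting_time(order):
    """Total waiting time for customers served in the given order of service times."""
    n = len(order)
    # 0-indexed version of the formula above: while the customer in position i is
    # being served, the n - i - 1 customers after them are waiting.
    return sum((n - i - 1) * t for i, t in enumerate(order))

# Kabza, Zinaro, Zakwe and Bucy served in the order given in the table.
print(total_waiting_time([3, 2, 4, 1]))  # 17
```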

3.2.1 Brute Force Solution

Here we would calculate the total waiting time for each ordering of the people and then choose the ordering with the shortest time. Note that with 4 people there are \(4! = 24\) possible orderings. In general for \(n\) people there would be \(n!\) orderings, so this would not be an efficient way to solve the problem.
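
A minimal brute-force sketch along these lines (in Python, with helper names of our own choosing) simply tries every permutation and keeps the best total:

```python
from itertools import permutations

def total_waiting_time(order):
    n = len(order)
    return sum((n - i - 1) * t for i, t in enumerate(order))

def brute_force_min_wait(service_times):
    """Check all n! orderings and return the smallest total waiting time."""
    return min(total_waiting_time(p) for p in permutations(service_times))

print(brute_force_min_wait([3, 2, 4, 1]))  # 10 (serve Bucy, Zinaro, Kabza, Zakwe)
```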

3.2.2 Greedy Solution

In a greedy approach we make choices that seem to be the best possible at every stage. In this problem we are trying to minimise the total waiting time of the customers. An obvious greedy strategy is thus to repeatedly choose to serve the person who is going to be the quickest to deal with in the hope that doing so will result in the other people waiting for a shorter time. It turns out that this is a good strategy — this greedy method will always return the minimum total waiting time!
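
In code, this greedy strategy amounts to nothing more than sorting the customers by service time. A possible sketch (names again our own):

```python
def greedy_min_wait(service_times):
    """Serve customers in non-decreasing order of service time."""
    order = sorted(service_times)
    n = len(order)
    return sum((n - i - 1) * t for i, t in enumerate(order))

print(greedy_min_wait([3, 2, 4, 1]))  # 10, the same answer as the brute force search
```

Sorting takes \(O(n \log n)\) time, compared with the \(n!\) orderings examined by the brute force approach.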

Note that not all problems can be solved optimally by a greedy approach, and even those that can often require some thought to choose the right greedy strategy. If we can solve a problem using a greedy approach we must still prove that the greedy approach always works. For many hard problems (e.g. NP-complete problems) a greedy approach is often a good heuristic for finding an approximate solution.

Before we prove that the strategy presented above will always find an optimal solution let us consider some properties of problems which can be solved using a greedy approach.

3.3 Greedy Algorithm Properties

Greedy algorithms are based on the approach of finding a solution to an optimisation problem by making a number of successive choices, each of which seems to be the “best” choice at the time. Unfortunately this strategy is not guaranteed to produce an optimal solution for all problems. We have seen above that greedy algorithms do sometimes give optimal solutions, so the question arises as to whether (and how) one can tell in advance that a greedy algorithm will solve a particular optimisation problem. It is often quite difficult to do this, but in general we would look for two things in the problem being considered.

  1. The greedy choice property. Here we need to consider whether we can get to a globally optimal solution by making a locally optimal (greedy) choice. In this case, we never need to backtrack or reconsider our previous choices!
  2. Optimal substructure. Here we need to consider if an optimal solution to the problem contains optimal solutions to subproblems within it.

These two ideas are related to each other and often require a fair amount of thought for any problem.

3.4 Proof of Correctness for Optimal Service

To prove that the greedy approach discussed above will lead us to the right answer, let us begin by considering a situation where we have two customers \(x\) and \(y\) with \(t(x) < t(y)\). Our greedy approach would pick the order \(x\), \(y\) in this case. This is clearly the correct solution – the total waiting time is then the smaller of the two service times. This tells us that for any pair of customers we should serve the one with the smaller service time first: if any ordering serves a customer with a longer service time immediately before one with a shorter service time, swapping the two reduces the total waiting time. Generalising this, we should choose the person with the smallest service time from the full set of customers first, after which we are left with a smaller problem of the same type for the remaining customers. We then solve the smaller problem in the same way.

Clearly in this case the local choice leads to a global solution (the greedy choice property). In addition, the optimal solution for \(n\) customers contains an optimal solution for the \(n - 1\) customers that remain once the one with the shortest service time has been served, that solution in turn contains an optimal solution for the remaining \(n - 2\) customers, and so on (optimal substructure).

3.5 Making Change

Let us now consider the problem of making change. Given an infinite supply of coins of given denominations, find the minimum number of coins needed to make up a particular amount of change. The denominations of the coins available to us are 1, 2, 5, 10, 20, 50 and 100 (cents).

For example, we can make up 66 by using \(20 + 20 + 20 + 2 + 2 + 2\), or \(50 + 10 + 2 + 2 + 2\), or \(20 + 20 + 20 + 5 + 1\), or \(50 + 10 + 5 + 1\), etc. It turns out that 4 coins is the minimum number required so \(50 + 10 + 5 + 1\) is a good solution.

3.5.1 Brute Force Solution

A brute force approach to finding the minimum number of coins required to make up the number \(k\) would be to calculate the values returned by all combinations of coins from 1 coin up to \(k\) coins (using more than \(k\) coins cannot possibly give a solution that adds up to only \(k\)) and then to return the minimum number of coins that total \(k\). This would clearly not be very efficient.
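
A possible brute-force sketch, assuming the coin set above: it tries 1 coin, then 2 coins, and so on, so the first total that works is the minimum. It is only practical for small amounts.

```python
from itertools import combinations_with_replacement

COINS = [1, 2, 5, 10, 20, 50, 100]

def brute_force_change(k):
    """Smallest number of coins summing to k, found by exhaustive search."""
    for m in range(1, k + 1):  # try the fewest coins first
        for combo in combinations_with_replacement(COINS, m):
            if sum(combo) == k:
                return m  # the first m that succeeds is the minimum
    return None

print(brute_force_change(16))  # 3 (for example 10 + 5 + 1)
```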

3.5.2 Greedy Solution

In a greedy approach we make choices that take us as close to a solution as possible every time we make a choice. In this problem we are trying to minimise the number of coins which we use. A greedy strategy is thus to repeatedly choose the coin with the biggest denomination which is no greater than the change still needed, in the hope that doing so will result in us choosing the fewest coins overall. For the example we were looking at earlier (making up 66) we would first choose 50 (as it is the biggest denomination not exceeding 66). This is then one coin and our problem now becomes that of finding the smallest number of coins to make up 16 (\(66 - 50\)). Using the same approach we would then choose a 10, which would leave us with 6. We would then choose a 5 and finally a 1. We thus have a solution which only requires 4 coins. We cannot do better in this example.
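
One way this greedy strategy might be written in Python (the coin list and function name are our own):

```python
COINS = [100, 50, 20, 10, 5, 2, 1]  # largest denomination first

def greedy_change(k):
    """Repeatedly take the largest coin that does not exceed the amount still owed."""
    used = []
    for coin in COINS:
        while coin <= k:
            used.append(coin)
            k -= coin
    return used

print(greedy_change(66))  # [50, 10, 5, 1]
```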

Try some other examples to confirm that this strategy always gives the minimum number of coins with this coin set. Note that for other coin sets the greedy approach might not always give a minimum solution: with denominations 1, 3 and 4, for instance, greedy makes up 6 as \(4 + 1 + 1\) while \(3 + 3\) uses fewer coins (see the sketch below). In fact, it is possible to test whether a coin system is canonical (that is, whether the greedy algorithm always solves its change-making problem optimally) in polynomial time.
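
The sketch below makes this comparison concrete: it pits the greedy count against an exhaustively computed optimal count for the non-canonical coin set {1, 3, 4} (the helper names are illustrative).

```python
from itertools import combinations_with_replacement

def greedy_count(k, coins):
    """Coins used when always taking the largest denomination that fits (assumes a 1-coin exists)."""
    count = 0
    for coin in sorted(coins, reverse=True):
        count += k // coin
        k %= coin
    return count

def optimal_count(k, coins):
    """Minimum number of coins summing to k, found by exhaustive search."""
    for m in range(1, k + 1):
        if any(sum(c) == k for c in combinations_with_replacement(coins, m)):
            return m

# Greedy makes up 6 as 4 + 1 + 1, but 3 + 3 needs only two coins.
print(greedy_count(6, [1, 3, 4]), optimal_count(6, [1, 3, 4]))  # prints 3 2
```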