Raymond Chen has a post today about making change, which reminded me of an interesting problem.
American currency (or most currencies, for that matter) has the desirable property that the greedy approach to giving change results in change made with the minimal number of pieces. That is, if I want to give you change using the minimal number of pieces, I can start with the largest denomination, give you as many of it as I can without giving you too much change, and then repeat with each smaller denomination in turn. For example, if I want to give you $1.73 in change, I do something like:
- I give you 1 dollar bill; $0.73 remains.
- I give you 1 fifty-cent piece; $0.23 remains.
- I give you 0 quarters; $0.23 remains.
- I give you 2 dimes; $0.03 remains.
- I give you 0 nickels; $0.03 remains.
- I give you 3 pennies; we’re done.
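The steps above can be sketched in a few lines of Python; the function name and the amounts-in-cents convention are mine, not anything standard:

```python
def greedy_change(amount, denominations):
    """Return a dict mapping each denomination (in cents) to the count
    the greedy algorithm hands out, largest denomination first."""
    counts = {}
    for d in sorted(denominations, reverse=True):
        # Give as many of this denomination as fit in what remains.
        counts[d], amount = divmod(amount, d)
    return counts

# $1.73 in change, with US denominations expressed in cents:
us = [100, 50, 25, 10, 5, 1]
print(greedy_change(173, us))
# {100: 1, 50: 1, 25: 0, 10: 2, 5: 0, 1: 3} -- the 7 pieces listed above
```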
This took 7 pieces, and you can do no better. Greedy is not optimal in every system, though: if you had a system with a penny, a 7-cent piece, and a dime, then the greedy algorithm would give you a 5-piece solution for $0.14 (a dime and four pennies), but giving two 7-cent pieces is better.
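You can verify the counterexample by comparing greedy against a small dynamic program that computes the true minimum; this is a sketch (the helper names are mine), not the algorithm from the paper cited below:

```python
def min_pieces(amount, denoms):
    """Minimal number of pieces to make `amount`, by dynamic programming."""
    INF = float('inf')
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        best[a] = min((best[a - d] + 1 for d in denoms if d <= a),
                      default=INF)
    return best[amount]

def greedy_pieces(amount, denoms):
    """Number of pieces the greedy algorithm uses for `amount`."""
    total = 0
    for d in sorted(denoms, reverse=True):
        total += amount // d
        amount %= d
    return total

odd = [10, 7, 1]                # penny, 7-cent piece, dime
print(greedy_pieces(14, odd))   # 5: a dime and four pennies
print(min_pieces(14, odd))      # 2: two 7-cent pieces
```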
My question: Is there an easy way to describe the necessary and sufficient conditions for a system to exhibit the property that the greedy change-making algorithm always produces minimal results?
It would appear that there is not. Well, that’s not 100% clear, but apparently the best known algorithm for determining whether a set of denominations has this property is O(n^3) in the number of denominations [PDF]. (Jeffrey Shallit discusses this and some other change-making problems in a rather cute paper: What This Country Needs is an 18-cent Piece [PDF].)