This is similar to problems in queuing theory, where the goal is to get a mix of jobs done according to certain criteria. The "weighted heterogeneous job mix" has been studied, and it is as much a bear as any problem I've ever seen.

Generally, in queuing theory, you can optimize for the MOST jobs completed in a fixed time period, for the jobs with the highest priority (cost, in your case), or for the FEWEST (but implicitly, biggest) jobs. And there are a few more variants for which solutions are known.

The issue here is that you have multiple criteria, which makes this an incredibly complex process to optimize. It doesn't help that, since this is essentially a combinatorics problem, the time to solve it can grow factorially, and you claim 1800+ possible orders. I won't even try to compute 1800 factorial, but let's just say it is a BIG number. That means a mathematically "perfect" solution MIGHT exist, but it could literally take years to find.
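To give a feel for how fast that blows up, here is a quick sketch. It just uses Python's standard `math.factorial`; even printing 1800! in full would be absurd, so it only counts the decimal digits:

```python
import math

# Number of distinct orderings of n items grows factorially.
for n in (5, 10, 20):
    print(n, "items ->", math.factorial(n), "orderings")

# 1800! is exact in Python, but astronomically large.
# Counting its decimal digits is enough to make the point:
digits = len(str(math.factorial(1800)))
print("1800! has", digits, "decimal digits")
```

Even at 20 items you are past two quintillion orderings; at 1800 the number has thousands of digits, so exhaustive search is simply off the table.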

The usual way I have seen this approached in real-world computing involves fixing a hard goal - say, always pick the highest priority (price) or always pick the biggest job (biggest volume, in your case). There IS a queuing theory solution for optimizing largest jobs AND priority, but usually that means the second criterion is a tie-breaker for cases where the first criterion alone isn't enough to make a decision.
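A minimal sketch of that "hard goal plus tie-breaker" idea, using made-up orders and an assumed capacity (the order IDs, prices, volumes, and the capacity of 100 are all illustrative, not from your data):

```python
# Hypothetical orders: (order_id, price, volume)
orders = [
    ("A", 120.0, 30),
    ("B", 120.0, 45),
    ("C", 95.0, 60),
    ("D", 80.0, 10),
]

capacity = 100  # assumed load capacity

# Primary criterion: highest price; tie-breaker: biggest volume.
# A and B tie on price, so B wins the tie on volume.
ranked = sorted(orders, key=lambda o: (-o[1], -o[2]))

picked, used = [], 0
for oid, price, vol in ranked:
    if used + vol <= capacity:  # greedy: take it if it still fits
        picked.append(oid)
        used += vol

print(picked, used)  # → ['B', 'A', 'D'] 85
```

Note that this is a greedy heuristic: it gives you *a* defensible answer fast, not a provably optimal one.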

You need to define your selection criteria more precisely. That will probably start with a query to put your selections in order, but it will almost certainly involve some code as well. You also need to decide on some strict statements about what would count as an acceptable solution. For example, how close do you have to come to your boundaries?
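By "strict statements" I mean something you can actually test in code. A toy example, where the 10% tolerance is an assumption you would replace with your own business rule:

```python
CAPACITY = 100     # assumed load capacity
TOLERANCE = 0.10   # assumed rule: within 10% below capacity counts as "full enough"

def acceptable(total_volume, capacity=CAPACITY, tolerance=TOLERANCE):
    """An explicit, testable statement of what counts as an acceptable load."""
    return capacity * (1 - tolerance) <= total_volume <= capacity

print(acceptable(92))  # → True  (between 90 and 100)
print(acceptable(85))  # → False (too far below capacity)
```

Once the rule is written down like this, you can stop searching as soon as any candidate passes it, which is usually how these problems become tractable in practice.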

Another thing to consider: the question, as asked, may literally be insoluble with most algorithms I know, because you want the top three solutions. Most computer-based solutions of this type are entirely predictable - run the solution three times and you'll get the same answer three times. There will be no best, second-best, and third-best solutions. What YOU actually want is to optimize the same load against three __different__ goals and see whether the results differ. Sometimes they won't.
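That "three different goals" idea can be sketched like this, reusing the same hypothetical orders and capacity as before (all names and numbers are illustrative):

```python
# Hypothetical orders: (order_id, price, volume); assumed capacity of 100.
orders = [("A", 120.0, 30), ("B", 120.0, 45), ("C", 95.0, 60), ("D", 80.0, 10)]
capacity = 100

def greedy(orders, key):
    """Greedy fill: visit orders in the goal's preferred sequence, take what fits."""
    picked, used = [], 0
    for oid, price, vol in sorted(orders, key=key):
        if used + vol <= capacity:
            picked.append(oid)
            used += vol
    return picked

goals = {
    "highest price first":          lambda o: -o[1],
    "biggest volume first":         lambda o: -o[2],
    "most orders (smallest first)": lambda o: o[2],
}
for name, key in goals.items():
    print(name, "->", greedy(orders, key))
# → highest price first -> ['A', 'B', 'D']
# → biggest volume first -> ['C', 'A', 'D']
# → most orders (smallest first) -> ['D', 'A', 'B']
```

Here the three goals happen to pick three different loads, which gives you your "top three" candidates to compare - but on other data two or all three goals can agree, and then there simply isn't a distinct second-best answer.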

Side note: Thanks for that Wikipedia link, MajP!

Side note for other readers not up on their computational theory: In computational complexity theory, a problem is NP-complete when a proposed solution can be verified quickly, the only known general way to *find* a solution is essentially brute-force search, and every other problem with quickly verifiable solutions can be reduced to it. (Paraphrased from Wikipedia.org)