I should probably know this since I took a ton of combinatorics in college, but is there a tractable algorithm for this problem: given m machines with different speed factors and n tasks of different lengths, assign the tasks to machines so that the overall finishing time is minimized?
AFAIK there is no known way to guarantee the most efficient schedule in polynomial time; it's like the hard graph search problems in that respect.
Maybe a copy of Sedgewick's works would be useful?
In the literature this is known as a "multiple knapsack" problem. As Matthew implied, finding an optimal solution is NP-hard. But you should be able to find good-enough solutions in a reasonable time. Google may help.
Did you state your problem as intended? Isn't it easy and obvious if you have *more* machines than tasks? The fastest machine gets the longest task, and so on down to the slowest machine getting the shortest task. The slowest m-n machines go unused. Total time is the largest speed * time product, since all run in parallel.
According to "Complexity and Approximation" the 'Minimum Processor Scheduling with Speed Factors' problem has no known polynomial time algorithm. However, an approximate solution can be computed in polynomial time where the time complexity is proportional not only to the size of the input, but also to the quality of the desired solution.
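As a sketch of what such an approximation might look like (this is not taken from the book, just a simple greedy heuristic under the assumption that each machine has a "time per unit of work" factor): sort the tasks longest first, and always hand the next task to whichever machine would finish it earliest.

```python
def greedy_schedule(task_lengths, machine_factors):
    """Greedy heuristic (LPT-style): assign each task, longest first,
    to the machine that would finish it earliest.
    machine_factors[i] = time per unit of work on machine i."""
    finish = [0.0] * len(machine_factors)      # current finish time per machine
    assignment = [[] for _ in machine_factors]  # tasks given to each machine
    for t in sorted(task_lengths, reverse=True):
        # pick the machine whose finish time after taking t is smallest
        i = min(range(len(machine_factors)),
                key=lambda i: finish[i] + t * machine_factors[i])
        finish[i] += t * machine_factors[i]
        assignment[i].append(t)
    return max(finish), assignment
```

For example, `greedy_schedule([5, 3, 2, 2], [1.0, 2.0])` yields a makespan of 9, while the true optimum for that instance is 8 (put tasks 5 and 3 on the fast machine, the two 2s on the slow one) -- which illustrates the trade-off: fast to compute, not guaranteed optimal.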
Not an expert in the field, but it sounds like a typical "job-shop" problem, rather than a "knapsack" problem.
Sorry you're right, it should be m < n, trivial for m >= n. And yeah I pretty much guessed that it was NP-hard, but just wanted to make sure. I do remember the knapsack problem phrase vaguely.
It's straightforward to iterate over all assignments. If you have m machines and n tasks, there are m^n of them. Let x enumerate the assignments, x = 0 to (m^n)-1. In each assignment x, task y goes on machine (x / m**y) mod m, i.e. digit y of x written in base m.
try this pseudocode on for size:
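The pseudocode doesn't seem to have survived, but the enumeration described above might look like this in Python (task lengths and per-machine time factors are assumed inputs; `itertools.product` stands in for the base-m digit arithmetic):

```python
from itertools import product

def brute_force_schedule(task_lengths, machine_factors):
    """Try all m**n assignments of n tasks to m machines and return
    the one minimizing the overall finishing time (makespan).
    machine_factors[i] = time per unit of work on machine i."""
    m = len(machine_factors)
    best_time, best_assignment = float("inf"), None
    # each tuple maps task y -> the machine it runs on
    for assignment in product(range(m), repeat=len(task_lengths)):
        finish = [0.0] * m
        for task, machine in zip(task_lengths, assignment):
            finish[machine] += task * machine_factors[machine]
        makespan = max(finish)
        if makespan < best_time:
            best_time, best_assignment = makespan, assignment
    return best_time, best_assignment
```

Obviously this is exponential in the number of tasks, so it's only usable for small n, but it gives you the exact optimum to check heuristics against.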