Fog Creek Software
Discussion Board
Resource Allocation Software

I'm looking for pointers to papers or existing software that does resource allocation with constraints.

Basically, I have a limited set of "processors" that I want to perform "jobs." Each job requires between 1 and 5 processors and has a varying duration. I know what each type of job requires, but I don't know when, or for how long, the jobs will be requested. At a minimum, I'd like to be able to allocate the jobs among the processors. When there are more jobs than processors, I'd like to be able to stop execution of the lowest-priority job(s) and reassign the processors.

To make things more interesting, there are other resources that I would like to be able to optimize on. For example, two processors may run on the same machine, so they share limited memory and CPU power. Also, I'd like to be able to make optimizations based on the affinity of jobs to a particular machine that runs more than one processor. For instance, if a job last ran on a given machine, it's cheaper to run it on the same machine again.
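To make the requirements concrete, here is a minimal sketch (hypothetical names and structure, not a known library) of priority-based allocation with preemption and machine affinity, assuming jobs that need 1-5 processors as described above:

```python
# Sketch of a priority allocator with preemption and affinity.
# All names (Job, Allocator, affinity_cost, ...) are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Job:
    name: str
    priority: int                     # higher = more important
    procs_needed: int                 # 1..5 processors
    last_machine: Optional[str] = None

class Allocator:
    def __init__(self, machines):
        # machines: dict of machine name -> number of processors on it
        self.free = dict(machines)
        self.running = {}             # job name -> (Job, machine)

    def affinity_cost(self, job, machine):
        # Cheaper to re-run a job where it last ran (warm caches, local state).
        return 0 if job.last_machine == machine else 1

    def try_start(self, job):
        # Prefer machines with enough free processors, lowest affinity cost first.
        candidates = [m for m, n in self.free.items() if n >= job.procs_needed]
        if not candidates:
            # Preempt the lowest-priority running job that would free enough room.
            victims = sorted(self.running.values(), key=lambda jm: jm[0].priority)
            for victim, vm in victims:
                if victim.priority < job.priority and \
                   self.free[vm] + victim.procs_needed >= job.procs_needed:
                    self.stop(victim.name)
                    candidates = [vm]
                    break
        if not candidates:
            return None               # nothing runnable and nothing preemptable
        machine = min(candidates, key=lambda m: self.affinity_cost(job, m))
        self.free[machine] -= job.procs_needed
        self.running[job.name] = (job, machine)
        job.last_machine = machine
        return machine

    def stop(self, name):
        # Return the job's processors to the free pool.
        job, machine = self.running.pop(name)
        self.free[machine] += job.procs_needed
```

This ignores the shared-memory/CPU constraints per machine, which would add another dimension to the feasibility check in `try_start`.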

This sounds like a problem that's been solved many times before. I'd like to actually look at how they did it, not just use existing software.

Thanks,

igor
Thursday, July 22, 2004

It's a big field. http://www.eil.utoronto.ca/profiles/chris/chris.papers.html is just one example.

Christopher Wells
Thursday, July 22, 2004

I'm not sure it's been entirely solved. Its complexity is at least NP-complete. Search for topological sorting.

Dino
Thursday, July 22, 2004

It's a multi-variable multi-equation maximization/minimization problem.  You can use Excel's Solver to model the scenarios (and play with the equations).  Unfortunately there aren't any short ways to solve the equations, and iterative methods seem to dominate the field.
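As a toy illustration of the iterative approach Lou mentions (invented data, pure Python, not Excel Solver): start from a greedy assignment and repeatedly move one job to whichever processor lowers the maximum load, until no single move helps.

```python
# Iterative improvement (local search) for balancing job durations
# across processors. Greedy start, then single-job moves.
def iterative_balance(durations, n_procs):
    loads = [0.0] * n_procs
    assign = []
    for d in durations:                      # greedy start: least-loaded first
        p = loads.index(min(loads))
        loads[p] += d
        assign.append(p)
    improved = True
    while improved:                          # iterate until no move helps
        improved = False
        for i, d in enumerate(durations):
            src = assign[i]
            for dst in range(n_procs):
                if dst == src:
                    continue
                # Accept the move only if it shrinks the larger of the two loads.
                if max(loads[src] - d, loads[dst] + d) < max(loads[src], loads[dst]):
                    loads[src] -= d
                    loads[dst] += d
                    assign[i] = dst
                    improved = True
                    src = dst
    return assign, max(loads)
```

Like most iterative methods for this class of problem, it only guarantees a local optimum, not the true minimum.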

Lou
Thursday, July 22, 2004

Sounds like a Job Shop problem.

Edward
Thursday, July 22, 2004

Igor, you are describing an operating system. They do what you are describing, plus a whole lot more. It is an NP-hard task to do what you want efficiently.

At a past company, I built something very crude that does something similar. I called it the "redundant array of inexpensive computers" because it used a pile of undepreciated Pentium 75/100s. The code had to be migrated from Access97 really fast (some of the managers wanted to use Office2k, which broke a lot of tables, and all the code, in Access97). Also, the daily processing was creeping up to 20 hours of crunching on a 700MHz P3 to handle 24 hours of data.

All of the jobs were refactored into DCOM/ActiveX DLLs, or NT services (which just woke up every 5 minutes to see if it was time to make the donuts), whichever made more sense for that task. With NT, CreateObject can take two parameters, one of which is the machine name.

The scheduler used a scoreboard (a table in SQL Server) that kept track of when JobX was supposed to run and which of the array of cheapies was able to run it. When it was time to run JobX, it also recorded who was running the job.
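The scoreboard idea can be sketched roughly like this (in-memory dicts standing in for the SQL Server table; the job names and field names are invented, not Peter's actual schema):

```python
# Scoreboard scheduler sketch: each row records when a job is next due,
# which machines are capable of running it, and who is running it now.
scoreboard = {
    "nightly_rollup": {"next_run": 0, "capable": ["p75a", "p100b"], "running_on": None},
    "import_feed":    {"next_run": 0, "capable": ["p100b"],         "running_on": None},
}

def dispatch(now, idle_machines):
    """Assign every due, unclaimed job to an idle machine that can run it."""
    started = []
    for name, row in scoreboard.items():
        if row["running_on"] is None and row["next_run"] <= now:
            for m in list(idle_machines):
                if m in row["capable"]:
                    row["running_on"] = m      # record who took the job
                    idle_machines.remove(m)
                    started.append((name, m))
                    break
    return started

def finish(name, interval, now):
    """Mark a job done and schedule its next run."""
    scoreboard[name]["running_on"] = None
    scoreboard[name]["next_run"] = now + interval
```

In the real thing, `dispatch` would be a query plus an update against the SQL Server table, so any machine in the array could consult the same scoreboard.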

Was this too much detail? Not enough? Wrong subject?

Peter
Friday, July 23, 2004
