Resource title

Dynamic prioritization of projects at a scarce resource (RV of 2000/64/TM)

Resource image

Resource description

The authors develop a dynamic prioritization policy to optimally allocate a scarce resource among K projects, only one of which can be worked on at a time. Each project is represented by a Markov decision process whose states correspond to the performance of the project output, and each project must pass through a fixed number of stages. Payoffs accrue at the end of each project, depending on the performance of its output. If delays penalize all projects equally (e.g., through discounting), the problem is a "multi-armed bandit" (MAB); working on the project with the highest expected reward is then optimal and maximizes the value of the entire portfolio. When projects have differing delay costs, the problem becomes a "restless bandit". The authors characterize the optimal policy for the case in which the delay cost is a non-linear, increasing fraction of the potential payoff: it is optimal to work on the project with the highest expected delay loss, computed as if the other projects were completely finished first. If projects are subject to stochastic schedule delays, the policy is to work on the project with the highest expected delay loss, including its expected schedule delay. These results generalize the cµ rule of dynamic scheduling to a setting with non-linear delay costs and recourse during processing.
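The index policy described above can be illustrated with a minimal sketch: rank projects by their expected delay loss (potential payoff times an increasing, possibly non-linear delay-cost fraction evaluated at the project's expected delay) and work on the highest-ranked one. The project names, payoff values, and the quadratic delay-cost fraction below are illustrative assumptions, not taken from the paper.

```python
def expected_delay_loss(payoff, delay_cost_fraction, expected_delay):
    """Expected loss from delaying a project: its potential payoff times an
    increasing (possibly non-linear) fraction of that payoff, evaluated at
    the delay the project would incur (including expected schedule delay)."""
    return payoff * delay_cost_fraction(expected_delay)

def prioritize(projects, delay_cost_fraction):
    """Return projects ordered by decreasing expected delay loss; the
    scarce resource works on the first entry."""
    return sorted(
        projects,
        key=lambda p: expected_delay_loss(
            p["payoff"], delay_cost_fraction, p["expected_delay"]
        ),
        reverse=True,
    )

# Illustrative portfolio with a quadratic (non-linear, increasing)
# delay-cost fraction.
projects = [
    {"name": "A", "payoff": 100.0, "expected_delay": 2.0},
    {"name": "B", "payoff": 60.0, "expected_delay": 4.0},
]
ranking = prioritize(projects, lambda t: 0.01 * t * t)
# A: 100 * 0.04 = 4.0; B: 60 * 0.16 = 9.6, so B is worked on first.
```

With linear delay costs this ranking reduces to the classical cµ-style index (cost rate times payoff); the non-linear fraction is what the paper's generalization addresses.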

Resource author

Resource publisher

Resource publish date

Resource language

en

Resource content type

application/pdf

Resource URL

http://flora.insead.edu/fichiersti_wp/inseadwp2001/2001-85.pdf

Resource license

Copyright INSEAD. All rights reserved