DSpace@MIT


Restless Bandits, Linear Programming Relaxations and a Primal-Dual Heuristic

Author(s)
Bertsimas, Dimitris J.; Niño-Mora, José
Download: OR-298-94.pdf (1.591Mb)
Abstract
We propose a mathematical programming approach for the classical PSPACE-hard problem of n restless bandits in stochastic optimization. We introduce a series of n increasingly strong linear programming relaxations, the last of which is exact and corresponds to the formulation of the problem as a Markov decision process of exponential size, while the other relaxations provide bounds and are efficiently solvable. We also propose a heuristic that arises naturally from the first of these relaxations and uses indices computed from its optimal dual variables. In this way we obtain both a policy and a suboptimality guarantee. We report computational results suggesting that the value of the proposed heuristic policy is extremely close to the optimal value. Moreover, the second-order relaxation provides strong bounds on the optimal solution value.
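The abstract above describes the first-order relaxation and the dual-based index heuristic only at a high level. The sketch below is a rough, self-contained illustration of that idea and is not taken from the paper: it builds the first-order LP relaxation of a small made-up discounted restless-bandit instance, solves it with scipy.optimize.linprog as a generic LP solver, and uses the reduced costs of the active-action variables as priority indices. The instance data, the variable layout, and the exact index definition are assumptions for illustration and may differ from the authors' formulation.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy instance (all data here is made up): 3 restless projects, 2 states per
# project, M = 1 project active per period, discount factor beta.
n_proj, n_states, M, beta = 3, 2, 1, 0.9
R = rng.uniform(0.0, 1.0, size=(n_proj, n_states, 2))              # R[n, i, a]: one-period reward
P = rng.dirichlet(np.ones(n_states), size=(n_proj, n_states, 2))   # P[n, i, a, j]: transition probs
alpha = np.full((n_proj, n_states), 1.0 / n_states)                # initial state distributions

# Decision variables x[n, i, a] = expected total discounted time that project n
# spends in state i under action a (a = 1 means "active"), flattened to a vector.
def var(n, i, a):
    return (n * n_states + i) * 2 + a

n_var = n_proj * n_states * 2
c = np.array([-R[n, i, a] for n in range(n_proj)
              for i in range(n_states) for a in range(2)])          # linprog minimizes

A_eq, b_eq = [], []
# Per-project balance equations of the discounted state-action frequency polytope:
#   sum_a x[n, j, a] - beta * sum_{i, a} P[n, i, a, j] * x[n, i, a] = alpha[n, j]
for n in range(n_proj):
    for j in range(n_states):
        row = np.zeros(n_var)
        for a in range(2):
            row[var(n, j, a)] += 1.0
        for i in range(n_states):
            for a in range(2):
                row[var(n, i, a)] -= beta * P[n, i, a, j]
        A_eq.append(row)
        b_eq.append(alpha[n, j])

# Coupling constraint of the first-order relaxation: in discounted average,
# exactly M projects are active:  sum_{n, i} x[n, i, 1] = M / (1 - beta).
row = np.zeros(n_var)
for n in range(n_proj):
    for i in range(n_states):
        row[var(n, i, 1)] = 1.0
A_eq.append(row)
b_eq.append(M / (1 - beta))

A_eq = np.array(A_eq)
res = linprog(c, A_eq=A_eq, b_eq=np.array(b_eq), bounds=(0, None), method="highs")
assert res.success
print("first-order relaxation bound on the optimal value:", -res.fun)

# Priority indices from the optimal dual variables: the reduced cost c - A^T y of
# the active-action variable x[n, i, 1] measures how costly it is to force project n
# to be active in state i; smaller reduced cost = higher priority for activation.
reduced = c - A_eq.T @ res.eqlin.marginals
index = np.array([[reduced[var(n, i, 1)] for i in range(n_states)]
                  for n in range(n_proj)])
print("index table (rows = projects, columns = states):")
print(index)

# Heuristic policy: in joint state (s_1, ..., s_n), activate the M projects whose
# current-state index is smallest.
def heuristic_action(states):
    order = np.argsort([index[n, s] for n, s in enumerate(states)])
    return sorted(order[:M].tolist())

print("projects activated in joint state (0, 1, 0):", heuristic_action([0, 1, 0]))

In the paper the relaxations, the index definition, and the suboptimality guarantee are developed exactly; here the LP merely shows the general shape of a first-order relaxation and how dual information can drive a priority policy.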
Date issued
1994-08
URI
http://hdl.handle.net/1721.1/5378
Publisher
Massachusetts Institute of Technology, Operations Research Center
Series/Report no.
Operations Research Center Working Paper; OR 298-94

Collections
  • Operations Research Center Working Papers
