| DC Field | Value | Language |
| --- | --- | --- |
| dc.contributor.author | Minkoff, Alan S. | en_US |
| dc.date.accessioned | 2004-05-28T19:26:23Z | |
| dc.date.available | 2004-05-28T19:26:23Z | |
| dc.date.issued | 1981-02 | en_US |
| dc.identifier.uri | http://hdl.handle.net/1721.1/5172 | |
| dc.description.abstract | Evaluation of public programming currently tends toward plans that are set in advance of any sampling and adhered to throughout. Because growth in the knowledge profile during the course of an evaluation may call for adjusting the working procedure, a fixed evaluation methodology can be cost-inefficient. The goal is to develop a methodology that adapts to changes in the knowledge profile, which may be accomplished most easily by borrowing ideas from disciplines in which related problems arise. The most promising fields for this task include classical and Bayesian statistics, reliability theory, and dynamic programming. This paper reviews the techniques in classical statistics that seem most apt for adapting an evaluation to an updated knowledge profile, and considers the paths along which future research ought to be conducted. | en_US |
| dc.format.extent | 1744 bytes | |
| dc.format.extent | 2156379 bytes | |
| dc.format.mimetype | application/pdf | |
| dc.language.iso | en_US | en_US |
| dc.publisher | Massachusetts Institute of Technology, Operations Research Center | en_US |
| dc.relation.ispartofseries | Operations Research Center Working Paper;OR 110-81 | en_US |
| dc.title | Preliminary Survey of Classical Statistical Techniques for Incorporation into Adaptive Evaluation Methodology | en_US |
| dc.type | Working Paper | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Operations Research Center | |