Show simple item record

dc.contributor.author: Minkoff, Alan S.
dc.date.accessioned: 2004-05-28T19:26:23Z
dc.date.available: 2004-05-28T19:26:23Z
dc.date.issued: 1981-02
dc.identifier.uri: http://hdl.handle.net/1721.1/5172
dc.description.abstract: Evaluation of public programming currently tends toward plans that are set in advance of any sampling and adhered to throughout. Because increments in the knowledge profile during the course of an evaluation may call for adjustment of the working procedure, a fixed evaluation methodology may be cost-inefficient. It is therefore desirable to develop a methodology that adapts to changes in the knowledge profile. This might be most easily accomplished by borrowing ideas from disciplines in which related problems occur; the most promising fields include classical and Bayesian statistics, reliability theory, and dynamic programming. This paper reviews the techniques in classical statistics that seem most apt for adapting an evaluation to an updated knowledge profile, and considers the paths along which future research ought to be conducted.
dc.format.extent: 1744 bytes
dc.format.extent: 2156379 bytes
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.publisher: Massachusetts Institute of Technology, Operations Research Center
dc.relation.ispartofseries: Operations Research Center Working Paper; OR 110-81
dc.title: Preliminary Survey of Classical Statistical Techniques for Incorporation into Adaptive Evaluation Methodology
dc.type: Working Paper
dc.contributor.department: Massachusetts Institute of Technology. Operations Research Center

