| dc.contributor.advisor | Mansinghka, Vikash K. | |
| dc.contributor.advisor | Tenenbaum, Joshua B. | |
| dc.contributor.author | Dasgupta, Arijit | |
| dc.date.accessioned | 2025-11-17T19:06:50Z | |
| dc.date.available | 2025-11-17T19:06:50Z | |
| dc.date.issued | 2025-05 | |
| dc.date.submitted | 2025-08-14T19:31:42.260Z | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/163678 | |
| dc.description.abstract | Humans possess a remarkable capacity to track and predict the motion of objects even when visual information is temporarily absent. This thesis investigates how missing sensory evidence—such as during occlusion—alters current and future beliefs about object motion, and introduces an uncertainty-aware framework to model this process. A behavioral experiment was conducted in which participants continuously predicted the future destination of a ball moving in 2.5D environments with occlusion. Results demonstrate that participants dynamically updated their predictions throughout occlusion, exhibiting adaptive belief revision and physically grounded reasoning. To model this behavior, a structured Bayesian modeling and inference approach for joint tracking and prediction was developed that integrates perception, state estimation, and future prediction in a unified process. The approach, implemented via a Sequential Monte Carlo algorithm embedded within a GPU-accelerated and parallel probabilistic programming system, maintains time-varying beliefs over both present and future object states, conditioned on observed images. These belief states are explicitly represented in symbolic form, enabling interpretable, frame-by-frame introspection of uncertainty and prediction over time. When compared against human responses, the model closely matched the temporal evolution of time-aligned decisions and outperformed plausible alternative hypotheses that failed to reason during occlusion. These findings affirm that the absence of changing visual evidence does not engender a void in physical reasoning, but is evidence in itself—processed and revised through structured, probabilistic inference. By integrating probabilistic programming with human behavioral data through structured Bayesian modeling and inference, this thesis advances a computational account of intuitive physical reasoning and provides a foundation for building interpretable, uncertainty-aware AI systems that mirror human-like physical intelligence. | |
| dc.publisher | Massachusetts Institute of Technology | |
| dc.rights | In Copyright - Educational Use Permitted | |
| dc.rights | Copyright retained by author(s) | |
| dc.rights.uri | https://rightsstatements.org/page/InC-EDU/1.0/ | |
| dc.title | Uncertainty-aware Joint Physical Tracking and Prediction | |
| dc.type | Thesis | |
| dc.description.degree | S.M. | |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
| dc.identifier.orcid | https://orcid.org/0009-0003-0345-5529 | |
| mit.thesis.degree | Master | |
| thesis.degree.name | Master of Science in Electrical Engineering and Computer Science | |