Fast uncertainty quantification in EZ cognitive models

Abstract

Classical approaches to uncertainty quantification in cognitive modeling rely on computationally expensive Monte Carlo methods or resampling of raw trial data, which creates a barrier for real-time analysis and large-scale studies. We present a computationally efficient bootstrap method that operates directly on summary statistics, exploiting the synthetic likelihood structure of a small class of cognitive models that includes the simple diffusion model and the circular diffusion model, as well as signal detection theory and multinomial processing tree models. The method relies on a numerical transformation-of-variables technique in which the known sampling distributions of summary statistics of behavior are parametrically bootstrapped, after which the resampled statistics are transformed into parameter estimates through a known analytical system. This approach requires no assumptions beyond those already made by the models themselves, yet achieves more than 1000-fold speed improvements over already efficient fully Bayesian methods. The proposed method makes real-time uncertainty quantification accessible and enables new applications in adaptive testing, meta-analyses, and exploratory data analysis.
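
To make the idea concrete, below is a minimal sketch (not the authors' implementation) of the summary-statistic bootstrap for the EZ-diffusion case: summary statistics are resampled from assumed sampling distributions (binomial accuracy, normal mean RT, scaled chi-square RT variance) and each resample is pushed through the closed-form EZ-diffusion equations of Wagenmakers, van der Maas, and Grasman (2007). The function names, the specific sampling distributions, and the edge corrections are illustrative assumptions and may differ in detail from the paper.

```python
import numpy as np

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """Map summary statistics (accuracy, RT variance, mean RT) to (v, a, Ter)."""
    L = np.log(pc / (1.0 - pc))                       # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = np.sign(pc - 0.5) * s * x**0.25               # drift rate
    a = s**2 * L / v                                  # boundary separation
    y = -v * a / s**2
    mdt = (a / (2.0 * v)) * (1.0 - np.exp(y)) / (1.0 + np.exp(y))
    ter = mrt - mdt                                   # nondecision time
    return v, a, ter

def ez_bootstrap(pc, vrt, mrt, n_trials, n_boot=10_000, rng=None):
    """Parametric bootstrap of the summary statistics, transformed to parameters."""
    rng = np.random.default_rng(rng)
    # Resample summary statistics from their (assumed) sampling distributions.
    pc_b = rng.binomial(n_trials, pc, n_boot) / n_trials
    pc_b = np.clip(pc_b, 0.5 / n_trials, 1.0 - 0.5 / n_trials)  # avoid 0 and 1
    pc_b = np.where(pc_b == 0.5, 0.5 + 0.5 / n_trials, pc_b)    # avoid chance level
    mrt_b = rng.normal(mrt, np.sqrt(vrt / n_trials), n_boot)
    vrt_b = vrt * rng.chisquare(n_trials - 1, n_boot) / (n_trials - 1)
    # Transform each resample with the analytical system.
    return ez_diffusion(pc_b, vrt_b, mrt_b)

# Example: 95% bootstrap interval for drift rate from observed summaries.
v_b, a_b, ter_b = ez_bootstrap(pc=0.8, vrt=0.112, mrt=0.723, n_trials=200)
print(np.percentile(v_b, [2.5, 97.5]))
```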

Bibtex

@article{vandekerckhove_fox:preprint:quantification,
    title   = {{F}ast uncertainty quantification in {E}{Z} cognitive models},
    author  = {Vandekerckhove, Joachim and Fox, Elizabeth L.},
    year    = {preprint}
}