Chapter 2 Quantifying Uncertainty and Sampling Quality in Biomolecular Simulations

Alan Grossfield, Daniel M. Zuckerman

Research output: Chapter in Book/Report/Conference proceeding › Chapter

231 Scopus citations


Growing computing capacity and algorithmic advances have facilitated the study of increasingly large biomolecular systems at longer timescales. However, with these larger, more complex systems come questions about the quality of sampling and statistical convergence. What size systems can be sampled fully? If a system is not fully sampled, can certain "fast variables" be considered well converged? How can one determine the statistical significance of observed results? The present review describes statistical tools and the underlying physical ideas necessary to address these questions. Basic definitions and ready-to-use analyses are provided, along with explicit recommendations. Such statistical analyses are of paramount importance in establishing the reliability of simulation data in any given study.
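Among the analyses the chapter covers, block averaging is a standard way to estimate the statistical uncertainty of a time-correlated simulation observable. The following is a minimal sketch of the general technique, not the chapter's own code; the function and variable names are illustrative, and the AR(1) toy series merely stands in for a correlated simulation observable.

```python
import random
import statistics

def block_average_error(series, n_blocks):
    """Estimate the standard error of the mean of a correlated series
    by splitting it into n_blocks contiguous blocks and measuring the
    scatter of the block means (trailing remainder samples are dropped)."""
    block_size = len(series) // n_blocks
    block_means = [
        statistics.fmean(series[i * block_size:(i + 1) * block_size])
        for i in range(n_blocks)
    ]
    # Standard error of the overall mean from the spread of block means.
    return statistics.stdev(block_means) / n_blocks ** 0.5

# Toy correlated data: an AR(1) process with a correlation time of
# roughly 20 steps, mimicking a slowly decorrelating observable.
random.seed(0)
x, series = 0.0, []
for _ in range(10_000):
    x = 0.95 * x + random.gauss(0.0, 1.0)
    series.append(x)

# As blocks grow longer than the correlation time, the error estimate
# should plateau; too-short blocks underestimate the true uncertainty.
for n_blocks in (100, 20, 5):
    print(n_blocks, round(block_average_error(series, n_blocks), 3))
```

In practice one plots the error estimate against block size and reads off the plateau value, since blocks much shorter than the correlation time give deceptively small error bars.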

Original language: English (US)
Title of host publication: Annual Reports in Computational Chemistry
Editors: Ralph Wheeler
Number of pages: 26
State: Published - 2009
Externally published: Yes

Publication series

Name: Annual Reports in Computational Chemistry
ISSN (Print): 1574-1400


Keywords

  • block averaging
  • convergence
  • correlation time
  • equilibrium ensemble
  • ergodicity
  • error analysis
  • principal component
  • sampling quality

ASJC Scopus subject areas

  • Chemistry(all)
  • Computational Mathematics


