Chapter 2 Quantifying Uncertainty and Sampling Quality in Biomolecular Simulations

Alan Grossfield, Daniel M. Zuckerman

Research output: Chapter in Book/Report/Conference proceeding › Chapter

248 Scopus citations

Abstract

Growing computing capacity and algorithmic advances have facilitated the study of increasingly large biomolecular systems at longer timescales. However, with these larger, more complex systems come questions about the quality of sampling and statistical convergence. What size systems can be sampled fully? If a system is not fully sampled, can certain "fast variables" be considered well converged? How can one determine the statistical significance of observed results? The present review describes statistical tools and the underlying physical ideas necessary to address these questions. Basic definitions and ready-to-use analyses are provided, along with explicit recommendations. Such statistical analyses are of paramount importance in establishing the reliability of simulation data in any given study.
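As an illustration of the kind of ready-to-use analysis the chapter surveys, the following is a minimal sketch of block averaging, one of the keyword techniques: the standard error of the mean is recomputed from block means, and for blocks longer than the correlation time it approaches the true statistical error that a naive estimate (which ignores time correlation) underestimates. The AR(1) toy series and all function names here are illustrative, not taken from the chapter itself.

```python
import numpy as np

def block_average_sem(x, n_blocks):
    """Estimate the standard error of the mean of a correlated
    time series from the means of n_blocks contiguous blocks."""
    x = np.asarray(x, dtype=float)
    block_len = len(x) // n_blocks
    # Trim so the series divides evenly into blocks.
    blocks = x[: n_blocks * block_len].reshape(n_blocks, block_len)
    block_means = blocks.mean(axis=1)
    # Standard error of the block means; once blocks are longer
    # than the correlation time, blocks are nearly independent
    # and this converges to the true statistical error of <x>.
    return block_means.std(ddof=1) / np.sqrt(n_blocks)

# Correlated toy series (an AR(1) process) standing in for a
# simulation observable such as an energy or a distance.
rng = np.random.default_rng(0)
n, phi = 100_000, 0.95
noise = rng.standard_normal(n)
x = np.empty(n)
x[0] = noise[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + noise[t]

# The naive SEM treats all samples as independent and so
# underestimates the error for correlated data.
naive = x.std(ddof=1) / np.sqrt(n)
blocked = block_average_sem(x, n_blocks=50)
print(f"naive SEM:   {naive:.4f}")
print(f"blocked SEM: {blocked:.4f}")
```

With a correlation time of roughly 1/(1 − φ) = 20 steps and blocks of 2,000 steps, the blocked estimate is several times the naive one, exposing how badly serial correlation inflates apparent precision.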

Original language: English (US)
Title of host publication: Annual Reports in Computational Chemistry
Editors: Ralph Wheeler
Pages: 23-48
Number of pages: 26
DOIs
State: Published - 2009
Externally published: Yes

Publication series

Name: Annual Reports in Computational Chemistry
Volume: 5
ISSN (Print): 1574-1400

Keywords

  • block averaging
  • convergence
  • correlation time
  • equilibrium ensemble
  • ergodicity
  • error analysis
  • principal component
  • sampling quality

ASJC Scopus subject areas

  • General Chemistry
  • Computational Mathematics
