Chapter 2 Quantifying Uncertainty and Sampling Quality in Biomolecular Simulations

Alan Grossfield, Daniel Zuckerman

Research output: Chapter in Book/Report/Conference proceeding › Chapter

169 Citations (Scopus)

Abstract

Growing computing capacity and algorithmic advances have facilitated the study of increasingly large biomolecular systems at longer timescales. However, with these larger, more complex systems come questions about the quality of sampling and statistical convergence. What size systems can be sampled fully? If a system is not fully sampled, can certain "fast variables" be considered well converged? How can one determine the statistical significance of observed results? The present review describes statistical tools and the underlying physical ideas necessary to address these questions. Basic definitions and ready-to-use analyses are provided, along with explicit recommendations. Such statistical analyses are of paramount importance in establishing the reliability of simulation data in any given study.
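Among the tools the abstract alludes to (and the keywords list below names) is block averaging, a standard way to estimate the statistical error of a mean computed from a correlated time series. The following is a minimal, generic sketch of that technique, not the authors' specific implementation; the AR(1) toy data and all function names are illustrative assumptions.

```python
import numpy as np

def block_std_err(series, block_size):
    """Standard error of the mean from block averages.

    Splits a (possibly correlated) time series into contiguous
    blocks, averages each block, and computes the standard error
    from the scatter of those block means. Once blocks are longer
    than the correlation time, the estimate plateaus near the
    true statistical error.
    """
    n_blocks = len(series) // block_size
    trimmed = np.asarray(series[: n_blocks * block_size], dtype=float)
    block_means = trimmed.reshape(n_blocks, block_size).mean(axis=1)
    # ddof=1: unbiased sample standard deviation over the block means
    return block_means.std(ddof=1) / np.sqrt(n_blocks)

# Illustrative correlated data: an AR(1) process (correlation time ~10 steps)
rng = np.random.default_rng(0)
x = np.empty(100_000)
x[0] = 0.0
for i in range(1, len(x)):
    x[i] = 0.9 * x[i - 1] + rng.normal()

# The naive estimate treats samples as independent and is too optimistic;
# block estimates grow with block size until they plateau.
naive = x.std(ddof=1) / np.sqrt(len(x))
for b in (10, 100, 1000):
    print(b, block_std_err(x, b))
```

Plotting the block-averaged error against block size and looking for a plateau is the usual diagnostic: no plateau within the available data is itself a warning sign that the simulation is not converged.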

Original language: English (US)
Title of host publication: Annual Reports in Computational Chemistry
Editors: Ralph Wheeler
Pages: 23-48
Number of pages: 26
DOI: 10.1016/S1574-1400(09)00502-7
State: Published - Sep 3, 2009
Externally published: Yes

Publication series

Name: Annual Reports in Computational Chemistry
Volume: 5
ISSN (Print): 1574-1400
ISBN (Print): 9780444533593

Keywords

  • block averaging
  • convergence
  • correlation time
  • equilibrium ensemble
  • ergodicity
  • error analysis
  • principal component
  • sampling quality

ASJC Scopus subject areas

  • Chemistry (all)
  • Computational Mathematics

Cite this

Grossfield, A., & Zuckerman, D. (2009). Chapter 2 Quantifying Uncertainty and Sampling Quality in Biomolecular Simulations. In R. Wheeler (Ed.), Annual Reports in Computational Chemistry (pp. 23-48). (Annual Reports in Computational Chemistry; Vol. 5). https://doi.org/10.1016/S1574-1400(09)00502-7
