Perturbation theory for stochastic learning dynamics

Todd K. Leen, Robert Friel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

On-line machine learning and biological spike-timing-dependent plasticity (STDP) rules both generate Markov chains for the synaptic weights. We give a perturbation expansion (in powers of the learning rate) for the dynamics that, unlike the usual approximation by a Fokker-Planck equation (FPE), is rigorous. Our approach extends the related system size expansion by giving an expansion for the probability density as well as its moments. Applied to two observed STDP learning rules, our approach provides better agreement with Monte-Carlo simulations than either the FPE or a simple linearized theory. The approach is also applicable to stochastic neural dynamics.
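The abstract's point, that the exact weight statistics of a stochastic learning rule differ from their small-learning-rate (Fokker-Planck or linearized) approximations by corrections in higher powers of the learning rate, can be illustrated with a toy example. The sketch below is an illustrative assumption only, not the paper's STDP rules or its expansion: it simulates the online rule w ← w + μ(x − w) with Gaussian inputs, whose stationary weight variance μ/(2 − μ) deviates from the leading-order prediction μ/2 as μ grows.

# Toy illustration (not the paper's method): for the online rule
#   w_{t+1} = w_t + mu * (x_t - w_t),  x_t ~ N(0, 1),
# the weights form a Markov chain whose exact stationary variance is
#   mu / (2 - mu),
# while the leading-order (small-mu, Fokker-Planck-style) prediction is mu / 2.
# The gap between the two is the kind of higher-order-in-mu correction that a
# perturbation expansion in the learning rate is designed to capture.
import numpy as np

rng = np.random.default_rng(0)

def simulate_stationary_variance(mu, n_steps=200_000, burn_in=20_000):
    """Monte Carlo estimate of the stationary weight variance."""
    w = 0.0
    samples = []
    for t in range(n_steps):
        x = rng.standard_normal()
        w += mu * (x - w)          # one online learning step
        if t >= burn_in:
            samples.append(w)
    return np.var(samples)

for mu in (0.05, 0.2, 0.5):
    mc = simulate_stationary_variance(mu)
    exact = mu / (2.0 - mu)        # exact stationary variance of the chain
    fpe = mu / 2.0                 # leading-order (O(mu)) approximation
    print(f"mu={mu:.2f}  Monte Carlo={mc:.4f}  exact={exact:.4f}  O(mu) approx={fpe:.4f}")

For small μ the two predictions agree closely, but at μ = 0.5 the leading-order value underestimates the Monte Carlo variance noticeably, which is the qualitative behavior the abstract attributes to the FPE and linearized theories.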

Original language: English (US)
Title of host publication: 2011 International Joint Conference on Neural Networks, IJCNN 2011 - Final Program
Pages: 2031-2038
Number of pages: 8
DOIs
State: Published - 2011
Event: 2011 International Joint Conference on Neural Networks, IJCNN 2011 - San Jose, CA, United States
Duration: Jul 31, 2011 - Aug 5, 2011

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Other

Other: 2011 International Joint Conference on Neural Networks, IJCNN 2011
Country/Territory: United States
City: San Jose, CA
Period: 7/31/11 - 8/5/11

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
