Document Type

Article

Publication Date

6-1-2017

Publication Title

Neural Computation

ISSN

1530-888X

Volume

29

Issue

6

First Page

1561

Last Page

1610

DOI

https://doi.org/10.1162/NECO_a_00957

PubMed ID

28333591

Abstract

In a constantly changing world, animals must account for environmental volatility when making decisions. To appropriately discount older, irrelevant information, they need to learn the rate at which the environment changes. We develop an ideal observer model capable of inferring the present state of the environment along with its rate of change. Key to this computation is an update of the posterior probability of all possible change point counts. This computation can be challenging, as the number of possibilities grows rapidly with time. However, we show how the computations can be simplified in the continuum limit by a moment closure approximation. The resulting low-dimensional system can be used to infer the environmental state and change rate with accuracy comparable to the ideal observer. The approximate computations can be performed by a neural network model via a rate-correlation-based plasticity rule. We thus show how optimal observers accumulate evidence in changing environments and map this computation to reduced models that perform inference using plausible neural mechanisms.
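
As an illustration of the kind of computation the abstract describes, the sketch below (Python, with hypothetical names such as `update_posterior`) performs a recursive Bayesian update of a joint posterior over the current environmental state and the number of change points observed so far, from which both the state and the change rate can be read out. It assumes a symmetric two-state environment, Gaussian observations, and a uniform Beta(1,1) prior on the unknown change rate; it is a sketch under those assumptions, not the paper's exact model.

```python
import numpy as np

# Minimal sketch of an ideal-observer-style update: joint posterior over the current
# state s (0 or 1) and the number of change points a seen so far. Assumes Gaussian
# observations with means -mu/+mu and a Beta(1,1) prior on the unknown change rate,
# so the expected hazard given a change points over n - 1 transitions is (a + 1)/(n + 1).
# (Model details and names here are illustrative, not taken from the paper.)

def likelihood(obs, mu=1.0, sigma=1.0):
    """Likelihood of one observation under states s = 0 (mean -mu) and s = 1 (mean +mu)."""
    means = np.array([-mu, mu])
    return np.exp(-0.5 * ((obs - means) / sigma) ** 2)

def update_posterior(post, obs, n):
    """One update of the joint posterior P(s, a) after the n-th observation (n >= 1)."""
    n_states, n_counts = post.shape
    new = np.zeros_like(post)
    eps = (np.arange(n_counts) + 1.0) / (n + 1.0)   # expected hazard for each count a
    lik = likelihood(obs)
    for s in range(n_states):
        for a in range(n_counts):
            stay = (1.0 - eps[a]) * post[s, a]                          # no change point now
            switch = eps[a - 1] * post[1 - s, a - 1] if a > 0 else 0.0  # change point now
            new[s, a] = lik[s] * (stay + switch)
    return new / new.sum()

# Usage: a two-state environment that switches with true hazard rate 0.1.
rng = np.random.default_rng(0)
T, hazard, state = 200, 0.1, 1
post = np.zeros((2, T + 1))
post[:, 0] = 0.5                  # flat prior over states, zero change points so far
for n in range(1, T + 1):
    if rng.random() < hazard:
        state = 1 - state
    obs = rng.normal(1.0 if state == 1 else -1.0, 1.0)
    post = update_posterior(post, obs, n)

state_belief = post.sum(axis=1)   # marginal posterior over the current state
rate_estimate = (post.sum(axis=0) * np.arange(T + 1)).sum() / T   # E[change points] / time
print(state_belief, rate_estimate)
```

Marginalizing the joint posterior over change-point counts gives the belief about the current state, while the expected count divided by elapsed time gives a rough estimate of the change rate, which is why the posterior over counts is the central quantity; the number of possible counts grows with time, motivating the low-dimensional approximations discussed in the abstract.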

Comments

This is a post-print version of "Evidence Accumulation and Change Rate Inference in Dynamic Environments" published in Neural Computation.

© 2017 Massachusetts Institute of Technology

https://www.mitpressjournals.org/loi/neco
