POSTER PRESENTATION - Open Access

Capacity measurement of a recurrent inhibitory neural network

Chun-Wei Yuan*, Christian Leibold

From the Twentieth Annual Computational Neuroscience Meeting: CNS*2011. Stockholm, Sweden, 23-28 July 2011.

Inhibitory neurons are considered to play a central role as rhythm generators and in shaping feed-forward receptive fields. While much attention has been paid to such effects on excitatory neurons, little has been done to study the ability of inhibitory neurons to process information directly. Here we present a model that investigates the computational capacity of a recurrent inhibitory neural network.

Our work quantifies the performance of a recurrent network of inhibitory integrate-and-fire neurons in canonical classification tasks. The model begins with parallel, independent excitatory Poisson inputs connected to the recurrent network. The network output is then fed forward to a linear read-out classifier. An identical network with zero synaptic connectivity serves as a benchmark. The analysis compares the capacities of the two set-ups, at 95% accuracy, as a function of parameters such as the inhibitory weight and the network size.

We find that, in general, neurons with faster time constants provide more computational power. Furthermore, there is an optimal weight among the inhibitory neurons that yields at least a 20% improvement in network performance (Figure 1). Inhibition suppresses overdriven, stereotypical firing and thereby supports an efficient, sparse encoding of temporal information. This illustrates that the nonlinearity of a recurrent, dynamical network provides more computational capacity than the simple feed-forward linear expansion offered by the unconnected network [1,2].

Figure 1. Classification accuracy vs. number of training patterns for a fully connected inhibitory network of N = 100 neurons. Performance of the zero-connectivity network is shown as a benchmark. The N-dimensional network output is binned into n patterns with a bin size of 30 ms, and the linear classifier is trained to separate the first n/2 patterns from the latter n/2. The input weight, with exponentially decaying post-synaptic current (psc, 100 ms time constant), is sub-threshold. Inhibitory network weights are reported relative to the input weight, although the inhibitory psc time constants are much shorter (8 ms).

Published: 18 July 2011

References
1. Maass W, Natschläger T, Markram H: Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput 2002, 14:2531-2560.
2. Jaeger H, Haas H: Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 2004, 304:78-80.

doi:10.1186/1471-2202-12-S1-P196
Cite this article as: Yuan and Leibold: Capacity measurement of a recurrent inhibitory neural network. BMC Neuroscience 2011, 12(Suppl 1):P196. http://www.biomedcentral.com/1471-2202/12/S1/P196

* Correspondence: yuan@bio.lmu.de
Division of Neurobiology, Department of Biology II, Ludwig Maximilians Universität, 82152 Martinsried, Germany
© 2011 Yuan and Leibold; licensee BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
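The benchmark described in the abstract can be summarised in a minimal sketch (not the authors' code): N = 100 inhibitory leaky integrate-and-fire neurons, driven by independent excitatory Poisson inputs, are either fully connected by inhibitory synapses or left unconnected; their spike counts in 30 ms bins form the patterns that a linear read-out must split into a first and a second half. The network size, bin size, psc time constants (100 ms input, 8 ms inhibitory) and the zero-connectivity benchmark follow the figure caption; the membrane time constant, input rate, threshold, weights and the least-squares read-out are illustrative assumptions not taken from the poster.

```python
import numpy as np
from numpy.random import default_rng

rng = default_rng(0)

# --- parameters taken from the abstract / figure caption ----------------
N       = 100        # fully connected inhibitory network of N = 100 neurons
bin_ms  = 30.0       # read-out bin size (ms)
tau_in  = 100.0      # input psc time constant (ms)
tau_inh = 8.0        # inhibitory psc time constant (ms)

# --- parameters the abstract does not specify (assumed here) ------------
dt       = 1.0       # integration step (ms)
tau_m    = 20.0      # membrane time constant (ms)
v_thresh = 1.0       # spike threshold (arbitrary units)
v_reset  = 0.0
rate_in  = 0.02      # Poisson input rate per ms (20 Hz), one input per neuron
w_in     = 0.8       # sub-threshold input weight
w_inh    = 0.5 * w_in  # inhibitory weight, expressed relative to the input weight


def simulate(w_rec, n_bins):
    """Simulate the inhibitory LIF network and return binned spike counts
    (shape: n_bins x N), one pattern per 30 ms bin."""
    steps  = int(n_bins * bin_ms / dt)
    v      = np.zeros(N)               # membrane potentials
    i_in   = np.zeros(N)               # excitatory input psc
    i_inh  = np.zeros(N)               # recurrent inhibitory psc
    counts = np.zeros((n_bins, N))
    for t in range(steps):
        # independent excitatory Poisson drive with exponentially decaying psc
        i_in  += -dt / tau_in * i_in + w_in * (rng.random(N) < rate_in * dt)
        i_inh += -dt / tau_inh * i_inh
        # leaky integrate-and-fire dynamics
        v += dt / tau_m * (-v + i_in - i_inh)
        spiked = v >= v_thresh
        v[spiked] = v_reset
        # each spike inhibits all other neurons (all-to-all, no self-coupling)
        i_inh += w_rec * (spiked.sum() - spiked)
        counts[int(t * dt // bin_ms), spiked] += 1
    return counts


def separation_accuracy(w_rec, n_patterns):
    """Train a linear read-out to separate the first n/2 patterns from the
    latter n/2 and return its accuracy, as in Figure 1 (n_patterns even)."""
    X = simulate(w_rec, n_patterns)                  # patterns x neurons
    y = np.repeat([0, 1], n_patterns // 2)
    A = np.c_[X, np.ones(len(X))]                    # add a bias column
    # least-squares linear classifier (the poster only says "linear")
    w, *_ = np.linalg.lstsq(A, 2.0 * y - 1.0, rcond=None)
    return ((A @ w > 0) == y).mean()


if __name__ == "__main__":
    for n in (50, 100, 150):
        acc_rec  = separation_accuracy(w_inh, n)     # recurrent network
        acc_zero = separation_accuracy(0.0, n)       # zero-connectivity benchmark
        print(f"n = {n:3d} patterns: recurrent {acc_rec:.2f}, unconnected {acc_zero:.2f}")
```

Increasing n_patterns toward and beyond N, and sweeping w_inh, mimics the capacity measurement reported in the poster: the capacity of each set-up is the largest number of patterns the read-out can still separate at 95% accuracy.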