{"title":"Neurally plausible mechanisms for learning selective and invariant representations.","authors":"Fabio Anselmi, Ankit Patel, Lorenzo Rosasco","doi":"10.1186/s13408-020-00088-7","DOIUrl":"https://doi.org/10.1186/s13408-020-00088-7","url":null,"abstract":"<p><p>Coding for visual stimuli in the ventral stream is known to be invariant to object identity preserving nuisance transformations. Indeed, much recent theoretical and experimental work suggests that the main challenge for the visual cortex is to build up such nuisance invariant representations. Recently, artificial convolutional networks have succeeded in both learning such invariant properties and, surprisingly, predicting cortical responses in macaque and mouse visual cortex with unprecedented accuracy. However, some of the key ingredients that enable such success-supervised learning and the backpropagation algorithm-are neurally implausible. This makes it difficult to relate advances in understanding convolutional networks to the brain. In contrast, many of the existing neurally plausible theories of invariant representations in the brain involve unsupervised learning, and have been strongly tied to specific plasticity rules. To close this gap, we study an instantiation of simple-complex cell model and show, for a broad class of unsupervised learning rules (including Hebbian learning), that we can learn object representations that are invariant to nuisance transformations belonging to a finite orthogonal group. These findings may have implications for developing neurally plausible theories and models of how the visual cortex or artificial neural networks build selectivity for discriminating objects and invariance to real-world nuisance transformations.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"12"},"PeriodicalIF":2.3,"publicationDate":"2020-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00088-7","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38284789","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A sub-Riemannian model of the visual cortex with frequency and phase.","authors":"E Baspinar, A Sarti, G Citti","doi":"10.1186/s13408-020-00089-6","DOIUrl":"https://doi.org/10.1186/s13408-020-00089-6","url":null,"abstract":"<p><p>In this paper, we present a novel model of the primary visual cortex (V1) based on orientation, frequency, and phase selective behavior of V1 simple cells. We start from the first-level mechanisms of visual perception, receptive profiles. The model interprets V1 as a fiber bundle over the two-dimensional retinal plane by introducing orientation, frequency, and phase as intrinsic variables. Each receptive profile on the fiber is mathematically interpreted as rotated, frequency modulated, and phase shifted Gabor function. We start from the Gabor function and show that it induces in a natural way the model geometry and the associated horizontal connectivity modeling of the neural connectivity patterns in V1. We provide an image enhancement algorithm employing the model framework. The algorithm is capable of exploiting not only orientation but also frequency and phase information existing intrinsically in a two-dimensional input image. We provide the experimental results corresponding to the enhancement algorithm.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"11"},"PeriodicalIF":2.3,"publicationDate":"2020-07-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00089-6","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38209111","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Methods to assess binocular rivalry with periodic stimuli.","authors":"Farzaneh Darki, James Rankin","doi":"10.1186/s13408-020-00087-8","DOIUrl":"https://doi.org/10.1186/s13408-020-00087-8","url":null,"abstract":"<p><p>Binocular rivalry occurs when the two eyes are presented with incompatible stimuli and perception alternates between these two stimuli. This phenomenon has been investigated in two types of experiments: (1) Traditional experiments where the stimulus is fixed, (2) eye-swap experiments in which the stimulus periodically swaps between eyes many times per second (Logothetis et al. in Nature 380(6575):621-624, 1996). In spite of the rapid swapping between eyes, perception can be stable for many seconds with specific stimulus parameter configurations. Wilson introduced a two-stage, hierarchical model to explain both types of experiments (Wilson in Proc. Natl. Acad. Sci. 100(24):14499-14503, 2003). Wilson's model and other rivalry models have been only studied with bifurcation analysis for fixed inputs and different types of dynamical behavior that can occur with periodically forcing inputs have not been investigated. Here we report (1) a more complete description of the complex dynamics in the unforced Wilson model, (2) a bifurcation analysis with periodic forcing. Previously, bifurcation analysis of the Wilson model with fixed inputs has revealed three main types of dynamical behaviors: Winner-takes-all (WTA), Rivalry oscillations (RIV), Simultaneous activity (SIM). Our results have revealed richer dynamics including mixed-mode oscillations (MMOs) and a period-doubling cascade, which corresponds to low-amplitude WTA (LAWTA) oscillations. On the other hand, studying rivalry models with numerical continuation shows that periodic forcing with high frequency (e.g. 18 Hz, known as flicker) modulates the three main types of behaviors that occur with fixed inputs with forcing frequency (WTA-Mod, RIV-Mod, SIM-Mod). However, dynamical behavior will be different with low frequency periodic forcing (around 1.5 Hz, so-called swap). In addition to WTA-Mod and SIM-Mod, cycle skipping, multi-cycle skipping and chaotic dynamics are found. This research provides a framework for either assessing binocular rivalry models to check consistency with empirical results, or for better understanding neural dynamics and mechanisms necessary to implement a minimal binocular rivalry model.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"10"},"PeriodicalIF":2.3,"publicationDate":"2020-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00087-8","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38052872","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Understanding the dynamics of biological and neural oscillator networks through exact mean-field reductions: a review.","authors":"Christian Bick, Marc Goodfellow, Carlo R Laing, Erik A Martens","doi":"10.1186/s13408-020-00086-9","DOIUrl":"10.1186/s13408-020-00086-9","url":null,"abstract":"<p><p>Many biological and neural systems can be seen as networks of interacting periodic processes. Importantly, their functionality, i.e., whether these networks can perform their function or not, depends on the emerging collective dynamics of the network. Synchrony of oscillations is one of the most prominent examples of such collective behavior and has been associated both with function and dysfunction. Understanding how network structure and interactions, as well as the microscopic properties of individual units, shape the emerging collective dynamics is critical to find factors that lead to malfunction. However, many biological systems such as the brain consist of a large number of dynamical units. Hence, their analysis has either relied on simplified heuristic models on a coarse scale, or the analysis comes at a huge computational cost. Here we review recently introduced approaches, known as the Ott-Antonsen and Watanabe-Strogatz reductions, allowing one to simplify the analysis by bridging small and large scales. Thus, reduced model equations are obtained that exactly describe the collective dynamics for each subpopulation in the oscillator network via few collective variables only. The resulting equations are next-generation models: Rather than being heuristic, they exactly link microscopic and macroscopic descriptions and therefore accurately capture microscopic properties of the underlying system. At the same time, they are sufficiently simple to analyze without great computational effort. In the last decade, these reduction methods have become instrumental in understanding how network structure and interactions shape the collective dynamics and the emergence of synchrony. We review this progress based on concrete examples and outline possible limitations. Finally, we discuss how linking the reduced models with experimental data can guide the way towards the development of new treatment approaches, for example, for neurological disease.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"9"},"PeriodicalIF":2.3,"publicationDate":"2020-05-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7253574/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37983291","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spatially extended balanced networks without translationally invariant connectivity.","authors":"Christopher Ebsch, Robert Rosenbaum","doi":"10.1186/s13408-020-00085-w","DOIUrl":"https://doi.org/10.1186/s13408-020-00085-w","url":null,"abstract":"<p><p>Networks of neurons in the cerebral cortex exhibit a balance between excitation (positive input current) and inhibition (negative input current). Balanced network theory provides a parsimonious mathematical model of this excitatory-inhibitory balance using randomly connected networks of model neurons in which balance is realized as a stable fixed point of network dynamics in the limit of large network size. Balanced network theory reproduces many salient features of cortical network dynamics such as asynchronous-irregular spiking activity. Early studies of balanced networks did not account for the spatial topology of cortical networks. Later works introduced spatial connectivity structure, but were restricted to networks with translationally invariant connectivity structure in which connection probability depends on distance alone and boundaries are assumed to be periodic. Spatial connectivity structure in cortical network does not always satisfy these assumptions. We use the mathematical theory of integral equations to extend the mean-field theory of balanced networks to account for more general dependence of connection probability on the spatial location of pre- and postsynaptic neurons. We compare our mathematical derivations to simulations of large networks of recurrently connected spiking neuron models.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"8"},"PeriodicalIF":2.3,"publicationDate":"2020-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00085-w","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37932716","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Geometry of color perception. Part 1: structures and metrics of a homogeneous color space.","authors":"Edoardo Provenzi","doi":"10.1186/s13408-020-00084-x","DOIUrl":"https://doi.org/10.1186/s13408-020-00084-x","url":null,"abstract":"<p><p>This is the first half of a two-part paper dealing with the geometry of color perception. Here we analyze in detail the seminal 1974 work by H.L. Resnikoff, who showed that there are only two possible geometric structures and Riemannian metrics on the perceived color space [Formula: see text] compatible with the set of Schrödinger's axioms completed with the hypothesis of homogeneity. We recast Resnikoff's model into a more modern colorimetric setting, provide a much simpler proof of the main result of the original paper, and motivate the need of psychophysical experiments to confute or confirm the linearity of background transformations, which act transitively on [Formula: see text]. Finally, we show that the Riemannian metrics singled out by Resnikoff through an axiom on invariance under background transformations are not compatible with the crispening effect, thus motivating the need of further research about perceptual color metrics.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"7"},"PeriodicalIF":2.3,"publicationDate":"2020-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00084-x","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37928960","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Correction to: Linking demyelination to compound action potential dispersion with a spike-diffuse-spike approach.","authors":"Richard Naud, André Longtin","doi":"10.1186/s13408-020-00083-y","DOIUrl":"https://doi.org/10.1186/s13408-020-00083-y","url":null,"abstract":"<p><p>Following publication of the original article (Naud and Longtin in J Math Neurosci 9:3, 2019), the authors noticed a mistake in the first paragraph within \"Altered propagation\".</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"6"},"PeriodicalIF":2.3,"publicationDate":"2020-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00083-y","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37855406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mesoscopic population equations for spiking neural networks with synaptic short-term plasticity.","authors":"Valentin Schmutz, Wulfram Gerstner, Tilo Schwalger","doi":"10.1186/s13408-020-00082-z","DOIUrl":"https://doi.org/10.1186/s13408-020-00082-z","url":null,"abstract":"<p><p>Coarse-graining microscopic models of biological neural networks to obtain mesoscopic models of neural activities is an essential step towards multi-scale models of the brain. Here, we extend a recent theory for mesoscopic population dynamics with static synapses to the case of dynamic synapses exhibiting short-term plasticity (STP). The extended theory offers an approximate mean-field dynamics for the synaptic input currents arising from populations of spiking neurons and synapses undergoing Tsodyks-Markram STP. The approximate mean-field dynamics accounts for both finite number of synapses and correlation between the two synaptic variables of the model (utilization and available resources) and its numerical implementation is simple. Comparisons with Monte Carlo simulations of the microscopic model show that in both feedforward and recurrent networks, the mesoscopic mean-field model accurately reproduces the first- and second-order statistics of the total synaptic input into a postsynaptic neuron and accounts for stochastic switches between Up and Down states and for population spikes. The extended mesoscopic population theory of spiking neural networks with STP may be useful for a systematic reduction of detailed biophysical models of cortical microcircuits to numerically efficient and mathematically tractable mean-field models.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"5"},"PeriodicalIF":2.3,"publicationDate":"2020-04-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-00082-z","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37807047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Phase-dependence of response curves to deep brain stimulation and their relationship: from essential tremor patient data to a Wilson-Cowan model.","authors":"Benoit Duchet, Gihan Weerasinghe, Hayriye Cagnan, Peter Brown, Christian Bick, Rafal Bogacz","doi":"10.1186/s13408-020-00081-0","DOIUrl":"10.1186/s13408-020-00081-0","url":null,"abstract":"<p><p>Essential tremor manifests predominantly as a tremor of the upper limbs. One therapy option is high-frequency deep brain stimulation, which continuously delivers electrical stimulation to the ventral intermediate nucleus of the thalamus at about 130 Hz. Constant stimulation can lead to side effects, it is therefore desirable to find ways to stimulate less while maintaining clinical efficacy. One strategy, phase-locked deep brain stimulation, consists of stimulating according to the phase of the tremor. To advance methods to optimise deep brain stimulation while providing insights into tremor circuits, we ask the question: can the effects of phase-locked stimulation be accounted for by a canonical Wilson-Cowan model? We first analyse patient data, and identify in half of the datasets significant dependence of the effects of stimulation on the phase at which stimulation is provided. The full nonlinear Wilson-Cowan model is fitted to datasets identified as statistically significant, and we show that in each case the model can fit to the dynamics of patient tremor as well as to the phase response curve. The vast majority of top fits are stable foci. The model provides satisfactory prediction of how patient tremor will react to phase-locked stimulation by predicting patient amplitude response curves although they were not explicitly fitted. We also approximate response curves of the significant datasets by providing analytical results for the linearisation of a stable focus model, a simplification of the Wilson-Cowan model in the stable focus regime. We report that the nonlinear Wilson-Cowan model is able to describe response to stimulation more precisely than the linearisation.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":"10 1","pages":"4"},"PeriodicalIF":2.3,"publicationDate":"2020-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7105566/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9413882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sparse identification of contrast gain control in the fruit fly photoreceptor and amacrine cell layer.","authors":"Aurel A Lazar, Nikul H Ukani, Yiyin Zhou","doi":"10.1186/s13408-020-0080-5","DOIUrl":"https://doi.org/10.1186/s13408-020-0080-5","url":null,"abstract":"<p><p>The fruit fly's natural visual environment is often characterized by light intensities ranging across several orders of magnitude and by rapidly varying contrast across space and time. Fruit fly photoreceptors robustly transduce and, in conjunction with amacrine cells, process visual scenes and provide the resulting signal to downstream targets. Here, we model the first step of visual processing in the photoreceptor-amacrine cell layer. We propose a novel divisive normalization processor (DNP) for modeling the computation taking place in the photoreceptor-amacrine cell layer. The DNP explicitly models the photoreceptor feedforward and temporal feedback processing paths and the spatio-temporal feedback path of the amacrine cells. We then formally characterize the contrast gain control of the DNP and provide sparse identification algorithms that can efficiently identify each the feedforward and feedback DNP components. The algorithms presented here are the first demonstration of tractable and robust identification of the components of a divisive normalization processor. The sparse identification algorithms can be readily employed in experimental settings, and their effectiveness is demonstrated with several examples.</p>","PeriodicalId":54271,"journal":{"name":"Journal of Mathematical Neuroscience","volume":" ","pages":"3"},"PeriodicalIF":2.3,"publicationDate":"2020-02-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s13408-020-0080-5","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37637630","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}