Publications

2019

Combrisson, Etienne, Raphael Vallat, Christian O’Reilly, Mainak Jas, Annalisa Pascarella, Anne-Lise Saive, Thomas Thiery, et al. 2019. “Visbrain: A Multi-Purpose GPU-Accelerated Open-Source Suite for Multimodal Brain Data Visualization.” Frontiers in Neuroinformatics 13: 14. https://doi.org/10.3389/fninf.2019.00014.

We present Visbrain, an open-source Python package that offers a comprehensive visualization suite for neuroimaging and electrophysiological brain data. Visbrain consists of two levels of abstraction: (1) objects, which are highly configurable neuro-oriented visual primitives (3D brain, source connectivity, etc.), and (2) graphical user interfaces for higher-level interactions. The object level offers flexible and modular tools to produce and automate the production of figures, using an approach similar to Matplotlib subplots. The second level visually connects these objects by controlling their properties and interactions through graphical interfaces. The current release of Visbrain (version 0.4.2) contains 14 different objects and three responsive graphical user interfaces built with PyQt: Signal, for the inspection of time series and spectral properties; Brain, for any visualization involving a 3D brain; and Sleep, for polysomnographic data visualization and sleep analysis. Each module has been developed in tight collaboration with end users, primarily neuroscientists and domain experts, who bring their experience to make Visbrain as transparent as possible to the recording modality (e.g., intracranial EEG, scalp EEG, MEG, anatomical and functional MRI). Visbrain is developed on top of VisPy, a Python package providing high-performance 2D and 3D visualization by leveraging the computational power of the graphics card. Visbrain is available on GitHub and comes with documentation, examples, and datasets (http://visbrain.org).
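
To give a feel for the object-level API described in the abstract, here is a minimal sketch of composing a brain and a set of sources into a scene. It assumes the 0.4.x `visbrain.objects` interface; object names and keyword arguments may differ across releases, and the source coordinates are made up.

```python
# Minimal sketch of the Visbrain object-level API (assumed 0.4.x interface;
# argument names may differ across releases).
import numpy as np
from visbrain.objects import BrainObj, SourceObj, SceneObj

# A translucent template brain and 20 random source locations (hypothetical, in mm).
b_obj = BrainObj('B1', translucent=True)
xyz = np.random.uniform(-50, 50, (20, 3))
s_obj = SourceObj('sources', xyz, color='red')

# Objects are laid out in a scene much like Matplotlib subplots.
sc = SceneObj(size=(800, 400))
sc.add_to_subplot(b_obj, row=0, col=0)
sc.add_to_subplot(s_obj, row=0, col=0, title='Brain + sources')
sc.preview()  # opens an interactive GPU-accelerated window
```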

Iavarone, Elisabetta, Jane Yi, Ying Shi, Bas-Jan Zandt, Christian O’Reilly, Werner Van Geit, Christian Rössert, Henry Markram, and Sean L. Hill. 2019. “Experimentally-Constrained Biophysical Models of Tonic and Burst Firing Modes in Thalamocortical Neurons.” PLoS Computational Biology 15 (5): e1006753. https://doi.org/10.1371/journal.pcbi.1006753.

Somatosensory thalamocortical (TC) neurons from the ventrobasal (VB) thalamus are central components in the flow of sensory information between the periphery and the cerebral cortex, and participate in the dynamic regulation of thalamocortical states, including wakefulness and sleep. This property is reflected at the cellular level by the ability to generate action potentials in two distinct firing modes, called tonic firing and low-threshold bursting. Although the general properties of TC neurons are known, we still lack a detailed characterization of their morphological and electrical properties in the VB thalamus. The aim of this study was to build biophysically detailed models of VB TC neurons explicitly constrained by experimental data from rats. We recorded the electrical activity of VB neurons (N = 49) and reconstructed morphologies in 3D (N = 50) by applying standardized protocols. After identifying distinct electrical types, we used multi-objective optimization to fit single-neuron electrical models (e-models), which yielded multiple solutions consistent with the experimental data. The models were tested for generalization using electrical stimuli and neuron morphologies not used during fitting. A local sensitivity analysis revealed that the e-models are robust to small parameter changes and that all parameters were constrained by one or more features. The e-models, when tested in combination with different morphologies, showed that the electrical behavior is substantially preserved when changing dendritic structure and that the e-models were not overfit to a specific morphology. The models and their analysis show that automatic parameter search can capture complex firing behavior, such as the co-existence of tonic firing and low-threshold bursting, over a wide range of parameter sets and in combination with different neuron morphologies.
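
As a rough illustration of the fitting principle described above (not the authors' actual pipeline), each candidate parameter set can be scored on several electrical features at once, with the error on each feature expressed as a deviation from the experimental mean in units of the experimental standard deviation; a multi-objective optimizer then keeps the non-dominated solutions. All feature names and values below are hypothetical.

```python
# Illustrative sketch of multi-objective scoring of e-model candidates
# (hypothetical feature names and values; not the pipeline used in the paper).
import numpy as np

# Experimental targets: mean and standard deviation of each electrical feature.
targets = {
    'spike_count_tonic': (12.0, 2.0),   # spikes per depolarizing current step
    'burst_spike_count': (4.0, 1.0),    # spikes in a low-threshold burst
    'resting_potential': (-70.0, 3.0),  # mV
}

def objectives(features):
    """Return one z-score error per feature (lower is better)."""
    return np.array([abs(features[k] - mu) / sd for k, (mu, sd) in targets.items()])

def dominates(err_a, err_b):
    """Candidate A dominates B if it is no worse on every objective and better on at least one."""
    return np.all(err_a <= err_b) and np.any(err_a < err_b)

# Two hypothetical candidates produced by simulating the model with different parameters.
cand_a = objectives({'spike_count_tonic': 11.0, 'burst_spike_count': 4.5, 'resting_potential': -69.0})
cand_b = objectives({'spike_count_tonic': 16.0, 'burst_spike_count': 2.0, 'resting_potential': -60.0})
print(dominates(cand_a, cand_b))  # True: A is closer to experiment on every feature
```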

Shardlow, Matthew, Meizhi Ju, Maolin Li, Christian O’Reilly, Elisabetta Iavarone, John McNaught, and Sophia Ananiadou. 2019. “A Text Mining Pipeline Using Active and Deep Learning Aimed at Curating Information in Computational Neuroscience.” Neuroinformatics 17 (3): 391-406. https://doi.org/10.1007/s12021-018-9404-y.

The curation of neuroscience entities is crucial to ongoing efforts in neuroinformatics and computational neuroscience, such as those being deployed in the context of continuing large-scale brain modelling projects. However, manually sifting through thousands of articles for new information about modelled entities is a painstaking and low-reward task. Text mining can help a curator extract relevant information from this literature in a systematic way. We propose the application of text mining methods to the neuroscience literature. Specifically, two computational neuroscientists annotated a corpus of entities pertinent to neuroscience, using active learning techniques to enable swift, targeted annotation. We then trained machine learning models to recognise the entities that had been identified. The entities covered are Neuron Types, Brain Regions, Experimental Values, Units, Ion Currents, Channels and Conductances, and Model Organisms. We tested a traditional rule-based approach, a conditional random field, and a deep learning named entity recognition model, and found that the deep learning model was superior. Our final results show that we can detect a range of named entities of interest to the neuroscientist with a macro-averaged precision, recall, and F1 score of 0.866, 0.817, and 0.837, respectively. The contributions of this work are as follows: 1) We provide a set of Named Entity Recognition (NER) tools capable of detecting neuroscience entities with performance similar to or above that of prior work. 2) We propose a methodology for training NER tools for neuroscience that requires very little training data to achieve strong performance; it can be adapted to any sub-domain within neuroscience. 3) We provide a small corpus with annotations for multiple entity types, as well as annotation guidelines, to help others reproduce our experiments.
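
To make the reported macro-averaged scores concrete, here is a small sketch of how precision, recall, and F1 are computed per entity type and then macro-averaged. The per-entity counts are made up for illustration, not taken from the paper.

```python
# Sketch of macro-averaged NER evaluation (hypothetical counts, not the paper's data).
# For each entity type: (true positives, false positives, false negatives).
counts = {
    'NeuronType':  (80, 10, 15),
    'BrainRegion': (120, 20, 10),
    'IonCurrent':  (30, 8, 12),
}

def prf(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Macro averaging weights every entity type equally, regardless of how frequent it is.
per_type = [prf(*c) for c in counts.values()]
macro_p, macro_r, macro_f1 = (sum(col) / len(per_type) for col in zip(*per_type))
print(f'macro P={macro_p:.3f} R={macro_r:.3f} F1={macro_f1:.3f}')
```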

O’Reilly, Christian, Florian Chapotot, Francesca Pittau, Nathalie Mella, and Fabienne Picard. 2019. “Nicotine Increases Sleep Spindle Activity.” Journal of Sleep Research 28 (4): e12800. https://doi.org/10.1111/jsr.12800.

Studies have shown that both nicotine and sleep spindles are associated with enhanced memorisation. Further, a few recent studies have shown how cholinergic input through nicotinic and muscarinic receptors can trigger or modulate sleep processes in general, and sleep spindles in particular. To better understand the interaction between nicotine and sleep spindles, we conducted a single-blind randomised study comparing the characteristics of sleep spindles in 10 healthy participants, each recorded for two nights: one with a nicotine patch and one with a sham patch. We investigated differences in sleep spindle duration, amplitude, intra-spindle oscillation frequency, and density (i.e., spindles per min). We found that under nicotine, spindles are more numerous (average increase: 0.057 spindles per min; 95% confidence interval: [0.025, 0.089]; p = .0004), have higher amplitude (average increase: 0.260 μV; confidence interval: [0.119, 0.402]; p = .0032), and last longer (average lengthening: 0.025 s; confidence interval: [0.017, 0.032]; p = 2.7e-11). These results suggest that nicotine can increase spindle activity by acting on nicotinic acetylcholine receptors, and they offer an attractive hypothesis for a common mechanism that may support the memorisation improvements previously reported to be associated with nicotine and with sleep spindles.
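
As an illustration of the kind of paired comparison behind numbers like these (simulated values, not the study's recordings or its actual statistical model), the per-participant difference between the nicotine and sham nights can be summarised by its mean, a 95% confidence interval from the t-distribution, and a paired t-test.

```python
# Sketch of a paired nicotine-vs-sham comparison (simulated data, not the study's recordings).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 10  # participants
sham = rng.normal(5.0, 1.0, n)               # e.g. spindle density on the sham night
nicotine = sham + rng.normal(0.06, 0.05, n)  # hypothetical small per-participant increase

diff = nicotine - sham
mean_diff = diff.mean()
sem = diff.std(ddof=1) / np.sqrt(n)
half_width = stats.t.ppf(0.975, df=n - 1) * sem  # half-width of the 95% confidence interval
t_stat, p_value = stats.ttest_rel(nicotine, sham)

print(f'mean difference = {mean_diff:.3f} '
      f'[{mean_diff - half_width:.3f}, {mean_diff + half_width:.3f}], p = {p_value:.4f}')
```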

2017