Decoding Neural Signals with a Compact and Interpretable Convolutional Neural Network

Artur Petrosyan, Mikhail Lebedev, Alexey Ossadtchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

In this work, we motivate and present a novel compact CNN. For architectures that combine adaptation in both space and time, we describe a theoretically justified approach to interpreting the temporal and spatial weights. We apply the proposed architecture to the Berlin BCI Competition IV dataset and to our own datasets to decode electrocorticogram (ECoG) into finger kinematics. Without feature engineering, our architecture delivers decoding accuracy similar to or better than that of the BCI competition winner. After training the network, we interpret the solution (spatial and temporal convolution weights) and extract physiologically meaningful patterns.
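The architecture outlined in the abstract (a spatial filtering stage followed by a temporal convolution per branch, with the learned weights later interpreted as physiological patterns) might be sketched as follows. All layer sizes, the envelope nonlinearity, and the covariance-based weight-to-pattern step are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_samples = 32, 500   # ECoG channels x time samples (illustrative)
n_branches, kernel_len = 4, 32    # spatio-temporal filter pairs (illustrative)

x = rng.standard_normal((n_channels, n_samples))

# Spatial stage: each branch learns a weight vector over channels.
W_spatial = rng.standard_normal((n_branches, n_channels))
spatially_filtered = W_spatial @ x            # (n_branches, n_samples)

# Temporal stage: a 1-D convolution (FIR filter) per branch.
W_temporal = rng.standard_normal((n_branches, kernel_len))
out_len = n_samples - kernel_len + 1
features = np.empty((n_branches, out_len))
for b in range(n_branches):
    features[b] = np.convolve(spatially_filtered[b], W_temporal[b], mode="valid")

# Envelope nonlinearity plus a linear readout to one kinematic trace.
envelope = np.abs(features)
w_readout = rng.standard_normal(n_branches)
kinematics = w_readout @ envelope             # (out_len,)

# Interpretation step (assumed here to follow the Haufe-style transform):
# multiplying spatial weights by the channel covariance recovers spatial
# patterns, which can be compared against physiological topographies.
C = np.cov(x)
patterns = (C @ W_spatial.T).T                # (n_branches, n_channels)
```

In this sketch the spatial stage is a plain matrix multiply (equivalent to a 1x1 convolution across channels), so the whole decoder stays compact and each weight vector remains directly inspectable.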

Original language: English
Title of host publication: Advances in Neural Computation, Machine Learning, and Cognitive Research IV - Selected Papers from the 22nd International Conference on Neuroinformatics, 2020
Editors: Boris Kryzhanovsky, Witali Dunin-Barkowski, Vladimir Redko, Yury Tiumentsev
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 420-428
Number of pages: 9
ISBN (Print): 9783030605766
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: 22nd International Conference on Neuroinformatics, 2020 - Moscow, Russian Federation
Duration: 12 Oct 2020 - 16 Oct 2020

Publication series

Name: Studies in Computational Intelligence
Volume: 925 SCI
ISSN (Print): 1860-949X
ISSN (Electronic): 1860-9503

Conference

Conference: 22nd International Conference on Neuroinformatics, 2020
Country/Territory: Russian Federation
City: Moscow
Period: 12/10/20 - 16/10/20

Keywords

  • Convolutional neural network
  • ECoG
  • Limb kinematics decoding
  • Machine learning
