Dropout-based active learning for regression

Evgenii Tsymbalov, Maxim Panov, Alexander Shapeev

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

12 Citations (Scopus)


Active learning is relevant and challenging for high-dimensional regression models when sample annotation is expensive. Yet most existing sampling methods cannot be applied to large-scale problems because they take too long to process the data. In this paper, we propose a fast active learning algorithm for regression, tailored to neural network models. It is based on uncertainty estimates obtained from the stochastic dropout output of the network. Experiments on both synthetic and real-world datasets show comparable or better performance than the baselines, depending on the accuracy metric. The approach generalizes to other deep learning architectures, and it can be used to systematically improve a machine-learning model, as it offers a computationally efficient way of sampling additional data.
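The core idea described in the abstract can be sketched as follows: run several stochastic forward passes with dropout kept active at prediction time, and query the pool points whose predictions vary the most. This is a minimal illustrative sketch, not the authors' implementation; the tiny MLP, its random weights, and all function names are hypothetical stand-ins for a real trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, W1, W2, p=0.5, T=50):
    """Run T stochastic forward passes of a toy 1-hidden-layer MLP,
    keeping dropout active at prediction time."""
    preds = []
    for _ in range(T):
        h = np.maximum(x @ W1, 0.0)        # ReLU hidden layer
        mask = rng.random(h.shape) > p     # random dropout mask
        h = h * mask / (1.0 - p)           # inverted-dropout scaling
        preds.append(h @ W2)
    return np.stack(preds)                 # shape (T, n_points, 1)

def select_most_uncertain(x_pool, W1, W2, k=5, T=50):
    """Active-learning query step: pick the k pool points whose
    dropout predictions have the highest variance."""
    preds = mc_dropout_predict(x_pool, W1, W2, T=T)
    variance = preds.var(axis=0).ravel()   # per-point predictive variance
    return np.argsort(variance)[-k:]       # indices of the top-k points

# Toy usage: random weights standing in for a trained regression network
W1 = rng.normal(size=(3, 16))
W2 = rng.normal(size=(16, 1))
x_pool = rng.normal(size=(100, 3))         # unlabeled candidate pool
query_idx = select_most_uncertain(x_pool, W1, W2, k=5)
print(query_idx)
```

The selected indices would then be sent for annotation and the model retrained, which is what makes the procedure cheap: uncertainty comes from repeated forward passes only, with no extra model fitting.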

Original language: English
Title of host publication: Analysis of Images, Social Networks and Texts - 7th International Conference, AIST 2018, Revised Selected Papers
Editors: Alexander Panchenko, Wil M. van der Aalst, Michael Khachay, Panos M. Pardalos, Vladimir Batagelj, Natalia Loukachevitch, Goran Glavaš, Dmitry I. Ignatov, Sergei O. Kuznetsov, Olessia Koltsova, Irina A. Lomazova, Andrey V. Savchenko, Amedeo Napoli, Marcello Pelillo
Publisher: Springer Verlag
Number of pages: 12
ISBN (Print): 9783030110260
Publication status: Published - 2018
Event: 7th International Conference on Analysis of Images, Social Networks and Texts, AIST 2018 - Moscow, Russian Federation
Duration: 5 Jul 2018 to 7 Jul 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11179 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 7th International Conference on Analysis of Images, Social Networks and Texts, AIST 2018
Country/Territory: Russian Federation


Keywords

  • Active learning
  • Dropout
  • Neural networks
  • Regression
  • Uncertainty quantification


