Distributed coordinate descent for L1-regularized logistic regression

Ilya Trofimov, Alexander Genkin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

Logistic regression is a widely used technique for classification and class-probability estimation in text mining, biometrics, and clickstream data analysis. Solving logistic regression with L1-regularization in a distributed setting is an important problem that arises when the training dataset is too large to fit in the memory of a single machine. We present d-GLMNET, a new algorithm for solving L1-regularized logistic regression in a distributed setting. We show empirically that it is superior to distributed online learning via truncated gradient.
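The paper's distributed d-GLMNET algorithm is not reproduced here, but the objective it minimizes — logistic loss plus an L1 penalty — can be illustrated with a minimal single-machine cyclic coordinate descent sketch. The toy data, the simple curvature bound, and the `cd_l1_logreg` name are illustrative assumptions, not the authors' implementation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cd_l1_logreg(X, y, lam, n_iters=200):
    """Cyclic coordinate descent for L1-regularized logistic regression.

    Minimizes  sum_i log(1 + exp(-y_i * w.x_i)) + lam * ||w||_1
    with labels y_i in {-1, +1}. Each coordinate takes a majorized
    gradient step (curvature bound 0.25 * sum_i x_ij^2, since the
    sigmoid's derivative is at most 1/4) followed by soft-thresholding.
    This is an illustrative sketch, not the paper's d-GLMNET updates.
    """
    n, d = len(X), len(X[0])
    w = [0.0] * d
    margins = [0.0] * n  # cached dot products w . x_i
    for _ in range(n_iters):
        for j in range(d):
            # gradient of the smooth logistic loss w.r.t. w_j
            g = sum(-y[i] * X[i][j] * sigmoid(-y[i] * margins[i])
                    for i in range(n))
            H = 0.25 * sum(X[i][j] ** 2 for i in range(n)) + 1e-12
            z = w[j] - g / H
            # soft-thresholding: shrink toward zero by lam / H
            new_wj = math.copysign(max(abs(z) - lam / H, 0.0), z)
            delta = new_wj - w[j]
            if delta != 0.0:
                for i in range(n):
                    margins[i] += delta * X[i][j]
                w[j] = new_wj
    return w

# Toy example (hypothetical data): feature 0 is informative,
# feature 1 is noise; the L1 penalty drives w[1] exactly to zero.
X = [[1.0, 0.1], [2.0, -0.2], [-1.5, 0.3], [-2.0, -0.1]]
y = [1, 1, -1, -1]
w = cd_l1_logreg(X, y, lam=0.5)
```

The soft-thresholding step is what produces exact zeros in the weight vector, which is the sparsity property highlighted in the keywords; d-GLMNET's contribution is performing such updates over data partitioned across machines.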

Original language: English
Title of host publication: Analysis of Images, Social Networks and Texts - 4th International Conference, AIST 2015, Revised Selected Papers
Editors: Valeri G. Labunets, Mikhail Yu. Khachay, Alexander Panchenko, Natalia Konstantinova, Dmitry I. Ignatov
Publisher: Springer Verlag
Pages: 243-254
Number of pages: 12
ISBN (Print): 9783319261225
DOIs
Publication status: Published - 2015
Externally published: Yes
Event: 4th International Conference on Analysis of Images, Social Networks and Texts, AIST 2015 - Yekaterinburg, Russian Federation
Duration: 9 Apr 2015 - 11 Apr 2015

Publication series

Name: Communications in Computer and Information Science
Volume: 542
ISSN (Print): 1865-0929

Conference

Conference: 4th International Conference on Analysis of Images, Social Networks and Texts, AIST 2015
Country/Territory: Russian Federation
City: Yekaterinburg
Period: 9/04/15 - 11/04/15

Keywords

  • L1-regularization
  • Large-scale learning
  • Logistic regression
  • Sparsity
