Deep ensembles for imbalanced classification

Nataliia Kozlovskaia, Alexey Zaytsev

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    11 Citations (Scopus)

    Abstract

    Most standard classification algorithms perform poorly in the case of imbalanced classes, i.e. when the overwhelming majority of samples belongs to a single class. Many approaches address this problem, among them SMOTE and SMOTE boosting, but the common approach prefers overly simplistic models, which leads to degraded performance. Recent advances in statistical learning theory provide more adequate complexity penalties for weak classifiers, which stem from the Rademacher complexity terms in ensemble generalization bounds. By adopting these advances and introducing a sample weight correction based on the classification margin at each boosting iteration, we obtain more precise models for imbalanced classification problems.
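    The abstract outlines a boosting loop whose sample weights are additionally corrected using the classification margin at each iteration. The exact correction rule is not given in the abstract, so the sketch below makes an illustrative assumption: after the standard AdaBoost exponential reweighting, samples whose normalized ensemble margin falls below a threshold are upweighted. All function names (`margin_boost`, `fit_stump`, `margin_floor`, the 1.5 upweighting factor) are hypothetical, not from the paper.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustively fit a weighted decision stump (axis-aligned threshold)."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] >= t, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, t, s)
    return best, best_err

def stump_predict(stump, X):
    j, t, s = stump
    return s * np.where(X[:, j] >= t, 1, -1)

def margin_boost(X, y, n_rounds=10, margin_floor=0.1):
    """AdaBoost-style loop with a hypothetical margin-based weight correction.

    Labels y must be in {-1, +1}. Samples whose normalized ensemble margin
    falls below `margin_floor` get extra weight, focusing later rounds on
    hard (often minority-class) examples.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    F = np.zeros(n)                        # running ensemble score
    for _ in range(n_rounds):
        stump, err = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        stumps.append(stump)
        alphas.append(alpha)
        F += alpha * pred
        w *= np.exp(-alpha * y * pred)     # standard AdaBoost update ...
        margin = y * F / np.sum(np.abs(alphas))
        w[margin < margin_floor] *= 1.5    # ... plus margin-based correction
        w /= w.sum()
    return stumps, alphas

def ensemble_predict(stumps, alphas, X):
    scores = sum(a * stump_predict(s, X) for a, s in zip(alphas, stumps))
    return np.sign(scores)
```

    On a toy imbalanced set (e.g. 90 majority vs 10 minority samples), the correction keeps low-margin minority points weighted, so later stumps are fit to them rather than to the already well-classified majority.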

    Original language: English
    Title of host publication: Proceedings - 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
    Editors: Xuewen Chen, Bo Luo, Feng Luo, Vasile Palade, M. Arif Wani
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 908-913
    Number of pages: 6
    ISBN (Electronic): 9781538614174
    DOIs
    Publication status: Published - 2017
    Event: 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017 - Cancun, Mexico
    Duration: 18 Dec 2017 - 21 Dec 2017

    Publication series

    Name: Proceedings - 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
    Volume: 2017-December

    Conference

    Conference: 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
    Country/Territory: Mexico
    City: Cancun
    Period: 18/12/17 - 21/12/17

    Keywords

    • deep boosting
    • imbalanced classification
    • smote
