Entropy-penalized semidefinite programming

Mikhail Krechetov, Jakub Mareček, Yury Maximov, Martin Takáč

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    4 Citations (Scopus)

    Abstract

    Low-rank methods for semi-definite programming (SDP) have gained a lot of interest recently, especially in machine learning applications. Their analysis often involves determinant-based or Schatten-norm penalties, which are difficult to implement in practice due to their high computational cost. In this paper, we propose Entropy-Penalized Semi-Definite Programming (EP-SDP), which provides a unified framework for a broad class of penalty functions used in practice to promote a low-rank solution. We show that EP-SDP problems admit an efficient numerical algorithm with (almost) linear time complexity of the gradient computation; this makes it useful for many machine learning and optimization problems. We illustrate the practical efficiency of our approach on several combinatorial optimization and machine learning problems.
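
    The core idea behind spectral penalties of this kind can be illustrated with the von Neumann entropy -tr(X log X) of a trace-normalized PSD matrix: a low-rank matrix concentrates its spectrum on few eigenvalues and therefore has low entropy, so penalizing entropy promotes low rank. The sketch below is illustrative only (the function name and setup are this note's own, not the paper's implementation):

    ```python
    import numpy as np

    def von_neumann_entropy(X):
        """Entropy -tr(X log X) of PSD matrix X, normalized to unit trace.

        Illustrative sketch: a low-rank X has a spectrum concentrated on
        few eigenvalues and hence low entropy, so an entropy penalty
        promotes low-rank SDP solutions.
        """
        X = X / np.trace(X)
        w = np.linalg.eigvalsh(X)
        w = w[w > 1e-12]  # drop numerically zero eigenvalues
        return float(-np.sum(w * np.log(w)))

    rng = np.random.default_rng(0)
    n, r = 8, 2
    V = rng.standard_normal((n, r))
    X_low = V @ V.T               # rank-2 PSD matrix
    G = rng.standard_normal((n, n))
    X_full = G @ G.T              # full-rank PSD matrix

    # The rank-2 matrix has strictly lower entropy than the full-rank one.
    assert von_neumann_entropy(X_low) < von_neumann_entropy(X_full)
    ```

    The maximizer of this entropy is the scaled identity (entropy log n), while any rank-one matrix attains entropy 0, which is why driving the penalty down drives the rank down.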

    Original language: English
    Title of host publication: Proceedings of the 28th International Joint Conference on Artificial Intelligence, IJCAI 2019
    Editors: Sarit Kraus
    Publisher: International Joint Conferences on Artificial Intelligence
    Pages: 1123-1129
    Number of pages: 7
    ISBN (Electronic): 9780999241141
    DOIs
    Publication status: Published - 2019
    Event: 28th International Joint Conference on Artificial Intelligence, IJCAI 2019 - Macao, China
    Duration: 10 Aug 2019 - 16 Aug 2019

    Publication series

    Name: IJCAI International Joint Conference on Artificial Intelligence
    Volume: 2019-August
    ISSN (Print): 1045-0823

    Conference

    Conference: 28th International Joint Conference on Artificial Intelligence, IJCAI 2019
    Country/Territory: China
    City: Macao
    Period: 10/08/19 - 16/08/19
