Asymptotic properties of local sampling on Manifold

Yury Aleksandrovich Yanovich

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)

Abstract

In many applications, real high-dimensional data occupy only a very small part of the high-dimensional 'observation space', and their intrinsic dimension is small. The most popular model for such data is the Manifold model, which assumes that the data lie on or near an unknown manifold, the Data Manifold (DM), of lower dimensionality embedded in an ambient high-dimensional input space (the Manifold assumption about high-dimensional data). Manifold Learning is a Dimensionality Reduction problem under the Manifold assumption about the processed data; its goal is to construct a low-dimensional parameterization of the DM (global low-dimensional coordinates on the DM) from a finite dataset sampled from the DM. The Manifold assumption means that a local neighborhood of each manifold point is equivalent to an area of low-dimensional Euclidean space. Because of this, most Manifold Learning algorithms consist of two parts: a 'local part', in which certain characteristics reflecting the low-dimensional local structure of the neighborhoods of all sample points are constructed, and a 'global part', in which global low-dimensional coordinates on the DM are constructed by solving a convex optimization problem for a specific cost function depending on the local characteristics. Statistical properties of the 'local part' are closely connected with local sampling on the manifold, which is considered in this study.
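The 'local part' described above can be illustrated with a minimal sketch (not the paper's method): points are sampled from a 2-dimensional sphere embedded in R^3, and local PCA on the k-nearest-neighbor neighborhood of a sample point recovers the low-dimensional (tangent-plane) structure the Manifold assumption predicts. The parameter choices (n, k) are illustrative assumptions.

```python
# A hedged illustration of the 'local part' of a typical Manifold Learning
# algorithm: local PCA on k-NN neighborhoods of points sampled from a manifold.
import numpy as np

rng = np.random.default_rng(0)

# Sample n points uniformly from the unit sphere S^2 in R^3 (a 2-manifold).
n = 2000
X = rng.standard_normal((n, 3))
X /= np.linalg.norm(X, axis=1, keepdims=True)

def local_structure(X, i, k=25):
    """Singular values of the centered k-NN neighborhood of point i."""
    d = np.linalg.norm(X - X[i], axis=1)
    idx = np.argsort(d)[1:k + 1]          # k nearest neighbors, excluding i
    nbrs = X[idx] - X[idx].mean(axis=0)   # center the neighborhood
    return np.linalg.svd(nbrs, compute_uv=False)

s = local_structure(X, 0)
print(s)
# The first two singular values dominate the third: locally the sample is
# close to a 2-dimensional plane, matching the intrinsic dimension of S^2.
print(s[2] / s[1])
```

In a full algorithm, the top singular vectors at each point (an estimate of the tangent space) would be passed to the 'global part', which aligns these local characteristics into global low-dimensional coordinates via a convex optimization problem.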

Original language: English
Pages (from-to): 157-175
Number of pages: 19
Journal: Journal of Mathematics and Statistics
Volume: 12
Issue number: 3
DOIs
Publication status: Published - 2016
Externally published: Yes

Keywords

  • Asymptotic expansions
  • Large deviations
  • Manifold learning
