Learning adaptive multiscale approximations to data and functions near low-dimensional sets

Vigogna S.
2016-01-01

Abstract

In the setting where a data set in ℝ^D consists of samples from a probability measure ρ concentrated on or near an unknown d-dimensional set M, with D large but d ≪ D, we consider two sets of problems: geometric approximation of M and regression of a function f on M. In the first case we construct multiscale low-dimensional empirical approximations of M, which are adaptive when M has geometric regularity that may vary at different locations and scales, and give performance guarantees. In the second case we exploit these empirical geometric approximations to construct multiscale approximations to f on M, which adapt to the unknown regularity of f even when this varies at different scales and locations. We prove guarantees showing that we attain the same learning rates as if f were defined on a Euclidean domain of dimension d, instead of on an unknown manifold M. All algorithms have complexity O(n log n), with constants scaling linearly in D and exponentially in d.
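
To make the geometric half of the abstract concrete, below is a minimal sketch (illustrative only, not the authors' implementation) of a multiscale local-PCA approximation in the spirit of the paper: the data are recursively partitioned across scales, each cell is approximated by its rank-d principal affine plane, and points are projected onto the plane of their cell. All function and parameter names (build_partition, local_pca_approximation, max_depth, min_points) are hypothetical.

```python
# Illustrative sketch: multiscale rank-d local-PCA approximation of a point
# cloud near a d-dimensional set M embedded in R^D. Not the authors' code.

import numpy as np
from sklearn.cluster import KMeans


def build_partition(X, max_depth, min_points):
    """Recursively split the data with 2-means; return a list of levels,
    each level being a list of index arrays (the cells at that scale)."""
    levels = [[np.arange(len(X))]]
    for _ in range(max_depth):
        next_level = []
        for idx in levels[-1]:
            if len(idx) <= min_points:
                next_level.append(idx)  # cell too small to split further
                continue
            labels = KMeans(n_clusters=2, n_init=10).fit_predict(X[idx])
            for k in (0, 1):
                child = idx[labels == k]
                if len(child) > 0:
                    next_level.append(child)
        levels.append(next_level)
    return levels


def local_pca_approximation(X, cells, d):
    """Project the points of each cell onto its rank-d local PCA plane."""
    X_hat = np.empty_like(X)
    for idx in cells:
        center = X[idx].mean(axis=0)
        Y = X[idx] - center
        # top-d principal directions of the centered cell
        _, _, Vt = np.linalg.svd(Y, full_matrices=False)
        V = Vt[: min(d, Vt.shape[0])]
        X_hat[idx] = center + Y @ V.T @ V
    return X_hat


if __name__ == "__main__":
    # Toy example: n points near a 1-dimensional circle in R^10 (d = 1, D = 10).
    rng = np.random.default_rng(0)
    n, D, d = 2000, 10, 1
    t = rng.uniform(0, 2 * np.pi, n)
    X = np.zeros((n, D))
    X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
    X += 0.01 * rng.standard_normal((n, D))  # samples near M, not on it

    levels = build_partition(X, max_depth=6, min_points=20)
    for j, cells in enumerate(levels):
        err = np.sqrt(
            np.mean(np.sum((X - local_pca_approximation(X, cells, d)) ** 2, axis=1))
        )
        print(f"scale {j}: {len(cells)} cells, mean approximation error {err:.4f}")
```

On this toy example the projection error decreases as the partition refines, which is the kind of scale-dependent behavior an adaptive multiscale construction can exploit.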
2016 IEEE Information Theory Workshop (ITW)
Cambridge, UK
2016
International relevance
2016
Sector MAT/06
English
Conference paper
Liao, W., Maggioni, M., Vigogna, S. (2016). Learning adaptive multiscale approximations to data and functions near low-dimensional sets. In 2016 IEEE Information Theory Workshop (ITW) (pp. 226-230). New York, USA: Institute of Electrical and Electronics Engineers [10.1109/ITW.2016.7606829].
Liao, W; Maggioni, M; Vigogna, S
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2108/361045
Citations
  • PMC: ND
  • Scopus: 6
  • Web of Science (ISI): 6