An Augmented Lagrangian Method for Optimization Problems with Nonnegative Orthogonality Constraints

楊 宇初

(Supervisor: Kazuhiro Sato / Mathematical Informatics Laboratory 5)

Thesis PDF (Yang.pdf)
Research Summary

Problem Settings
Optimization with nonnegative orthogonality constraints is widely applied in machine learning and data science, e.g., orthogonal nonnegative matrix factorization (ONMF), sparse principal component analysis (PCA), nonnegative Laplacian embedding (NLE), and discriminative nonnegative spectral clustering (DNSC). Although it has numerous applications, existing studies on it are limited. In this thesis, we propose an algorithm based on the augmented Lagrangian method combined with a proximal alternating minimization approach. We also prove that the sequence generated by the algorithm has at least one convergent subsequence, and that the subproblem solver is guaranteed to find an approximate critical point for any given tolerance. Numerical results on random data demonstrate the efficiency of the proposed method compared with a Riemannian optimization approach.
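To make the setting above concrete, the following is a minimal sketch of one plausible augmented Lagrangian scheme for a problem of this form: it assumes an objective f(X) = -trace(X^T A X) minimized subject to X^T X = I and X >= 0, and uses a splitting X = Y so that the orthogonality and nonnegativity constraints are handled by separate subproblems inside the augmented Lagrangian outer loop. The function names (al_nonneg_orth, stiefel_proj), the choice of objective, the inner projected-gradient solver, and all parameter values are assumptions made purely for illustration; this is not the exact algorithm or subproblem solver analyzed in the thesis.

import numpy as np

def stiefel_proj(M):
    """Project M onto the Stiefel manifold {X : X^T X = I} via its polar factor."""
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt

def al_nonneg_orth(A, p, rho=10.0, outer_iters=50, inner_iters=20, step=1e-2, seed=0):
    """Illustrative sketch: minimize f(X) = -trace(X^T A X) subject to
    X^T X = I and X >= 0, using the splitting X = Y (Y carries nonnegativity)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = stiefel_proj(rng.standard_normal((n, p)))
    Y = np.maximum(X, 0.0)
    Lam = np.zeros((n, p))          # multiplier for the coupling constraint X = Y
    for _ in range(outer_iters):
        # X-step: a few projected-gradient iterations on the Stiefel manifold
        for _ in range(inner_iters):
            grad = -2.0 * A @ X + Lam + rho * (X - Y)
            X = stiefel_proj(X - step * grad)
        # Y-step: closed-form entrywise projection onto the nonnegative orthant
        Y = np.maximum(X + Lam / rho, 0.0)
        # Dual ascent step on the multiplier
        Lam = Lam + rho * (X - Y)
    return X, Y

For instance, calling al_nonneg_orth(B @ B.T, 3) for a random matrix B returns a pair (X, Y) in which X is orthonormal, Y is nonnegative, and the dual update drives the two copies together, so a limit point approximately satisfies both constraint sets at once.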
Reflections on the Master's Thesis

This research taught me how to conduct serious research from beginning to end. I would like to extend my sincere gratitude to my supervisor, Lecturer Kazuhiro Sato, who walked me through all the stages of writing this thesis and spent a great deal of time giving instructive advice and useful suggestions.

