The original paper is in English. Non-English content has been machine-translated and may contain typographical errors or mistranslations (e.g., some numerals may appear as "XNUMX").
Nonnegative matrix factorization (NMF) and its extensions such as Nonnegative Tensor Factorization (NTF) have become prominent techniques for blind source separation (BSS), analysis of image databases, data mining and other information retrieval and clustering applications. In this paper we propose a family of efficient algorithms for NMF/NTF, as well as sparse nonnegative coding and representation, that has many potential applications in computational neuroscience, multi-sensory processing, compressed sensing and multidimensional data analysis. We have developed a class of optimized local algorithms which are referred to as Hierarchical Alternating Least Squares (HALS) algorithms. For these purposes, we have performed sequential constrained minimization on a set of squared Euclidean distances. We then extend this approach to robust cost functions using the alpha and beta divergences and derive flexible update rules. Our algorithms are locally stable and work well for NMF-based blind source separation (BSS) not only for the over-determined case but also for the under-determined (over-complete) case (i.e., for a system which has fewer sensors than sources) if the data are sufficiently sparse. The NMF learning rules are extended and generalized for N-th order nonnegative tensor factorization (NTF). Moreover, these algorithms can be tuned to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the accuracy and computational performance of the developed algorithms, especially with the use of a multi-layer hierarchical NMF approach [3].
Copyrights notice
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Andrzej CICHOCKI, Anh-Huy PHAN, "Fast Local Algorithms for Large Scale Nonnegative Matrix and Tensor Factorizations" in IEICE TRANSACTIONS on Fundamentals, vol. E92-A, no. 3, pp. 708-721, March 2009, doi: 10.1587/transfun.E92.A.708.
Abstract: Nonnegative matrix factorization (NMF) and its extensions such as Nonnegative Tensor Factorization (NTF) have become prominent techniques for blind sources separation (BSS), analysis of image databases, data mining and other information retrieval and clustering applications. In this paper we propose a family of efficient algorithms for NMF/NTF, as well as sparse nonnegative coding and representation, that has many potential applications in computational neuroscience, multi-sensory processing, compressed sensing and multidimensional data analysis. We have developed a class of optimized local algorithms which are referred to as Hierarchical Alternating Least Squares (HALS) algorithms. For these purposes, we have performed sequential constrained minimization on a set of squared Euclidean distances. We then extend this approach to robust cost functions using the alpha and beta divergences and derive flexible update rules. Our algorithms are locally stable and work well for NMF-based blind source separation (BSS) not only for the over-determined case but also for an under-determined (over-complete) case (i.e., for a system which has less sensors than sources) if data are sufficiently sparse. The NMF learning rules are extended and generalized for N-th order nonnegative tensor factorization (NTF). Moreover, these algorithms can be tuned to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the accuracy and computational performance of the developed algorithms, especially, with usage of multi-layer hierarchical NMF approach [3].
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.E92.A.708/_p
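The abstract describes the HALS approach as sequential constrained minimization of a set of squared Euclidean distances, updating the nonnegative factors one component at a time. The following is a minimal sketch of that column-wise idea for plain two-factor NMF in Python/NumPy. It is not the paper's exact algorithm and omits the alpha/beta-divergence, sparsity, tensor, and multi-layer variants described in the abstract; the function name hals_nmf and its parameters (n_iter, eps, seed) are illustrative assumptions.

import numpy as np

def hals_nmf(X, rank, n_iter=200, eps=1e-12, seed=0):
    # Hedged sketch: approximate a nonnegative matrix X (m x n) by W @ H,
    # with W >= 0 (m x rank) and H >= 0 (rank x n), minimizing the squared
    # Euclidean (Frobenius) error by cycling through rows of H and columns of W.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Sweep over rows of H with W fixed (sequential, Gauss-Seidel style).
        WtX = W.T @ X      # rank x n
        WtW = W.T @ W      # rank x rank
        for j in range(rank):
            # Closed-form nonnegative least-squares step for the j-th row of H.
            num = WtX[j] - WtW[j] @ H + WtW[j, j] * H[j]
            H[j] = np.maximum(num / max(WtW[j, j], eps), 0.0)
        # Sweep over columns of W with H fixed.
        XHt = X @ H.T      # m x rank
        HHt = H @ H.T      # rank x rank
        for j in range(rank):
            num = XHt[:, j] - W @ HHt[:, j] + HHt[j, j] * W[:, j]
            W[:, j] = np.maximum(num / max(HHt[j, j], eps), 0.0)
    return W, H

# Example usage on synthetic nonnegative data:
# X = np.abs(np.random.default_rng(1).random((60, 40)))
# W, H = hals_nmf(X, rank=5)
# print(np.linalg.norm(X - W @ H) / np.linalg.norm(X))

This only illustrates the local, component-wise update pattern under a squared Euclidean cost; the paper's other variants replace this step with alpha- or beta-divergence updates and extend it to N-th order tensors.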
@ARTICLE{e92-a_3_708,
author={Andrzej CICHOCKI and Anh-Huy PHAN},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Fast Local Algorithms for Large Scale Nonnegative Matrix and Tensor Factorizations},
year={2009},
volume={E92-A},
number={3},
pages={708-721},
abstract={Nonnegative matrix factorization (NMF) and its extensions such as Nonnegative Tensor Factorization (NTF) have become prominent techniques for blind sources separation (BSS), analysis of image databases, data mining and other information retrieval and clustering applications. In this paper we propose a family of efficient algorithms for NMF/NTF, as well as sparse nonnegative coding and representation, that has many potential applications in computational neuroscience, multi-sensory processing, compressed sensing and multidimensional data analysis. We have developed a class of optimized local algorithms which are referred to as Hierarchical Alternating Least Squares (HALS) algorithms. For these purposes, we have performed sequential constrained minimization on a set of squared Euclidean distances. We then extend this approach to robust cost functions using the alpha and beta divergences and derive flexible update rules. Our algorithms are locally stable and work well for NMF-based blind source separation (BSS) not only for the over-determined case but also for an under-determined (over-complete) case (i.e., for a system which has less sensors than sources) if data are sufficiently sparse. The NMF learning rules are extended and generalized for N-th order nonnegative tensor factorization (NTF). Moreover, these algorithms can be tuned to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the accuracy and computational performance of the developed algorithms, especially, with usage of multi-layer hierarchical NMF approach [3].},
keywords={},
doi={10.1587/transfun.E92.A.708},
ISSN={1745-1337},
month={March},}
TY - JOUR
TI - Fast Local Algorithms for Large Scale Nonnegative Matrix and Tensor Factorizations
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 708
EP - 721
AU - Andrzej CICHOCKI
AU - Anh-Huy PHAN
PY - 2009
DO - 10.1587/transfun.E92.A.708
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E92-A
IS - 3
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - March 2009
AB - Nonnegative matrix factorization (NMF) and its extensions such as Nonnegative Tensor Factorization (NTF) have become prominent techniques for blind sources separation (BSS), analysis of image databases, data mining and other information retrieval and clustering applications. In this paper we propose a family of efficient algorithms for NMF/NTF, as well as sparse nonnegative coding and representation, that has many potential applications in computational neuroscience, multi-sensory processing, compressed sensing and multidimensional data analysis. We have developed a class of optimized local algorithms which are referred to as Hierarchical Alternating Least Squares (HALS) algorithms. For these purposes, we have performed sequential constrained minimization on a set of squared Euclidean distances. We then extend this approach to robust cost functions using the alpha and beta divergences and derive flexible update rules. Our algorithms are locally stable and work well for NMF-based blind source separation (BSS) not only for the over-determined case but also for an under-determined (over-complete) case (i.e., for a system which has less sensors than sources) if data are sufficiently sparse. The NMF learning rules are extended and generalized for N-th order nonnegative tensor factorization (NTF). Moreover, these algorithms can be tuned to different noise statistics by adjusting a single parameter. Extensive experimental results confirm the accuracy and computational performance of the developed algorithms, especially, with usage of multi-layer hierarchical NMF approach [3].
ER -