15th European Conference on Artificial Intelligence
July 21-26, 2002, Lyon, France
Botond Szatmary, Barnabas Poczos, Julian Eggert, Edgar Koerner, Andras Lorincz
Properties of a novel algorithm, non-negative matrix factorization (NMF), are studied. NMF can discover substructures and can estimate their presence or absence, which makes it attractive for completing missing information. We have studied the working and learning capabilities of NMF networks. Performance was improved by adding the sparse code shrinkage (SCS) algorithm to remove structureless noise; we found that SCS noise filtering considerably improves NMF performance. To improve noise resistance in the learning phase, weight sparsification was studied: a sparsifying prior was applied to the NMF weight matrix. Learning capability versus noise content was measured with and without the sparsifying prior. In accordance with observations made by others on independent component analysis, we found that weight sparsification improved learning capabilities in the presence of Gaussian noise.
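The two ingredients named in the abstract can be sketched briefly. Below is a minimal, illustrative version only, not the authors' exact network: standard Lee-Seung multiplicative NMF updates on random non-negative data, followed by a soft-threshold shrinkage as a simple stand-in for the SCS denoising nonlinearity. All dimensions, the rank `r`, and the threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

V = rng.random((20, 30))           # non-negative data matrix, columns are samples (illustrative)
r = 5                              # assumed number of hidden components
W = rng.random((20, r)) + 1e-3     # non-negative weight (basis) matrix
H = rng.random((r, 30)) + 1e-3     # non-negative hidden code

eps = 1e-9
for _ in range(200):
    # multiplicative updates preserve non-negativity of W and H
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

def shrink(u, t):
    """Soft-threshold shrinkage: small, noise-like coefficients are set to zero."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

H_denoised = shrink(H, 0.01)       # hypothetical threshold; SCS derives it from a noise model
```

Multiplicative updates keep both factors non-negative without an explicit projection step, which is why they are the usual starting point for NMF experiments.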
Keywords: Neural Networks, Machine Learning
Citation: Botond Szatmary, Barnabas Poczos, Julian Eggert, Edgar Koerner, Andras Lorincz: Non-negative matrix factorization extended by sparse code shrinkage and weight sparsification algorithms. In F. van Harmelen (ed.): ECAI 2002, Proceedings of the 15th European Conference on Artificial Intelligence, IOS Press, Amsterdam, 2002, pp. 503-507.