Faculty Publications
A statistical framework for non-negative matrix factorization based on generalized dual divergence
Neural Netw. 2021 Mar 26;140:309-324
PMID: 33892302
Abstract
A statistical framework for non-negative matrix factorization based on generalized dual Kullback-Leibler divergence, which includes members of the exponential family of models, is proposed. A family of algorithms, including variants under sparsity constraints, is developed using this framework, and their convergence is proven using the Expectation-Maximization algorithm. The framework generalizes some existing methods for different noise structures and contrasts with the recently developed quasi-likelihood approach, thus providing a useful alternative for non-negative matrix factorization. A measure to evaluate the goodness-of-fit of the resulting factorization is described. The performance of the proposed methods is evaluated extensively using real-life and simulated data, and their utility in unsupervised and semi-supervised learning is illustrated with an application in cancer genomics. This framework can be viewed from the perspective of reinforcement learning, and can be adapted to incorporate discriminant functions and multi-layered neural networks within a deep learning paradigm.
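For context, a minimal sketch of divergence-based NMF is given below. This is the classical multiplicative-update algorithm for the (primal) KL divergence D(V || WH), in the style of Lee and Seung, not the paper's generalized dual-divergence EM algorithms; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Multiplicative-update NMF minimizing the KL divergence
    D(V || WH). An illustrative baseline only -- not the paper's
    generalized dual-divergence method."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Random non-negative initialization of the factors.
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Update H, holding W fixed (KL divergence is non-increasing).
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        # Update W, holding H fixed.
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True) + eps).T
    return W, H

# Example: factor a small non-negative matrix into rank-2 components.
V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf_kl(V, rank=2)
```

Each update is a fixed-point iteration that keeps the factors non-negative by construction; convergence arguments for such updates (and for the paper's algorithms) follow a majorization/EM-style monotonicity argument.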
Notes
Devarajan, Karthik. Journal Article. United States. Neural Netw. 2021 Mar 26;140:309-324. doi: 10.1016/j.neunet.2021.03.020. ISSN: 1879-2782.