50th Anniversary SRC: June 1-4, 2014 Galveston, Texas
 

Speaker: Jianhua Huang


Two-way Regularized Matrix Decomposition
Matrix decomposition (or low-rank matrix approximation) plays an important role in various statistical learning problems. Regularization has been introduced to matrix decomposition to achieve stability, especially when the row or column dimension is high. When both the row and column domains of the matrix are structured, it is natural to employ a two-way regularization penalty in low-rank matrix approximation. This talk gives a selected review of regularized matrix decomposition, discusses the importance of considering invariance when designing the two-way penalty, and shows some undesirable properties of penalties used in the literature when this invariance is ignored.
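
To illustrate the general idea (not the specific penalties or results of the talk), the sketch below fits a rank-one approximation X ≈ u vᵀ with quadratic roughness penalties on both the row and column factors, solved by alternating penalized least squares. The second-difference penalty matrices, the penalty weights lam_u and lam_v, and the alternating scheme are illustrative assumptions, not the method presented in the talk.

import numpy as np

def second_diff_penalty(n):
    # Second-difference roughness penalty D'D, a common smoothness choice (assumption).
    D = np.diff(np.eye(n), n=2, axis=0)
    return D.T @ D

def two_way_regularized_rank1(X, lam_u=1.0, lam_v=1.0, n_iter=100, tol=1e-8):
    # Rank-one approximation X ~ u v^T with quadratic penalties on u and v,
    # fitted by alternating penalized least squares (illustrative sketch).
    n, p = X.shape
    Omega_u = second_diff_penalty(n)
    Omega_v = second_diff_penalty(p)
    # Initialize from the leading singular vectors.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    u, v = s[0] * U[:, 0], Vt[0]
    for _ in range(n_iter):
        u_old, v_old = u.copy(), v.copy()
        # Update u: minimize ||X - u v^T||^2 + lam_u * u' Omega_u u, holding v fixed.
        u = np.linalg.solve((v @ v) * np.eye(n) + lam_u * Omega_u, X @ v)
        # Update v: minimize ||X - u v^T||^2 + lam_v * v' Omega_v v, holding u fixed.
        v = np.linalg.solve((u @ u) * np.eye(p) + lam_v * Omega_v, X.T @ u)
        if np.linalg.norm(u - u_old) + np.linalg.norm(v - v_old) < tol:
            break
    return u, v

Higher-rank approximations can be obtained by repeatedly applying such a rank-one fit to residuals; how the penalty should be constructed so that the solution behaves well under rescaling of the factors is exactly the kind of invariance question the talk addresses.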

Bio: Dr. Jianhua Huang received his B.S. degree in 1989 and M.S. degree in 1992, both in probability and statistics, from Beijing University, China, and his Ph.D. in statistics from the University of California, Berkeley in 1997. He is currently Professor of Statistics at Texas A&M University. His research interests include computational statistics, semi- and non-parametric statistical methods, statistical machine learning, and applied statistics. He is a Fellow of the ASA and the IMS.