Seminar | September 27 | 11 a.m.-12:30 p.m. | Zoom

Xinyi Zhong, Yale

Consortium for Data Analytics in Risk

When the dimension of the data is comparable to or larger than the number of samples, Principal Component Analysis (PCA) may exhibit problematic high-dimensional noise. In this work, we propose an Empirical Bayes PCA (EB-PCA) method that reduces this noise by estimating a joint prior distribution for the principal components. EB-PCA is based on the classical Kiefer-Wolfowitz nonparametric MLE for empirical Bayes estimation, distributional results derived from random matrix theory for the sample PCs, and iterative refinement using an Approximate Message Passing (AMP) algorithm. In theoretical "spiked" models, EB-PCA achieves Bayes-optimal estimation accuracy in the same settings as an oracle Bayes AMP procedure that knows the true priors. Empirically, EB-PCA significantly improves over PCA when there is strong prior structure, both in simulation and on quantitative benchmarks constructed from the 1000 Genomes Project and the International HapMap Project. An illustration is presented for analysis of gene expression data obtained by single-cell RNA-seq.
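The setting described above can be sketched with a toy rank-one spiked model: a low-rank signal buried in Gaussian noise, where the top singular vector from plain PCA is only partially aligned with the truth and a Bayes denoiser can sharpen it. Everything below is an illustrative assumption (parameter values, the `snr_eff` gain, the two-point prior), not the speaker's implementation; EB-PCA itself learns the prior from the data via the Kiefer-Wolfowitz NPMLE and iterates with AMP rather than applying a single known-prior shrinkage step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rank-one "spiked" data matrix: low-rank signal plus Gaussian noise.
# Parameter choices here are hypothetical, for illustration only.
n, p = 2000, 1000
u = rng.choice([-1.0, 1.0], size=n)   # left PC entries from a two-point prior
v = rng.choice([-1.0, 1.0], size=p)   # right PC entries from a two-point prior
snr = 0.1                             # signal strength (assumed)
X = snr * np.outer(u, v) / np.sqrt(n) + rng.normal(size=(n, p)) / np.sqrt(n)

# Plain PCA: the top right singular vector estimates v, but in high
# dimensions it is only partially correlated with the truth.
_, _, vt = np.linalg.svd(X, full_matrices=False)
v_pca = vt[0] * np.sqrt(p)            # rescale so entries are O(1)

def denoise_two_point(x, snr_eff):
    """Posterior mean of a +/-1 coordinate observed in Gaussian noise.

    This is the kind of Bayes denoiser an AMP iteration applies when the
    prior is known; EB-PCA instead estimates the prior from the data.
    """
    return np.tanh(snr_eff * x)

v_eb = denoise_two_point(v_pca, snr_eff=2.0)   # snr_eff is a stand-in gain

def alignment(a, b):
    """Absolute cosine similarity between an estimate and the true PC."""
    return abs(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

print(f"PCA alignment:      {alignment(v_pca, v):.3f}")
print(f"denoised alignment: {alignment(v_eb, v):.3f}")
```

In this regime the sample PC is informative but noisy, and entrywise shrinkage toward the prior typically increases its correlation with the true component, which is the effect EB-PCA exploits.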


Wouter Leenders, leenders@berkeley.edu, 510-

Status
Happening As Scheduled
Primary Event Type
Seminar
Location
Zoom
Performers
Xinyi Zhong, Yale (Speaker - Featured)
Event ID
147879