Regularized estimation of large-scale gene association networks using graphical Gaussian models
Open Access
- 24 November 2009
- journal article
- Published by Springer Science and Business Media LLC in BMC Bioinformatics
- Vol. 10 (1), 384
- https://doi.org/10.1186/1471-2105-10-384
Abstract
Graphical Gaussian models are popular tools for the estimation of (undirected) gene association networks from microarray data. A key issue when the number of variables greatly exceeds the number of samples is the estimation of the matrix of partial correlations. Since the (Moore-Penrose) inverse of the sample covariance matrix leads to poor estimates in this scenario, standard methods are inappropriate and adequate regularization techniques are needed. Popular approaches include biased estimates of the covariance matrix and high-dimensional regression schemes such as the Lasso and Partial Least Squares.

In this article, we investigate a general framework for combining regularized regression methods with the estimation of Graphical Gaussian models. This framework includes various existing methods as well as two new approaches based on ridge regression and the adaptive Lasso, respectively. These methods are extensively compared, both qualitatively and quantitatively, within a simulation study and through an application to six diverse real data sets. In addition, all proposed algorithms are implemented in the R package "parcor", available from the R repository CRAN.

In our simulation studies, the investigated non-sparse regression methods, i.e. Ridge Regression and Partial Least Squares, exhibit rather conservative behavior when combined with (local) false discovery rate multiple testing to decide whether an edge is present in the network. For networks with higher densities, the difference in performance between the methods decreases. For sparse networks, we confirm the Lasso's well-known tendency to select too many edges, whereas the two-stage adaptive Lasso is an interesting alternative that provides sparser solutions. In our simulations, both sparse and non-sparse methods are able to reconstruct networks with cluster structures.

On six real data sets, we also clearly distinguish between the results obtained using the non-sparse methods and those obtained using the sparse methods, where specification of the regularization parameter automatically implies model selection. In five out of six data sets, Partial Least Squares selects very dense networks. Furthermore, for data that violate the assumption of uncorrelated observations (due to replications), the Lasso and the adaptive Lasso yield very complex structures, indicating that they might not be well suited under these conditions. The shrinkage approach is more stable than the regression-based approaches when using subsampling.
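The framework described in the abstract rests on a standard identity: in a Gaussian graphical model, the partial correlation between genes i and j can be recovered from the coefficients of regressing each variable on all the others, via rho_ij = sign(b_ij) * sqrt(b_ij * b_ji). The following is a minimal Python sketch of that regression-based estimator using ridge regression on simulated data; it is an illustration of the general idea only, not the authors' "parcor" R implementation, and the choice of `alpha=1.0` and the simulated dimensions are arbitrary assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Simulated expression matrix: n samples (rows) x p genes (columns).
# Dimensions are illustrative; in practice p >> n for microarray data.
rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.standard_normal((n, p))

# Regress each gene j on all remaining genes; B[j, k] holds the
# ridge coefficient of gene k in the regression for gene j.
B = np.zeros((p, p))
for j in range(p):
    others = [k for k in range(p) if k != j]
    model = Ridge(alpha=1.0)  # regularization strength is an assumption
    model.fit(X[:, others], X[:, j])
    B[j, others] = model.coef_

# Partial correlation estimate: rho_ij = sign(b_ij) * sqrt(b_ij * b_ji).
# When b_ij and b_ji disagree in sign, the product is clipped to 0,
# so the estimate is 0 there; the result is symmetric by construction.
P = np.sign(B) * np.sqrt(np.clip(B * B.T, 0.0, None))
np.fill_diagonal(P, 1.0)
```

In the paper's framework, an edge between genes i and j would then be declared present or absent by applying (local) false discovery rate multiple testing to the entries of such a partial correlation matrix; the sparse methods (Lasso, adaptive Lasso) instead perform this selection implicitly through the regularization parameter.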