Reference: | Brem, R., and Kruglyak, L. (2005). The landscape of genetic complexity across 5,700 gene expression traits in yeast. Proceedings of the National Academy of Sciences, 102, 1572–1577. Brown, B., Miller, C. J., and Wolfson, J. (2017). ThrEEBoost: Thresholded boosting for variable selection and prediction via estimating equations. Journal of Computational and Graphical Statistics, 26, 579–588. Cai, T. Liu, W., and Luo, X. (2011). A constrained ℓ1 minimization approach to sparse precision matrix estimation. Journal of the American Statistical Association, 106, 594–607. Cai, T. Liu, W., and Luo, X. (2011). Package clime: Constrained L1-Minimization for Inverse (Covariance) Matrix Estimation. https://CRAN.R-project.org/package=clime. Carroll, R. J., Ruppert, D., Stefanski, L. A., and Crainiceanu, C. M. (2006). Measurement Error in Nonlinear Model. CRC Press Chapman and Hall, Boca Raton. Chatterjee, S. (2021). A new coefficient of correlation. Journal of the American Statistical Association, 116, 2009–2022. Chen, L.-P. (2020). Variable selection and estimation for the additive hazards model subject to left-truncation, right-censoring and measurement error in covariates. Journal of Statistical Computation and Simulation, 90, 3261–3300. Chen, L.-P. (2021). Feature screening based on distance correlation for ultrahigh-dimensional censored data with covariate measurement error. Computational Statistics, 36. 857–884. Chen, L.-P. (2022). Network-based discriminant analysis for multiclassification. Journal of Classification, 39. 410–431. Chen, L.-P. and Yi, G. Y. (2020). Model selection and model averaging for analysis of truncated and censored data with measurement error. Electronic Journal of Statistics, 14, 4054–4109. Chen, L.-P. and Yi, G. Y. (2021a). Analysis of noisy survival data with graphical proportional hazards measurement error models. Biometrics, 77, 956–969. Chen, L.-P. and Yi, G. Y. (2021b). Semiparametric methods for left-truncated and right-censored survival data with covariate measurement error. Annals of the Institute of Statistical Mathematics, 73, 481–517. Chen, L.-P. and Yi, G. Y. (2022). De-noising analysis of noisy data under mixed graphical models. Electronic Journal of Statistics, 16, 3861–3909. Chen, L.-P. (2023a). Estimation of graphical models: An overview of selected topics. International Statistical Review, In press. Chen, L.-P. (2023b). A note of feature screening via a rank-based coefficient of correlation. Biometrical Journal, 65, 2100373. Chen, L.-P. (2023c). Variable selection and estimation for misclassified binary responses and multivariate error-prone predictors. Journal of Computational and Graphical Statistics, In press. Dalal, O. and Rajaratnam, B. (2017). Sparse Gaussian graphical model estimation via alternating minimization. Biometrika, 104, 379–395. Friedman, J., Hastie, T., and Tibshirani, R. (2008). Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9, 432–441. Friedman, J., Hastie, T., and Tibshirani, R. (2019). Package glasso: Graphical Lasso: Estimation of Gaussian Graphical Models. https://CRAN.R-project.org/package=glasso. Hossin, M., and Sulaiman, M. N. (2015). A review on evaluation metrics for data classification evaluations. International Journal of Data Mining and Knowledge Management process, 5, 1–11. Hsieh, C.-J., Matyas A. Sustik, M.A., Dhillon, I.S., and Ravikumar, P. (2014) Package QUIC: Regularized sparse inverse covariance matrix estimation. https://CRAN.Rproject.org/package=QUIC. Jankov´a, J., and van de Geer, S. (2018). Inference in high-dimensional graphical models. In Handbook of Graphical Models Edited By Marloes Maathuis, Mathias Drton, Steffen Lauritzen, Martin Wainwright, 325–349. CRC Press, Boca Raton. Khan, J., Wei, J. S., Ringner, M., Saal, L. H., Ladanyi, M., Westermann, F., Berthold, F., Schwab, M., Antonescu, C. R., Peterson, C., and Meltzer, P. S. (2001). Classification and diagnostic prediction of cancers using gene expression profiling and artificial neural networks. Nature Medicine, 7, 673–679. Klaassen, S., Kueck, J., and Spindler, M. (2023). Uniform Inference in High-Dimensional Gaussian Graphical Models. Biometrika, 110, 51–68. Kullback, S. and Leibler, R. A. (1951). On information and sufficiency. The Annals of Accepted Article Mathematical Statistics, 22, 79–86. Lafferty, J., Liu, H., and Wasserman, L. (2012). Sparse nonparametric graphical models. Statistical Science, 27, 519–537. Li, T., Qian, C., Levina, E., and Zhu, J. (2020). High-dimensional gaussian graphical models on network-linked data. Journal of Machine Learning Research, 21, 1–45. Liang, S. and Liang, F. (2022). A double regression method for graphical modeling of highdimensional nonlinear and non-Gaussian data. Statistics and Its Interface, In press. Lin, L., Drton, M., and Shojaie, A. (2016). Estimation of high-dimensional graphical models using regularized score matching. Electronic Journal of Statistics, 10, 806–854. Liu, H., Han, F., Yuan, M., Lafferty, J.D., and Wasserman, L.A. (2012). High-dimensional semiparametric Gaussian copula graphical models. The Annals of Statistics, 40, 2293–2326. Liu, H., Lafferty, J.D., and Wasserman, L.A. (2009). The nonparanormal: semiparametric estimation of high dimensional undirected graphs. The Journal of Machine Learning Research, 10, 2295–2328. Liu, H. and Zhang, X. (2023). Frequentist model averaging for undirected Gaussian graphical models. Biometrics, 79, 2050–2062. Mazumder, R., and Hastie, T. (2012). Package dpglasso: Primal Graphical Lasso. https://CRAN.Rproject.org/package=dpglasso. Meinshausen, N. and B¨uhlmann, P. (2006). High-dimensional graphs and variable selection with the Lasso. The Annals of Statistics, 34, 1436–1462. Qiu, H., Han, F., Liu, H., and Caffo, B. (2016) Joint estimation of multiple graphical models from high dimensional time series. Journal of the Royal Statistical Society Series B: Statistical Methodology, 78, 487–504. Ravikumar, P., Wainwright, M. J., and Lafferty, J. (2010). High dimensional Ising model selection using ℓ1-regularized logistic regression. The Annals of Statistics, 38, 1287–1319. Ravikumar, P., Wainwright, M. J., Raskutti, G., and Yu, B. (2011). High-dimensional covariance estimation by minimizing ℓ1-penalized log determinant divergence. Electronic Journal of Statistics, 5, 935–980. Roy, A. and Dunson, D.B. (2020). Nonparametric graphical model for counts. Journal of Machine Learning Research, 21, 1–22. Shi, W., Ghosal, S., and Martin, R. (2021). Bayesian estimation of sparse precision matrices in the presence of Gaussian measurement error. Electronic Journal of Statistics, 15, 4545–4579. Sun, H. and Li, H. (2012). Robust Gaussian graphical modeling via ℓ1-penalization. Biometrics, 68, 1197–1206. Wainwright, M. J. (2019). High-Dimensional Statistics: A Non-Asymptotic Viewpoint. Cambridge University Press, Cambridge. Wan, Y.-W., Allen, G. I., Baker, Y., Yang, E., Ravikumar, P., and Liu, Z. (2015). Package XMRF: Markov Random Fields for High-Throughput Genetics Data. https://cran.rproject.org/web/packages/XMRF/. Wang, L., Chen, Z., Wang, C. D., and Li, R. (2020). Ultrahigh dimensional precision matrix estimation via refitted cross validation. Journal of Econometrics, 215, 118–130. Wolfson, J. (2011). EEBOOST: a general method for prediction and variables selection based on estimating equation. Journal of the American Statistical Association, 106, 295–305. Xue, L. and Zou, H. (2012). Regularized rank-based estimation of high-dimensional nonparanormal graphical models. The Annals of Statistics, 40, 2541–2571. Yang, Y., Dai, H., and Pan, J. (2023). Block-diagonal precision matrix regularization for ultra-high dimensional data. Computational Statistics and Data Analysis, 179, 107630. Yang, Z., Ning, Y., and Liu, H. (2018). On semiparametric exponential family graphical models. Journal of Machine Learning Research, 19, 1–59. Yi, G. Y. (2017). Statistical Analysis with Measurement Error and Misclassification: Strategy, Method and Application. Springer, New York. Yuan, M. and Lin, Y. (2007). Model selection and estimation in the Gaussian graphical model. Biometrika, 94, 19–35 . Zhao, T., Liu, H., Lafferty, J., and Wasserman, L. (2012). The huge package for highdimensional undirected graph estimation in R. Journal of Machine Learning Research, 13, 1059–1062. Zou, H. (2006) The adaptive Lasso and its oracle properties. Journal of the American Statistical Association, 101, 1418–1429 |