{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,18]],"date-time":"2026-04-18T06:48:21Z","timestamp":1776494901298,"version":"3.51.2"},"reference-count":25,"publisher":"MDPI AG","issue":"7","license":[{"start":{"date-parts":[[2022,7,11]],"date-time":"2022-07-11T00:00:00Z","timestamp":1657497600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"the Humanities and Social Science Research Project of Hebei Education Department","award":["SQ201110"],"award-info":[{"award-number":["SQ201110"]}]},{"name":"the Humanities and Social Science Research Project of Hebei Education Department","award":["KJCXTD-2022-02"],"award-info":[{"award-number":["KJCXTD-2022-02"]}]},{"name":"the Humanities and Social Science Research Project of Hebei Education Department","award":["QN202139"],"award-info":[{"award-number":["QN202139"]}]},{"name":"the Humanities and Social Science Research Project of Hebei Education Department","award":["22557688D"],"award-info":[{"award-number":["22557688D"]}]},{"name":"Hebei GEO University Science and Technology Innovation Team","award":["SQ201110"],"award-info":[{"award-number":["SQ201110"]}]},{"name":"Hebei GEO University Science and Technology Innovation Team","award":["KJCXTD-2022-02"],"award-info":[{"award-number":["KJCXTD-2022-02"]}]},{"name":"Hebei GEO University Science and Technology Innovation Team","award":["QN202139"],"award-info":[{"award-number":["QN202139"]}]},{"name":"Hebei GEO University Science and Technology Innovation Team","award":["22557688D"],"award-info":[{"award-number":["22557688D"]}]},{"name":"Basic scientific research Funds of Universities in Hebei Province","award":["SQ201110"],"award-info":[{"award-number":["SQ201110"]}]},{"name":"Basic scientific research Funds of Universities in Hebei Province","award":["KJCXTD-2022-02"],"award-info":[{"award-number":["KJCXTD-2022-02"]}]},{"name":"Basic 
scientific research Funds of Universities in Hebei Province","award":["QN202139"],"award-info":[{"award-number":["QN202139"]}]},{"name":"Basic scientific research Funds of Universities in Hebei Province","award":["22557688D"],"award-info":[{"award-number":["22557688D"]}]},{"name":"S&amp;T Program of Hebei","award":["SQ201110"],"award-info":[{"award-number":["SQ201110"]}]},{"name":"S&amp;T Program of Hebei","award":["KJCXTD-2022-02"],"award-info":[{"award-number":["KJCXTD-2022-02"]}]},{"name":"S&amp;T Program of Hebei","award":["QN202139"],"award-info":[{"award-number":["QN202139"]}]},{"name":"S&amp;T Program of Hebei","award":["22557688D"],"award-info":[{"award-number":["22557688D"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Symmetry"],"abstract":"<jats:p>This article introduces a novel two-stage variable selection method to solve the common asymmetry problem between the response variable and its influencing factors. In practical applications, we cannot correctly extract important factors from a large amount of complex and redundant data. However, the proposed method based on the relaxed lasso and the adaptive lasso, namely, the relaxed adaptive lasso, can achieve information symmetry because the variables it selects contain all the important information about the response variables. The goal of this paper is to preserve the relaxed lasso\u2019s superior variable selection speed while imposing varying penalties on different coefficients. Additionally, the proposed method enjoys favorable asymptotic properties, that is, consistency with a fast rate of convergence of order O_p(n^{-1}). 
The simulation demonstrates that the proper variable recovery, i.e., the number of significant variables selected, and the prediction accuracy of the relaxed adaptive lasso in a limited sample are superior to those of the regular lasso, relaxed lasso and adaptive lasso estimators.<\/jats:p>","DOI":"10.3390\/sym14071422","type":"journal-article","created":{"date-parts":[[2022,7,12]],"date-time":"2022-07-12T03:50:36Z","timestamp":1657597836000},"page":"1422","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":5,"title":["Relaxed Adaptive Lasso and Its Asymptotic Results"],"prefix":"10.3390","volume":"14","author":[{"given":"Rufei","family":"Zhang","sequence":"first","affiliation":[{"name":"College of Economics, Hebei GEO University, Shijiazhuang 050031, China"},{"name":"Research Center of Natural Resources Assets, Hebei GEO University, Shijiazhuang 050031, China"},{"name":"Hebei Province Mineral Resources Development and Management and the Transformation and Upgrading of Resources Industry Soft Science Research Base, Shijiazhuang 050031, China"}]},{"given":"Tong","family":"Zhao","sequence":"additional","affiliation":[{"name":"College of Economics, Hebei GEO University, Shijiazhuang 050031, China"}]},{"given":"Yajun","family":"Lu","sequence":"additional","affiliation":[{"name":"College of Economics, Hebei GEO University, Shijiazhuang 050031, China"},{"name":"Xiaomi Corporation, Beijing 100089, China"}]},{"given":"Xieting","family":"Xu","sequence":"additional","affiliation":[{"name":"College of Economics, Hebei GEO University, Shijiazhuang 050031, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,7,11]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"267","DOI":"10.1111\/j.2517-6161.1996.tb02080.x","article-title":"Regression shrinkage and selection via the lasso","volume":"58","author":"Tibshirani","year":"1996","journal-title":"J. R. Stat. Soc. Ser. 
B (Methodol.)"},{"key":"ref_2","unstructured":"Wang, S., Weng, H., and Maleki, A. (2017). Which bridge estimator is optimal for variable selection?. arXiv."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1436","DOI":"10.1214\/009053606000000281","article-title":"High-dimensional graphs and variable selection with the lasso","volume":"34","author":"Meinshausen","year":"2006","journal-title":"Ann. Stat."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1348","DOI":"10.1198\/016214501753382273","article-title":"Variable selection via nonconcave penalized likelihood and its oracle properties","volume":"96","author":"Fan","year":"2001","journal-title":"J. Am. Stat. Assoc."},{"key":"ref_5","unstructured":"Fan, J., and Li, R. (2006). Statistical challenges with high dimensionality: Feature selection in knowledge discovery. arXiv."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1418","DOI":"10.1198\/016214506000000735","article-title":"The adaptive lasso and its oracle properties","volume":"101","author":"Zou","year":"2006","journal-title":"J. Am. Stat. Assoc."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"928","DOI":"10.1214\/009053604000000256","article-title":"Nonconcave penalized likelihood with a diverging number of parameters","volume":"32","author":"Fan","year":"2004","journal-title":"Ann. 
Stat."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"425","DOI":"10.1093\/biomet\/81.3.425","article-title":"Ideal spatial adaptation by wavelet shrinkage","volume":"81","author":"Donoho","year":"1994","journal-title":"Biometrika"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"373","DOI":"10.1080\/00401706.1995.10484371","article-title":"Better subset regression using the nonnegative garrote","volume":"37","author":"Breiman","year":"1995","journal-title":"Technometrics"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"143","DOI":"10.1111\/j.1467-9868.2007.00581.x","article-title":"On the non-negative garrotte estimator","volume":"69","author":"Yuan","year":"2007","journal-title":"J. R. Stat. Soc. Ser. B (Stat. Methodol.)"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"374","DOI":"10.1016\/j.csda.2006.12.019","article-title":"Relaxed lasso","volume":"52","author":"Meinshausen","year":"2007","journal-title":"Comput. Stat. Data Anal."},{"key":"ref_12","unstructured":"Hastie, T., Tibshirani, R., and Tibshirani, R.J. (2017). Extended comparisons of best subset selection, forward stepwise selection, and the lasso. arXiv."},{"key":"ref_13","unstructured":"Mentch, L., and Zhou, S. (2019). Randomization as regularization: A degrees of freedom explanation for random forest success. arXiv."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"643","DOI":"10.1007\/s10888-021-09495-6","article-title":"Estimating intergenerational income mobility on sub-optimal data: A machine learning approach","volume":"19","author":"Bloise","year":"2021","journal-title":"J. Econ. Inequal."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"693","DOI":"10.4236\/jamp.2017.53058","article-title":"The Analysis of Impact Factors of Foreign Investment Based on Relaxed Lasso","volume":"5","author":"He","year":"2017","journal-title":"J. Appl. Math. 
Phys."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"77","DOI":"10.1016\/j.jtbi.2018.12.010","article-title":"Feature selection and tumor classification for microarray data using relaxed Lasso and generalized multi-class support vector machine","volume":"463","author":"Kang","year":"2019","journal-title":"J. Theor. Biol."},{"key":"ref_17","unstructured":"Tay, J.K., Narasimhan, B., and Hastie, T. (2021). Elastic net regularization paths for all generalized linear models. arXiv."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"1356","DOI":"10.1214\/aos\/1015957397","article-title":"Asymptotics for lasso-type estimators","volume":"28","author":"Fu","year":"2000","journal-title":"Ann. Stat."},{"key":"ref_19","first-page":"2541","article-title":"On model selection consistency of Lasso","volume":"7","author":"Zhao","year":"2006","journal-title":"J. Mach. Learn. Res."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"407","DOI":"10.1214\/009053604000000067","article-title":"Least angle regression","volume":"32","author":"Efron","year":"2004","journal-title":"Ann. Stat."},{"key":"ref_21","unstructured":"Huang, J., Ma, S., and Zhang, C.H. (2008). Adaptive Lasso for sparse high-dimensional regression models. Stat. Sin., 1603\u20131618."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"83","DOI":"10.1007\/BF02985802","article-title":"The elements of statistical learning: Data mining, inference and prediction","volume":"27","author":"Franklin","year":"2005","journal-title":"Math. Intell."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"McCullagh, P., and Nelder, J.A. (2019). Generalized Linear Models, Routledge.","DOI":"10.1201\/9780203753736"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"849","DOI":"10.1111\/j.1467-9868.2008.00674.x","article-title":"Sure independence screening for ultrahigh dimensional feature space","volume":"70","author":"Fan","year":"2008","journal-title":"J. R. Stat. Soc. Ser. 
B (Stat. Methodol.)"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"1129","DOI":"10.1080\/01621459.2012.695654","article-title":"Feature screening via distance correlation learning","volume":"107","author":"Li","year":"2012","journal-title":"J. Am. Stat. Assoc."}],"container-title":["Symmetry"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-8994\/14\/7\/1422\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T23:47:53Z","timestamp":1760140073000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-8994\/14\/7\/1422"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,7,11]]},"references-count":25,"journal-issue":{"issue":"7","published-online":{"date-parts":[[2022,7]]}},"alternative-id":["sym14071422"],"URL":"https:\/\/doi.org\/10.3390\/sym14071422","relation":{},"ISSN":["2073-8994"],"issn-type":[{"value":"2073-8994","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,7,11]]}}}