{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2023,9,13]],"date-time":"2023-09-13T18:13:08Z","timestamp":1694628788316},"reference-count":15,"publisher":"Wiley","issue":"6","license":[{"start":{"date-parts":[[2013,12,11]],"date-time":"2013-12-11T00:00:00Z","timestamp":1386720000000},"content-version":"vor","delay-in-days":10,"URL":"http:\/\/onlinelibrary.wiley.com\/termsAndConditions#vor"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Statistical Analysis"],"published-print":{"date-parts":[[2013,12]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>High dimensional data are nowadays encountered in various branches of science. Variable selection techniques play a key role in analyzing high dimensional data. Generally two approaches for variable selection in the high dimensional data setting are considered\u2014forward selection methods and penalization methods. In the former, variables are introduced in the model one at a time depending on their ability to explain variation and the procedure is terminated at some stage following some stopping rule. In penalization techniques such as the least absolute shrinkage and selection operator (LASSO), an optimization procedure is carried out with an added carefully chosen penalty function, so that the solutions have a sparse structure. Recently, the idea of penalized forward selection has been introduced. The motivation comes from the fact that the penalization techniques like the LASSO give rise to closed form expressions when used in one dimension, just like the least squares estimator. Hence one can repeat such a procedure in a forward selection setting until it converges. The resulting procedure selects sparser models than comparable methods without compromising on predictive power. However, when the regressor is high dimensional, it is typical that many predictors are highly correlated. 
We show that in such situations, it is possible to improve stability and computational efficiency of the procedure further by introducing an orthogonalization step. At each selection step, variables potentially available to be selected in the model are screened on the basis of their correlation with variables already in the model, thus preventing unnecessary duplication. The new strategy, called the Selection Technique in Orthogonalized Regression Models (STORM), turns out to be extremely successful in reducing the model dimension further and also leads to improved predictive power. We also consider an aggressive version of STORM, where a potential predictor will be permanently removed from further consideration if its regression coefficient is estimated as zero at any stage. We shall carry out a detailed simulation study to compare the newly proposed method with existing ones and analyze a real dataset. \u00a9 2013 Wiley Periodicals, Inc. Statistical Analysis and Data Mining, 2013<\/jats:p>","DOI":"10.1002\/sam.11212","type":"journal-article","created":{"date-parts":[[2013,12,11]],"date-time":"2013-12-11T18:08:09Z","timestamp":1386785289000},"page":"557-564","source":"Crossref","is-referenced-by-count":3,"title":["Iterative selection using orthogonal regression techniques"],"prefix":"10.1002","volume":"6","author":[{"given":"Bradley","family":"Turnbull","sequence":"first","affiliation":[]},{"given":"Subhashis","family":"Ghosal","sequence":"additional","affiliation":[]},{"given":"Hao Helen","family":"Zhang","sequence":"additional","affiliation":[]}],"member":"311","published-online":{"date-parts":[[2013,12,11]]},"reference":[{"key":"e_1_2_6_2_2","volume-title":"Applied Regression Analysis","author":"Rawlings 
J.","year":"2001"},{"key":"e_1_2_6_3_2","doi-asserted-by":"publisher","DOI":"10.1214\/009053606000000092"},{"key":"e_1_2_6_4_2","doi-asserted-by":"publisher","DOI":"10.1198\/jasa.2008.tm08516"},{"key":"e_1_2_6_5_2","doi-asserted-by":"publisher","DOI":"10.1080\/00401706.1970.10488634"},{"key":"e_1_2_6_6_2","doi-asserted-by":"publisher","DOI":"10.1080\/00401706.1995.10484371"},{"key":"e_1_2_6_7_2","first-page":"267","article-title":"Regression shrinkage and selection via the Lasso","volume":"58","author":"Tibshirani R.","year":"1996","journal-title":"J R Stat Soc"},{"key":"e_1_2_6_8_2","doi-asserted-by":"publisher","DOI":"10.1198\/016214501753382273"},{"key":"e_1_2_6_9_2","doi-asserted-by":"publisher","DOI":"10.1198\/016214506000000735"},{"key":"e_1_2_6_10_2","doi-asserted-by":"crossref","first-page":"301","DOI":"10.1111\/j.1467-9868.2005.00503.x","article-title":"Regularization and variable selection via the elastic net","volume":"67","author":"Zou H.","year":"2005","journal-title":"J R Stat Soc"},{"key":"e_1_2_6_11_2","doi-asserted-by":"publisher","DOI":"10.1080\/01621459.1993.10476353"},{"key":"e_1_2_6_12_2","doi-asserted-by":"publisher","DOI":"10.1214\/009053604000001147"},{"key":"e_1_2_6_13_2","doi-asserted-by":"publisher","DOI":"10.1198\/016214507000000121"},{"key":"e_1_2_6_14_2","doi-asserted-by":"publisher","DOI":"10.4310\/SII.2009.v2.n3.a7"},{"key":"e_1_2_6_15_2","first-page":"407","article-title":"Least angle regression","volume":"32","author":"Efron B.","year":"2004","journal-title":"Ann Stat"},{"key":"e_1_2_6_16_2","first-page":"1603","article-title":"Adaptive Lasso for sparse high\u2010dimensional regression models","volume":"18","author":"Huang J.","year":"2009","journal-title":"Stat Sin"}],"container-title":["Statistical Analysis and Data Mining: The ASA Data Science 
Journal"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/api.wiley.com\/onlinelibrary\/tdm\/v1\/articles\/10.1002%2Fsam.11212","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/api.wiley.com\/onlinelibrary\/tdm\/v1\/articles\/10.1002%2Fsam.11212","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/onlinelibrary.wiley.com\/doi\/pdf\/10.1002\/sam.11212","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,9,12]],"date-time":"2023-09-12T05:34:26Z","timestamp":1694496866000},"score":1,"resource":{"primary":{"URL":"https:\/\/onlinelibrary.wiley.com\/doi\/10.1002\/sam.11212"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2013,12]]},"references-count":15,"journal-issue":{"issue":"6","published-print":{"date-parts":[[2013,12]]}},"alternative-id":["10.1002\/sam.11212"],"URL":"http:\/\/dx.doi.org\/10.1002\/sam.11212","archive":["Portico"],"relation":{},"ISSN":["1932-1864","1932-1872"],"issn-type":[{"value":"1932-1864","type":"print"},{"value":"1932-1872","type":"electronic"}],"subject":["Computer Science Applications","Information Systems","Analysis"],"published":{"date-parts":[[2013,12]]}}}