{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,5]],"date-time":"2026-02-05T07:56:57Z","timestamp":1770278217347,"version":"3.49.0"},"reference-count":20,"publisher":"SAGE Publications","issue":"1","license":[{"start":{"date-parts":[[2018,7,28]],"date-time":"2018-07-28T00:00:00Z","timestamp":1532736000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"published-print":{"date-parts":[[2019,2,16]]},"abstract":"<jats:p>The choice of attribute significance is the most important step of an attribute reduction algorithm, and information entropy is one method of calculating the importance of attributes. Because the information view only takes the size of knowledge granularity into account and does not measure the importance of attributes objectively and comprehensively, this paper begins by putting forward the definition of approximate boundary accuracy based on the algebra view. Afterwards, this paper proposes the two concepts of relative information entropy and enhanced information entropy, based on the definition of relative fuzzy entropy, which has an obvious magnification effect. Then, two new attribute reduction methods are proposed by incorporating approximate boundary precision into relative information entropy and enhanced information entropy, so that the choice of attribute importance is more objective and comprehensive. Finally, the classification accuracy of each algorithm is analyzed and compared using an SVM classifier with ten-fold cross-validation, and the influence of outliers on the performance of the algorithms is analyzed. 
Through experimental analysis and comparison, it can be concluded that the attribute reduction based on improved entropy is feasible and effective.<\/jats:p>","DOI":"10.3233\/jifs-171989","type":"journal-article","created":{"date-parts":[[2018,7,31]],"date-time":"2018-07-31T18:14:03Z","timestamp":1533060843000},"page":"709-718","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":10,"title":["Attribute reduction based on improved information entropy"],"prefix":"10.1177","volume":"36","author":[{"given":"Baohua","family":"Liang","sequence":"first","affiliation":[{"name":"Institute of Information Engineering, Chaohu University, Hefei, Anhui, China"}]},{"given":"Lin","family":"Wang","sequence":"additional","affiliation":[{"name":"Institute of Foreign Language, Chaohu University, Hefei, Anhui, China"}]},{"given":"Yong","family":"Liu","sequence":"additional","affiliation":[{"name":"Institute of Information Engineering, Chaohu University, Hefei, Anhui, China"}]}],"member":"179","published-online":{"date-parts":[[2018,7,28]]},"reference":[{"key":"e_1_3_2_2_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.ins.2006.06.006"},{"key":"e_1_3_2_3_2","doi-asserted-by":"publisher","DOI":"10.1007\/BF01001956"},{"issue":"11","key":"e_1_3_2_4_2","first-page":"976","article-title":"Rules acquisition and attribute reduction of ordered formal decision contexts","volume":"29","author":"Zhang J.","year":"2016","unstructured":"J.Zhang and L.Wei, Rules acquisition and attribute reduction of ordered formal decision contexts, PR AI 29(11) (2016), 976\u2013984.","journal-title":"PR AI"},{"key":"e_1_3_2_5_2","doi-asserted-by":"publisher","DOI":"10.1007\/s10479-013-1352-1"},{"key":"e_1_3_2_6_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNB.2011.2168535"},{"issue":"4","key":"e_1_3_2_7_2","first-page":"1101","article-title":"Research of aircraft generator fault diagnostic decision based on attribute reduction in variable 
precision rough sets","volume":"34","author":"Tao J.","year":"2017","unstructured":"J.Tao, W.Jialin, S.Xudong and C.Houhe, Research of aircraft generator fault diagnostic decision based on attribute reduction in variable precision rough sets, Appl Res Comput 34(4) (2017), 1101\u20131104.","journal-title":"Appl Res Comput"},{"issue":"2","key":"e_1_3_2_8_2","first-page":"423","article-title":"Coal mine safety risk prediction by RS-SVM combined model","volume":"46","author":"Ying W.","year":"2017","unstructured":"W.Ying and J.Gaopen, Coal mine safety risk prediction by RS-SVM combined model, J China Univ Mining Technol 46(2) (2017), 423\u2013429.","journal-title":"J China Univ Mining Technol"},{"issue":"9","key":"e_1_3_2_9_2","first-page":"795","article-title":"Fast attribute reduction algorithm based on row storage","volume":"8","author":"Baohua L.","year":"2015","unstructured":"L.Baohua and W.Shiyi, Fast attribute reduction algorithm based on row storage, Pattern Recogn Artif Intell 8(9) (2015), 795\u2013801.","journal-title":"Pattern Recogn Artif Intell"},{"issue":"1","key":"e_1_3_2_10_2","first-page":"107","article-title":"Attribute reduction method based on enhanced positive region","volume":"34","author":"Bowen S.","year":"2017","unstructured":"S.Bowen, L.Guohe, W.Weijiang, et al., Attribute reduction method based on enhanced positive region, Appl Res Comput 34(1) (2017), 107\u2013109.","journal-title":"Appl Res Comput"},{"issue":"8","key":"e_1_3_2_11_2","first-page":"183","article-title":"Incremental algorithm for attribute reduction based on positive region and discernibility element","volume":"42","author":"Taotao L.","year":"2016","unstructured":"L.Taotao, M.Fumin and Z.Tengfei, Incremental algorithm for attribute reduction based on positive region and discernibility element, Comput. Eng 42(8) (2016), 183\u2013187, 193.","journal-title":"Comput. 
Eng"},{"issue":"6","key":"e_1_3_2_12_2","first-page":"251","article-title":"Incremental updating algorithm for attribute reduction based on improved discernibility matrix","volume":"42","author":"Hao L.","year":"2015","unstructured":"L.Hao and X.Chao, Incremental updating algorithm for attribute reduction based on improved discernibility matrix, Comput Sci 42(6) (2015), 251\u2013255.","journal-title":"Comput Sci"},{"issue":"8","key":"e_1_3_2_13_2","first-page":"1531","article-title":"Attribute reduction with rough set based on discernibility information tree","volume":"30","author":"Yu J.","year":"2015","unstructured":"J.Yu, Attribute reduction with rough set based on discernibility information tree, Control Dec 30(8) (2015), 1531\u20131536.","journal-title":"Control Dec"},{"issue":"3","key":"e_1_3_2_14_2","doi-asserted-by":"crossref","first-page":"373","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"The mathematical theory of communication","volume":"27","author":"Shannon C.E.","year":"1948","unstructured":"C.E.Shannon, The mathematical theory of communication, Bell Syst Tech J 27(3-4) (1948), 373-423.","journal-title":"Bell Syst Tech J"},{"issue":"7","key":"e_1_3_2_15_2","first-page":"759","article-title":"Decision table reduction based on conditional information entropy","volume":"25","author":"Guoying W.","year":"2002","unstructured":"W.Guoying, Y.Hong and Y.Dachun, Decision table reduction based on conditional information entropy, Chin J Comput 25(7) (2002), 759\u2013776.","journal-title":"Chin J Comput"},{"issue":"11","key":"e_1_3_2_16_2","first-page":"2156","article-title":"Approximate reduction based on conditional information entropy in decision tables","volume":"35","author":"Ming Y.","year":"2007","unstructured":"Y.Ming, Approximate reduction based on conditional information entropy in decision tables, Acta Electron Sin 35(11) (2007), 2156\u20132160.","journal-title":"Acta Electron 
Sin"},{"issue":"4","key":"e_1_3_2_17_2","first-page":"359","article-title":"New attribute reduction on information entropy","volume":"7","author":"Qinghua Z.","year":"2013","unstructured":"Z.Qinghua and X.Yu, New attribute reduction on information entropy, J Front Comput Sci Technol 7(4) (2013), 359\u2013367.","journal-title":"J Front Comput Sci Technol"},{"issue":"1","key":"e_1_3_2_18_2","first-page":"65","article-title":"Attribute reduction based on approximation decision entropy","volume":"30","author":"Feng J.","year":"2015","unstructured":"J.Feng, W.Shasha, D.Junwei, et al., Attribute reduction based on approximation decision entropy, Control Decision 30(1) (2015), 65\u201370.","journal-title":"Control Decision"},{"key":"e_1_3_2_19_2","doi-asserted-by":"publisher","DOI":"10.1007\/s00521-013-1368-0"},{"issue":"9","key":"e_1_3_2_20_2","first-page":"597","article-title":"Positive approximation: An accelerator for attribute reduction in rough set theory","volume":"174","author":"Yuhua Q.","year":"2010","unstructured":"Q.Yuhua, L.Jiye, W.Pedrycz, et al., Positive approximation: An accelerator for attribute reduction in rough set theory, Artif Intell 174(9) (2010), 597\u2013618.","journal-title":"Artif Intell"},{"issue":"2","key":"e_1_3_2_21_2","first-page":"207","article-title":"Fast algorithm for attribute reduction based on bucket sort","volume":"26","author":"Yu J.","year":"2011","unstructured":"J.Yu, L.Yintian and L.Chao, Fast algorithm for attribute reduction based on bucket sort, Control Decision 26(2) (2011), 207\u2013212.","journal-title":"Control Decision"}],"container-title":["Journal of Intelligent &amp; Fuzzy 
Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-171989","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.3233\/JIFS-171989","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-171989","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,4]],"date-time":"2026-02-04T18:20:55Z","timestamp":1770229255000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.3233\/JIFS-171989"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,7,28]]},"references-count":20,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2019,2,16]]}},"alternative-id":["10.3233\/JIFS-171989"],"URL":"https:\/\/doi.org\/10.3233\/jifs-171989","relation":{},"ISSN":["1064-1246","1875-8967"],"issn-type":[{"value":"1064-1246","type":"print"},{"value":"1875-8967","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,7,28]]}}}