{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,5,28]],"date-time":"2024-05-28T20:07:46Z","timestamp":1716926866278},"reference-count":16,"publisher":"World Scientific Pub Co Pte Lt","issue":"01","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Int. J. Inf. Acquisition"],"published-print":{"date-parts":[[2004,3]]},"abstract":"<jats:p> The geometric topology of a point per event written in the higher dimensional \u03bc-space of data (e.g. 6 W's: who, where, when, what, how, and why) can help in the design of information acquisition (IA) systems. Measurement intensity of each W's sensor, or the number of words used to describe a specific W's attribute, represents the length of each vector dimension. Then, N concurrent reports of the same event become a distribution set of N points scattered all over \u03bc-space. To discover the statistically independent components, an unsupervised, unbiased Artificial Neural Network (ANN) methodology called Independent Component Analysis (ICA) can be used to reveal a new subspace called the feature space. The major and minor axes of the subspace correspond to highly precise and efficient combinations of old attributes (e.g. 2-D feature domains consisting of \"where-who-when\" and \"what-how-why\" could be good choices for Internet search indices). Thus, one realizes that the communication of an event is not just the address-where; who and when are equally important attributes. In principle, the number of new sensors can be reduced (e.g. from 6 W's to 2 features), provided that they are physically realizable. In the combined space of 6N-dimensional \u0393-space, one point can represent all N concurrent measurements; the flow of these generates the event behavior in time. The time flow over the reduced 2N feature space generates invariant features called knowledge. <\/jats:p><jats:p> For surveillance against terrorists, legacy electrical power line communication (PLC) will offer a useful relay for the last mile of mobile communications for a Surveillance Sensor Web (SSW) employing ANN: there is no need for \"where\" addressing for switching because of smart coding and decoding of \"who-when.\" After reviewing Auto-Regression (AR), we generalize AR to a supervised ANN implementation of Principal Component Analysis (PCA) (Appendix A), and then to an unsupervised learning ANN for ICA (Appendix B). This is possible non-statistically because the classical-closed information theory (CIT) of the maximum Shannon entropy S of a closed system must be generalized for open brain information theory (BIT) having non-zero energy exchange E at the minimum Helmholtz free energy H = E - T <jats:sub> o <\/jats:sub> S at isothermal equilibrium ( T <jats:sub> o <\/jats:sub>=37\u00b0 C ). For such an open BIT system, we prove the Lyapunov convergence theorem. We compute the ICA features of image textures in order to measure the ICA classifier information content. <\/jats:p>","DOI":"10.1142\/s0219878904000082","type":"journal-article","created":{"date-parts":[[2004,4,30]],"date-time":"2004-04-30T10:44:06Z","timestamp":1083321846000},"page":"1-22","source":"Crossref","is-referenced-by-count":7,"title":["GEOMETRIC TOPOLOGY FOR INFORMATION ACQUISITION BY MEANS OF SMART SENSOR WEB"],"prefix":"10.1142","volume":"01","author":[{"given":"HAROLD","family":"SZU","sequence":"first","affiliation":[{"name":"George Washington University, Washington DC, USA"}]}],"member":"219","published-online":{"date-parts":[[2012,1,25]]},"reference":[{"key":"rf2","doi-asserted-by":"publisher","DOI":"10.1162\/neco.1995.7.6.1129"},{"key":"rf3","doi-asserted-by":"publisher","DOI":"10.1109\/83.242353"},{"key":"rf4","first-page":"6","volume":"1","author":"Chanyagorn P.","journal-title":"Neural Network Society Newsletters"},{"key":"rf7","first-page":"204","volume":"2","author":"Conners R. W.","journal-title":"IEEE Trans. Pattern Anal. and Machine Intell."},{"key":"rf9","doi-asserted-by":"publisher","DOI":"10.1109\/PROC.1979.11328"},{"key":"rf10","doi-asserted-by":"publisher","DOI":"10.1117\/12.59957"},{"key":"rf11","doi-asserted-by":"publisher","DOI":"10.1117\/12.59981"},{"key":"rf12","doi-asserted-by":"publisher","DOI":"10.1073\/pnas.1132164100"},{"key":"rf13","doi-asserted-by":"publisher","DOI":"10.1364\/AO.24.001426"},{"key":"rf15","volume-title":"Computational Intelligence","author":"Szu H.","year":"2003"},{"key":"rf17","doi-asserted-by":"publisher","DOI":"10.1016\/S0925-2312(02)00595-7"},{"key":"rf24","doi-asserted-by":"publisher","DOI":"10.1016\/S0893-6080(03)00132-1"},{"key":"rf26","doi-asserted-by":"publisher","DOI":"10.1117\/12.173205"},{"key":"rf27","doi-asserted-by":"publisher","DOI":"10.1016\/0893-6080(95)00051-8"},{"key":"rf29","doi-asserted-by":"publisher","DOI":"10.1016\/0736-5853(93)90026-Z"},{"key":"rf31","volume-title":"Statistical Mechanics","author":"Uhlenbeck G.","year":"1964"}],"container-title":["International Journal of Information Acquisition"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.worldscientific.com\/doi\/pdf\/10.1142\/S0219878904000082","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2019,8,6]],"date-time":"2019-08-06T22:45:55Z","timestamp":1565131555000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.worldscientific.com\/doi\/abs\/10.1142\/S0219878904000082"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2004,3]]},"references-count":16,"journal-issue":{"issue":"01","published-online":{"date-parts":[[2012,1,25]]},"published-print":{"date-parts":[[2004,3]]}},"alternative-id":["10.1142\/S0219878904000082"],"URL":"https:\/\/doi.org\/10.1142\/s0219878904000082","relation":{},"ISSN":["0219-8789","1793-6985"],"issn-type":[{"value":"0219-8789","type":"print"},{"value":"1793-6985","type":"electronic"}],"subject":[],"published":{"date-parts":[[2004,3]]}}}