{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T04:21:43Z","timestamp":1760242903426,"version":"build-2065373602"},"reference-count":50,"publisher":"MDPI AG","issue":"11","license":[{"start":{"date-parts":[[2016,11,17]],"date-time":"2016-11-17T00:00:00Z","timestamp":1479340800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100002428","name":"FWF","doi-asserted-by":"publisher","award":["J 3765"],"award-info":[{"award-number":["J 3765"]}],"id":[{"id":"10.13039\/501100002428","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>The information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input has a continuous distribution. For systems with infinite information loss, a relative measure is defined and shown to be related to R\u00e9nyi information dimension. As deterministic signal processing can only destroy information, it is important to know how this information loss affects the solution of inverse problems. Hence, we connect the probability of perfectly reconstructing the input to the information lost in the system via Fano-type bounds. The theoretical results are illustrated by example systems commonly used in discrete-time, nonlinear signal processing and communications.<\/jats:p>","DOI":"10.3390\/e18110410","type":"journal-article","created":{"date-parts":[[2016,11,17]],"date-time":"2016-11-17T10:11:30Z","timestamp":1479377490000},"page":"410","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["Information-Theoretic Analysis of Memoryless Deterministic Systems"],"prefix":"10.3390","volume":"18","author":[{"given":"Bernhard","family":"Geiger","sequence":"first","affiliation":[{"name":"Institute for Communications Engineering, Technical University of Munich, Munich 80290, Germany"}]},{"given":"Gernot","family":"Kubin","sequence":"additional","affiliation":[{"name":"Signal Processing and Speech Communication Laboratory, Graz University of Technology, Graz 8010, Austria"}]}],"member":"1968","published-online":{"date-parts":[[2016,11,17]]},"reference":[{"key":"ref_1","unstructured":"Oppenheim, A.V., and Schafer, R.W. (2010). Discrete-Time Signal Processing, Pearson. [3rd ed.]."},{"key":"ref_2","unstructured":"Khalil, H.K. (2000). Nonlinear Systems, Pearson. [3rd ed.]."},{"key":"ref_3","unstructured":"Manolakis, D.G., Ingle, V.K., and Kogon, S.M. (2005). Statistical and Adaptive Signal Processing, Artech House."},{"key":"ref_4","unstructured":"Papoulis, A., and Pillai, U.S. (2002). Probability, Random Variables and Stochastic Processes, McGraw Hill. [4th ed.]."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Gray, R.M. (1990). Entropy and Information Theory, Springer.","DOI":"10.1007\/978-1-4757-3982-4"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1261","DOI":"10.1109\/TIT.2005.844072","article-title":"Mutual Information and Minimum Mean-Square Error in Gaussian Channels","volume":"51","author":"Guo","year":"2005","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"2909","DOI":"10.1109\/78.726805","article-title":"Tight Upper Bound on the Gain of Linear and Nonlinear Predictors","volume":"46","author":"Bernhard","year":"1998","journal-title":"IEEE Trans. Signal Process."},{"key":"ref_8","unstructured":"Geiger, B.C., and Kubin, G. (March, January 29). On the Information Loss in Memoryless Systems: The Multivariate Case. Proceedings of the International Zurich Seminar on Communications (IZS), Zurich, Switzerland."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Geiger, B.C., and Kubin, G. (2012, January 3\u20138). Relative Information Loss in the PCA. Proceedings of the IEEE Information Theory Workshop (ITW), Lausanne, Switzerland.","DOI":"10.1109\/ITW.2012.6404738"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Geiger, B.C., Feldbauer, C., and Kubin, G. (2011, January 6\u20139). Information Loss in Static Nonlinearities. Proceedings of the IEEE International Symposium on Wireless Communication Systems (ISWSC), Aachen, Germany.","DOI":"10.1109\/ISWCS.2011.6125272"},{"key":"ref_11","unstructured":"Geiger, B.C. (2014). Information Loss in Deterministic Systems. [Ph.D. Thesis, Graz University of Technology]."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"684","DOI":"10.1109\/TIT.2004.840866","article-title":"The Average Amount of Information Lost in Multiplication","volume":"51","author":"Pippenger","year":"2005","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"248","DOI":"10.1016\/S0019-9958(60)90863-9","article-title":"Loss and Recovery of Information by Coarse Observation of Stochastic Chain","volume":"3","author":"Watanabe","year":"1960","journal-title":"Inf. Control"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Geiger, B.C., and Kubin, G. (2011, January 6\u20139). Some Results on the Information Loss in Dynamical Systems. Proceedings of the IEEE International Symposium on Wireless Communication Systems (ISWSC), Aachen, Germany.","DOI":"10.1109\/ISWCS.2011.6125271"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A Mathematical Theory of Communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"1326","DOI":"10.1016\/j.sigpro.2006.11.005","article-title":"Toward a Theory of Information Processing","volume":"87","author":"Johnson","year":"2007","journal-title":"Signal Process."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Kotz, S., and Johnson, N.L. (1992). Breakthroughs in Statistics: Foundations and Basic Theory, Springer.","DOI":"10.1007\/978-1-4612-0919-5"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"716","DOI":"10.1109\/TAC.1974.1100705","article-title":"A New Look at the Statistical Model Identification","volume":"19","author":"Akaike","year":"1974","journal-title":"IEEE Trans. Autom. Control"},{"key":"ref_19","unstructured":"Tishby, N., Pereira, F.C., and Bialek, W. (1999, January 22\u201324). The Information Bottleneck Method. Proceedings of the Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA."},{"key":"ref_20","unstructured":"Wohlmayr, M., Markaki, M., and Stylianou, Y. (2007, January 3\u20137). Speech-Nonspeech Discrimination based on Speech-Relevant Spectrogram Modulations. Proceedings of the European Signal Processing Conference (EUSIPCO), Poznan, Poland."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Yella, S., and Bourlard, H. (2014, January 4\u20139). Information Bottleneck Based Speaker Diarization of Meetings Using Non-Speech as Side Information. Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Florence, Italy.","DOI":"10.1109\/ICASSP.2014.6853565"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"2511","DOI":"10.1109\/TCOMM.2012.071312.110682","article-title":"Low-Precision A\/D Conversion for Maximum Information Rate in Channels with Memory","volume":"60","author":"Zeitler","year":"2012","journal-title":"IEEE Trans. Commun."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"1780","DOI":"10.1109\/TSP.2002.1011217","article-title":"An Error-Entropy Minimization Algorithm for Supervised Training of Nonlinear Adaptive Systems","volume":"50","author":"Erdogmus","year":"2002","journal-title":"IEEE Trans. Signal Process."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"5672","DOI":"10.1109\/TSP.2012.2210889","article-title":"Complex-Valued Linear and Widely Linear Filtering Using MSE and Gaussian Entropy","volume":"60","author":"Li","year":"2012","journal-title":"IEEE Trans. Signal Process."},{"key":"ref_25","unstructured":"Cover, T.M., and Thomas, J.A. (2006). Elements of Information Theory, Wiley Interscience. [2nd ed.]."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Polyanskiy, Y., and Wu, Y. (2016). Strong Data-Processing Inequalities for Channels and Bayesian Networks.","DOI":"10.1007\/978-1-4939-7005-6_7"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"1945","DOI":"10.3390\/e13111945","article-title":"A Characterization of Entropy in Terms of Information Loss","volume":"13","author":"Baez","year":"2011","journal-title":"Entropy"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"653","DOI":"10.1109\/TIT.2009.2037047","article-title":"Information Theory and Neural Information Processing","volume":"56","author":"Johnson","year":"2010","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_29","unstructured":"Geiger, B.C., and Kubin, G. (2014, January 26\u201328). Information Loss and Anti-Aliasing Filters in Multirate Systems. Proceedings of the International Zurich Seminar on Communications (IZS), Zurich, Switzerland."},{"key":"ref_30","unstructured":"Geiger, B.C., and Kubin, G. (2013, January 21\u201324). Signal Enhancement as Minimization of Relevant Information Loss. Proceedings of the ITG Conference on Systems, Communication and Coding (SCC), Munich, Germany."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Deco, G., and Obradovic, D. (1996). An Information-Theoretic Approach to Neural Computing, Springer.","DOI":"10.1007\/978-1-4612-4016-7"},{"key":"ref_32","unstructured":"Pinsker, M.S. (1964). Information and Information Stability of Random Variables and Processes, Holden Day."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"193","DOI":"10.1007\/BF02063299","article-title":"On the Dimension and Entropy of Probability Distributions","volume":"10","year":"1959","journal-title":"Acta Math. Hung."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"3721","DOI":"10.1109\/TIT.2010.2050803","article-title":"R\u00e9nyi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression","volume":"56","author":"Wu","year":"2010","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"2719","DOI":"10.1109\/TIT.2011.2181820","article-title":"Entropy of the Mixture of Sources and Entropy Dimension","volume":"58","author":"Tabor","year":"2012","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"4857","DOI":"10.1109\/TIT.2011.2158905","article-title":"MMSE Dimension","volume":"57","author":"Wu","year":"2011","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"145","DOI":"10.1016\/0375-9601(90)90690-P","article-title":"On the Information Flow for One-Dimensional Maps","volume":"144","author":"Wiegerinck","year":"1990","journal-title":"Phys. Lett. A"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1007\/BF02175553","article-title":"Positivity of Entropy Production in Nonequilibrium Statistical Mechanics","volume":"85","author":"Ruelle","year":"1996","journal-title":"J. Stat. Phys."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"47","DOI":"10.1109\/MCOM.2010.5394030","article-title":"An Overview of the IEEE 802.15.4a Standard","volume":"48","author":"Karapistoli","year":"2010","journal-title":"IEEE Commun. Mag."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"5362","DOI":"10.1109\/TIT.2009.2032707","article-title":"On the Discontinuity of the Shannon Information Measures","volume":"55","author":"Ho","year":"2009","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"259","DOI":"10.1109\/18.272494","article-title":"Relations Between Entropy and Error Probability","volume":"40","author":"Feder","year":"1994","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Yeh, J. (2000). Lectures on Real Analysis, World Scientific Publishing.","DOI":"10.1142\/4156"},{"key":"ref_43","unstructured":"Wu, Y. (2011). Shannon Theory for Compressed Sensing. [Ph.D. Thesis, Princeton University]."},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Vary, P., and Martin, R. (2006). Digital Speech Transmission: Enhancement, Coding and Error Concealment, John Wiley & Sons.","DOI":"10.1002\/0470031743"},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"120","DOI":"10.1109\/TIT.1978.1055832","article-title":"On the Entropy of Continuous Probability Distributions","volume":"24","author":"Rathie","year":"1978","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_46","unstructured":"Abramowitz, M., and Stegun, I.A. (1972). Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, Dover. [9th ed.]."},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"105","DOI":"10.1109\/2.36","article-title":"Self-Organization in a Perceptual Network","volume":"21","author":"Linsker","year":"1988","journal-title":"IEEE Comput."},{"key":"ref_48","unstructured":"Plumbley, M. (1991). On Information Theory and Unsupervised Neural Networks, University of Cambridge."},{"key":"ref_49","unstructured":"Rao, R.N. (2013). When Are the Most Informative Components for Inference Also the Principal Components?."},{"key":"ref_50","doi-asserted-by":"crossref","unstructured":"Geiger, B.C., and Kubin, G. (2013). On the Rate of Information Loss in Memoryless Systems.","DOI":"10.1109\/ITW.2012.6404738"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/18\/11\/410\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T19:35:50Z","timestamp":1760211350000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/18\/11\/410"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2016,11,17]]},"references-count":50,"journal-issue":{"issue":"11","published-online":{"date-parts":[[2016,11]]}},"alternative-id":["e18110410"],"URL":"https:\/\/doi.org\/10.3390\/e18110410","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2016,11,17]]}}}