{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,31]],"date-time":"2026-03-31T14:35:54Z","timestamp":1774967754655,"version":"3.50.1"},"reference-count":23,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2024,3,13]],"date-time":"2024-03-13T00:00:00Z","timestamp":1710288000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2024,3,13]],"date-time":"2024-03-13T00:00:00Z","timestamp":1710288000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Discov Computing"],"abstract":"<jats:title>Abstract<\/jats:title><jats:p>With the rapid growth of data in the digital era and the limited resources available, more efficient data compression techniques are needed for storing and transmitting data. Data compression can significantly reduce the storage space and transmission time required for a given dataset. Text compression in particular has received increasing attention for managing and processing data effectively, driven by the growing use of the internet, digital devices, and data transfer. Over the years, various algorithms have been used for text compression, such as Huffman coding, Lempel-Ziv-Welch (LZW) coding, and arithmetic coding. However, because these methods compress data character by character, they achieve a limited compression ratio, particularly for data storage applications where a considerable amount of data must be compressed to use storage resources efficiently. Considering words or sequences of words rather than individual characters can yield a better compression ratio. 
Compressing individual characters yields a sizeable compressed representation because single characters exhibit little repetition and structure in the data. In this paper, we propose the ArthNgram model, in which an N-gram language model coupled with arithmetic coding compresses data more efficiently for data storage applications. The performance of the proposed model is evaluated in terms of compression ratio and compression speed. Results show that the proposed model outperforms traditional techniques.<\/jats:p>","DOI":"10.1007\/s10791-024-09431-y","type":"journal-article","created":{"date-parts":[[2024,3,13]],"date-time":"2024-03-13T11:16:29Z","timestamp":1710328589000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":6,"title":["Arithmetic N-gram: an efficient data compression technique"],"prefix":"10.1007","volume":"27","author":[{"given":"Ali","family":"Hassan","sequence":"first","affiliation":[]},{"given":"Sadaf","family":"Javed","sequence":"additional","affiliation":[]},{"given":"Sajjad","family":"Hussain","sequence":"additional","affiliation":[]},{"given":"Rizwan","family":"Ahmad","sequence":"additional","affiliation":[]},{"given":"Shams","family":"Qazi","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2024,3,13]]},"reference":[{"key":"9431_CR1","unstructured":"Statista. iot-connected-devices-worldwide. https:\/\/www.statista.com\/statistics\/1183457\/iot-connected-devices-worldwide\/. Accessed 16 Feb 2023."},{"issue":"2","key":"9431_CR2","first-page":"119","volume":"33","author":"U Jayasankar","year":"2021","unstructured":"Jayasankar U, Thirumal V, Ponnurangam D. A survey on data compression techniques: from the perspective of data quality, coding schemes, data type and applications. J King Saud Univ Comput Inform Sci. 
2021;33(2):119\u201340.","journal-title":"J King Saud Univ Comput Inform Sci."},{"key":"9431_CR3","doi-asserted-by":"crossref","unstructured":"Gupta A, Nigam S. A review on different types of lossless data compression techniques 2021.","DOI":"10.32628\/CSEIT217113"},{"key":"9431_CR4","doi-asserted-by":"publisher","first-page":"44","DOI":"10.1016\/j.neucom.2018.02.094","volume":"300","author":"AJ Hussain","year":"2018","unstructured":"Hussain AJ, Al-Fayadh A, Radi N. Image compression techniques: a survey in lossless and lossy algorithms. Neurocomputing. 2018;300:44\u201369.","journal-title":"Neurocomputing"},{"key":"9431_CR5","unstructured":"Zhu X, Li J, Liu Y, Ma C, Wang W. A survey on model compression for large language models. arXiv preprint arXiv:2308.07633 2023."},{"issue":"3","key":"9431_CR6","first-page":"68","volume":"1","author":"S Shanmugasundaram","year":"2011","unstructured":"Shanmugasundaram S, Lourdusamy R. A comparative study of text compression algorithms. Int J Wisdom Based Comput. 2011;1(3):68\u201376.","journal-title":"Int J Wisdom Based Comput."},{"key":"9431_CR7","doi-asserted-by":"crossref","unstructured":"Rahman MA, Hamada M, Rahman MA. In 2021 IEEE 14th international symposium on embedded multicore\/many-core systems-on-chip (MCSoC)2021;287\u2013291","DOI":"10.1109\/MCSoC51149.2021.00049"},{"issue":"7","key":"9431_CR8","doi-asserted-by":"publisher","first-page":"1059","DOI":"10.3390\/math8071059","volume":"8","author":"M Ignatoski","year":"2020","unstructured":"Ignatoski M, Lerga J, Stankovi\u0107 L, Dakovi\u0107 M. Comparison of entropy and dictionary based text compression in English, German, French, Italian, Czech, Hungarian, Finnish, and Croatian. Mathematics. 2020;8(7):1059.","journal-title":"Mathematics"},{"issue":"3","key":"9431_CR9","first-page":"402","volume":"100","author":"M Hameed","year":"2016","unstructured":"Hameed M, Khmag A, Zaman F, Ramli AR. 
A new lossless method of Huffman coding for text data compression and decompression process with fpga implementation. J Eng Appl Sci. 2016;100(3):402\u20137.","journal-title":"J Eng Appl Sci."},{"key":"9431_CR10","doi-asserted-by":"publisher","first-page":"104","DOI":"10.1016\/j.bspc.2022.104127","volume":"79","author":"S Banerjee","year":"2023","unstructured":"Banerjee S, Singh GK. A new real-time lossless data compression algorithm for ECG and PPG signals. Biomed Signal Process Contr. 2023;79:104\u201327.","journal-title":"Biomed Signal Process Contr."},{"issue":"20","key":"9431_CR11","doi-asserted-by":"publisher","first-page":"28509","DOI":"10.1007\/s11042-022-12846-8","volume":"81","author":"M Otair","year":"2022","unstructured":"Otair M, Abualigah L, Qawaqzeh MK. Improved near-lossless technique using the huffman coding for enhancing the quality of image compression. Multimed Tools Appl. 2022;81(20):28509\u201329.","journal-title":"Multimed Tools Appl."},{"key":"9431_CR12","doi-asserted-by":"crossref","unstructured":"Shrividhiya G, Srujana KS, Kashyap SN, Gururaj C. In 2021 international conference on emerging smart computing and informatics (ESCI) 2021;234\u2013237","DOI":"10.1109\/ESCI50559.2021.9396785"},{"issue":"3","key":"9431_CR13","doi-asserted-by":"publisher","first-page":"127","DOI":"10.1007\/s42044-019-00047-w","volume":"3","author":"A Habib","year":"2020","unstructured":"Habib A, Islam MJ, Rahman MS. A dictionary-based text compression technique using quaternary code. Iran J Comput Sci. 2020;3(3):127\u201336.","journal-title":"Iran J Comput Sci."},{"key":"9431_CR14","doi-asserted-by":"crossref","unstructured":"Nguyen VH, Nguyen HT, Duong HN, Snasel V. n-gram-based text compression. Computational intelligence and neuroscience 2016;2016","DOI":"10.1155\/2016\/9483646"},{"key":"9431_CR15","doi-asserted-by":"crossref","unstructured":"Mantoro T, Ayu MA, Anggraini Y. 
In 2017 International Conference on Computing, Engineering, and Design (ICCED) (IEEE), 2017;1\u20135","DOI":"10.1109\/CED.2017.8308127"},{"key":"9431_CR16","first-page":"8887","volume":"975","author":"FTA Aburomman","year":"2016","unstructured":"Aburomman FTA. Dynamic with dictionary technique for arabic text compression. Int J Comput Appl. 2016;975:8887.","journal-title":"Int J Comput Appl."},{"key":"9431_CR17","doi-asserted-by":"crossref","unstructured":"Chatterjee A, Shah RJ, Hasan KS. In: 2018 IEEE International conference on big data (big data) 2018;5137\u20135141","DOI":"10.1109\/BigData.2018.8622282"},{"issue":"4","key":"9431_CR18","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3487045","volume":"16","author":"M Gupta","year":"2022","unstructured":"Gupta M, Agrawal P. Compression of deep learning models for text: a survey. ACM Trans Knowl Discov Data (TKDD). 2022;16(4):1\u201355.","journal-title":"ACM Trans Knowl Discov Data (TKDD)"},{"issue":"2","key":"9431_CR19","doi-asserted-by":"publisher","first-page":"155","DOI":"10.25299\/itjrd.2023.10437","volume":"7","author":"MN Fauzan","year":"2023","unstructured":"Fauzan MN, Alif M, Prianto C. Comparison of Huffman algorithm and Lempel Ziv Welch algorithm in text file compression. IT J Res Develop. 2023;7(2):155\u201369.","journal-title":"IT J Res Develop."},{"key":"9431_CR20","unstructured":"Lin J, Tang J, Tang H, Yang S, Dang X, Han S. Awq: activation-aware weight quantization for llm compression and acceleration. arXiv preprint arXiv:2306.00978 2023"},{"key":"9431_CR21","unstructured":"Ma X, Fang G, Wang X. Llm-pruner: on the structural pruning of large language models. arXiv preprint arXiv:2305.11627 2023."},{"issue":"2","key":"9431_CR22","doi-asserted-by":"publisher","first-page":"135","DOI":"10.1147\/rd.282.0135","volume":"28","author":"GG Langdon","year":"1984","unstructured":"Langdon GG. An introduction to arithmetic coding. IBM J Res Develop. 
1984;28(2):135\u201349.","journal-title":"IBM J Res Develop"},{"key":"9431_CR23","doi-asserted-by":"publisher","first-page":"012007","DOI":"10.1088\/1742-6596\/1228\/1\/012007","volume":"1228","author":"HD Kotha","year":"2019","unstructured":"Kotha HD, Tummanapally M, Upadhyay VK. In J Phys Conf Ser. 2019;1228:012007.","journal-title":"In J Phys Conf Ser."}],"container-title":["Discover Computing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10791-024-09431-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s10791-024-09431-y\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10791-024-09431-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,11,14]],"date-time":"2024-11-14T06:41:33Z","timestamp":1731566493000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s10791-024-09431-y"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,3,13]]},"references-count":23,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2024,12]]}},"alternative-id":["9431"],"URL":"https:\/\/doi.org\/10.1007\/s10791-024-09431-y","relation":{},"ISSN":["2948-2992"],"issn-type":[{"value":"2948-2992","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,3,13]]},"assertion":[{"value":"3 April 2023","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"5 February 2024","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"13 March 2024","order":3,"name":"first_online","label":"First 
Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"Not applicable.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval and consent to participate"}},{"value":"The authors have no conflicts of interest to declare relevant to the content of this work.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"1"}}