{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,5]],"date-time":"2026-02-05T07:19:32Z","timestamp":1770275972548,"version":"3.49.0"},"publisher-location":"New York, NY, USA","reference-count":10,"publisher":"ACM","license":[{"start":{"date-parts":[[2023,10,21]],"date-time":"2023-10-21T00:00:00Z","timestamp":1697846400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2023,10,21]]},"DOI":"10.1145\/3583780.3615509","type":"proceedings-article","created":{"date-parts":[[2023,10,21]],"date-time":"2023-10-21T07:45:42Z","timestamp":1697874342000},"page":"5240-5241","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":1,"title":["Comparing Fine-Tuned Transformers and Large Language Models for Sales Call Classification: A Case Study"],"prefix":"10.1145","author":[{"ORCID":"https:\/\/orcid.org\/0009-0007-2140-0694","authenticated-orcid":false,"given":"Roy","family":"Eisenstadt","sequence":"first","affiliation":[{"name":"Microsoft Dynamics 365 Sales, Tel Aviv, Israel"}]},{"ORCID":"https:\/\/orcid.org\/0009-0006-2985-4554","authenticated-orcid":false,"given":"Abedelkader","family":"Asi","sequence":"additional","affiliation":[{"name":"Microsoft Dynamics 365 Sales, Sammamish, WA, USA"}]},{"ORCID":"https:\/\/orcid.org\/0009-0008-4993-6623","authenticated-orcid":false,"given":"Royi","family":"Ronen","sequence":"additional","affiliation":[{"name":"Microsoft Dynamics 365 Sales, Tel Aviv, Israel"}]}],"member":"320","published-online":{"date-parts":[[2023,10,21]]},"reference":[{"key":"e_1_3_2_1_1_1","unstructured":"Iz Beltagy, Matthew E. Peters, and Arman Cohan. 2020. Longformer: the long document transformer. ArXiv abs\/2004.05150."},{"key":"e_1_3_2_1_2_1","volume-title":"Brown et al","author":"Tom","year":"2020","unstructured":"Tom B. Brown et al. 2020. Language models are few-shot learners. ArXiv abs\/2005.14165."},{"key":"e_1_3_2_1_3_1","unstructured":"Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. Bert: pre-training of deep bidirectional transformers for language understanding. ArXiv abs\/1810.04805."},{"key":"e_1_3_2_1_4_1","doi-asserted-by":"publisher","DOI":"10.1109\/TBDATA.2019.2921572"},{"key":"e_1_3_2_1_5_1","unstructured":"Yinhan Liu et al. 2019. Roberta: a robustly optimized bert pretraining approach. ArXiv abs\/1907.11692."},{"key":"e_1_3_2_1_6_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/D19-1410"},{"key":"e_1_3_2_1_7_1","unstructured":"Victor Sanh, Lysandre Debut, Julien Chaumond, and Thomas Wolf. 2019. Distilbert, a distilled version of bert: smaller, faster, cheaper and lighter. ArXiv abs\/1910.01108."},{"key":"e_1_3_2_1_8_1","unstructured":"Ashish Vaswani, Noam M. Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. In NIPS."},{"key":"e_1_3_2_1_9_1","unstructured":"Thomas Wolf et al. 2019. Huggingface's transformers: state-of-the-art natural language processing. ArXiv abs\/1910.03771."},{"key":"e_1_3_2_1_10_1","unstructured":"Manzil Zaheer et al. 2020. Big bird: transformers for longer sequences. ArXiv abs\/2007.14062."}],"event":{"name":"CIKM '23: The 32nd ACM International Conference on Information and Knowledge Management","location":"Birmingham, United Kingdom","acronym":"CIKM '23","sponsor":["SIGWEB ACM Special Interest Group on Hypertext, Hypermedia, and Web","SIGIR ACM Special Interest Group on Information Retrieval"]},"container-title":["Proceedings of the 32nd ACM International Conference on Information and Knowledge Management"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3583780.3615509","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3583780.3615509","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T16:36:55Z","timestamp":1750178215000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3583780.3615509"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,10,21]]},"references-count":10,"alternative-id":["10.1145\/3583780.3615509","10.1145\/3583780"],"URL":"https:\/\/doi.org\/10.1145\/3583780.3615509","relation":{},"subject":[],"published":{"date-parts":[[2023,10,21]]},"assertion":[{"value":"2023-10-21","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}