{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,15]],"date-time":"2026-03-15T15:27:09Z","timestamp":1773588429747,"version":"3.50.1"},"reference-count":64,"publisher":"Association for Computing Machinery (ACM)","issue":"3","content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["ACM J. Comput. Sustain. Soc."],"published-print":{"date-parts":[[2025,9,30]]},"abstract":"<jats:p>The increasing computational demands and related carbon emissions of Artificial Intelligence (AI) models necessitate careful consideration of their environmental impact. This concern is especially important when these models are trained in regions with energy shortages or carbon-intensive power grids (such as coal, natural gas, or oil), as their operational footprint can worsen environmental strain. To address this challenge, we develop a behavior-driven computational framework for selecting AI models that balance accuracy and environmental sustainability. To capture real-world decision-making under sustainability constraints, we propose a behavior-driven framework that analyzes tradeoffs between energy use, carbon emissions, and model performance using insights from Behavioral Portfolio Theory and Cumulative Prospect Theory. We quantify emissions from energy use during supervised fine-tuning across multiple model architectures and datasets, incorporating these results within a multi-objective decision model. Our framework provides actionable insights for selecting AI models that minimize environmental impact without sacrificing accuracy. 
By prioritizing models that balance performance and sustainability, we provide AI developers with practical tools to lower the carbon footprint of AI technologies.<\/jats:p>","DOI":"10.1145\/3736646","type":"journal-article","created":{"date-parts":[[2025,5,20]],"date-time":"2025-05-20T07:18:58Z","timestamp":1747725538000},"page":"1-20","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":3,"title":["A Behavioral Finance Framework for Balancing AI Accuracy and Operational Carbon Emissions"],"prefix":"10.1145","volume":"3","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-0546-9444","authenticated-orcid":false,"given":"Aggrey","family":"Muhebwa","sequence":"first","affiliation":[{"name":"Stanford University","place":["Stanford, United States"]}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9804-8322","authenticated-orcid":false,"given":"Khalid K.","family":"Osman","sequence":"additional","affiliation":[{"name":"Stanford University","place":["Stanford, United States"]}]}],"member":"320","published-online":{"date-parts":[[2025,7,15]]},"reference":[{"key":"e_1_3_2_2_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.jenvp.2005.08.002"},{"key":"e_1_3_2_3_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.jpubeco.2011.03.003"},{"key":"e_1_3_2_4_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.esr.2023.101159"},{"key":"e_1_3_2_5_2","unstructured":"Lasse F. Wolff Anthony Benjamin Kanding and Raghavendra Selvan. 2020. Carbontracker: Tracking and predicting the carbon footprint of training deep learning models. arXiv preprint arXiv:2007.03051 (2020)."},{"key":"e_1_3_2_6_2","article-title":"Gossip-based actor-learner architectures for deep reinforcement learning","volume":"32","author":"Assran Mahmoud","year":"2019","unstructured":"Mahmoud Assran, Joshua Romoff, Nicolas Ballas, Joelle Pineau, and Michael Rabbat. 2019. Gossip-based actor-learner architectures for deep reinforcement learning. 
Advances in Neural Information Processing Systems 32 (2019), 13.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_7_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICASSP48485.2024.10448303"},{"key":"e_1_3_2_8_2","doi-asserted-by":"publisher","DOI":"10.1016\/S1574-0102(03)01027-6"},{"key":"e_1_3_2_9_2","unstructured":"Noman Bashir Priya Donti James Cuff Sydney Sroka Marija Ilic Vivienne Sze Christina Delimitrou and Elsa Olivetti. 2024. The climate and sustainability implications of generative AI. (2024)."},{"key":"e_1_3_2_10_2","first-page":"1877","article-title":"Language models are few-shot learners","volume":"33","author":"Brown Tom","year":"2020","unstructured":"Tom Brown, Benjamin Mann, Nick Ryder, Melanie Subbiah, Jared D Kaplan, Prafulla Dhariwal, Arvind Neelakantan, Pranav Shyam, Girish Sastry, Amanda Askell, et\u00a0al. 2020. Language models are few-shot learners. Advances in Neural Information Processing Systems 33 (2020), 1877\u20131901.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_11_2","first-page":"622","volume-title":"Proceedings of the Asian Conference on Machine Learning","author":"Cai Ermao","year":"2017","unstructured":"Ermao Cai, Da-Cheng Juan, Dimitrios Stamoulis, and Diana Marculescu. 2017. Neuralpower: Predict and deploy energy-efficient convolutional neural networks. In Proceedings of the Asian Conference on Machine Learning. PMLR, 622\u2013637."},{"key":"e_1_3_2_12_2","unstructured":"Han Cai Chuang Gan Tianzhe Wang Zhekai Zhang and Song Han. 2019. Once-for-all: Train one network and specialize it for efficient deployment. arXiv preprint arXiv:1908.09791 (2019)."},{"key":"e_1_3_2_13_2","unstructured":"Alfredo Canziani Adam Paszke and Eugenio Culurciello. 2016. An analysis of deep neural network models for practical applications. 
arXiv preprint arXiv:1605.07678 (2016)."},{"key":"e_1_3_2_14_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.jclepro.2021.130133"},{"issue":"240","key":"e_1_3_2_15_2","first-page":"1","article-title":"Palm: Scaling language modeling with pathways","volume":"24","author":"Chowdhery Aakanksha","year":"2023","unstructured":"Aakanksha Chowdhery, Sharan Narang, Jacob Devlin, Maarten Bosma, Gaurav Mishra, Adam Roberts, Paul Barham, Hyung Won Chung, Charles Sutton, Sebastian Gehrmann, et\u00a0al. 2023. Palm: Scaling language modeling with pathways. Journal of Machine Learning Research 24, 240 (2023), 1\u2013113.","journal-title":"Journal of Machine Learning Research"},{"key":"e_1_3_2_16_2","doi-asserted-by":"publisher","DOI":"10.1086\/674872"},{"key":"e_1_3_2_17_2","doi-asserted-by":"publisher","unstructured":"\u0130stemi \u00c7\u00f6mlek\u00e7i and Ali \u00d6zer. 2018. Behavioral finance models anomalies and factors affecting investor psychology. In Global Approaches in Financial Economics Banking and Finance Hasan Dincer \u00dcmit Hacioglu and Serhat Y\u00fcksel (Eds.). Springer International Publishing Cham 309\u2013330. DOI:10.1007\/978-3-319-78494-6_15","DOI":"10.1007\/978-3-319-78494-6_15"},{"key":"e_1_3_2_18_2","doi-asserted-by":"publisher","DOI":"10.1111\/joes.12262"},{"key":"e_1_3_2_19_2","doi-asserted-by":"publisher","unstructured":"Benoit Courty Victor Schmidt Goyal-Kamal MarionCoutarel Boris Feld J\u00e9r\u00e9my Lecourt LiamConnell SabAmine inimaz supatomic Mathilde L\u00e9val Luis Blanche Alexis Cruveiller ouminasara Franklin Zhao Aditya Joshi Alexis Bogroff Amine Saboni Hugues de Lavoreille Niko Laskaris Edoardo Abati Douglas Blank Ziyao Wang Armin Catovic alencon Micha\u0142 St\u0229ch\u0142y Christian Bauer Lucas-Otavio JPW and MinervaBooks. 2024. mlco2\/codecarbon: v2.4.1. Zenodo. 
DOI:10.5281\/zenodo.11171501","DOI":"10.5281\/zenodo.11171501"},{"key":"e_1_3_2_20_2","doi-asserted-by":"publisher","DOI":"10.1038\/s41598-023-40021-y"},{"key":"e_1_3_2_21_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-0-85729-652-8_1"},{"key":"e_1_3_2_22_2","doi-asserted-by":"publisher","DOI":"10.1109\/TEVC.2013.2281535"},{"key":"e_1_3_2_23_2","doi-asserted-by":"crossref","unstructured":"Jacob Devlin Ming-Wei Chang Kenton Lee and Kristina Toutanova. 2019. Bert: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (long and short papers) Vol. 1. 4171\u20134186.","DOI":"10.18653\/v1\/N19-1423"},{"key":"e_1_3_2_24_2","doi-asserted-by":"publisher","DOI":"10.1145\/3531146.3533234"},{"key":"e_1_3_2_25_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-030-01252-6_32"},{"key":"e_1_3_2_26_2","unstructured":"Alexey Dosovitskiy Lucas Beyer Alexander Kolesnikov Dirk Weissenborn Xiaohua Zhai Thomas Unterthiner Mostafa Dehghani Matthias Minderer Georg Heigold Sylvain Gelly and others. 2020. An image is worth \\(16\\times 16\\) words: Transformers for image recognition at scale. arXiv preprint arXiv:2010.11929 (2020)."},{"key":"e_1_3_2_27_2","unstructured":"U.S. Energy Information Administration. (n.d.). Electricity. U.S. Department of Energy. Retrieved June 15 2024 from https:\/\/www.eia.gov\/electricity\/"},{"key":"e_1_3_2_28_2","unstructured":"Mikhail Evchenko Joaquin Vanschoren Holger H. Hoos Marc Schoenauer and Mich\u00e8le Sebag. 2021. Frugal machine learning. 
arXiv preprint arXiv:2111.03731 (2021)."},{"key":"e_1_3_2_29_2","doi-asserted-by":"publisher","DOI":"10.1257\/jep.32.4.53"},{"key":"e_1_3_2_30_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.buildenv.2003.10.008"},{"key":"e_1_3_2_31_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijme.2019.100330"},{"key":"e_1_3_2_32_2","unstructured":"Song Han Huizi Mao and William J. Dally. 2015. Deep compression: Compressing deep neural networks with pruning trained quantization and huffman coding. arXiv preprint arXiv:1510.00149 (2015)."},{"key":"e_1_3_2_33_2","doi-asserted-by":"publisher","DOI":"10.1109\/IGARSS.2018.8519248"},{"issue":"248","key":"e_1_3_2_34_2","first-page":"1","article-title":"Towards the systematic reporting of the energy and carbon footprints of machine learning","volume":"21","author":"Henderson Peter","year":"2020","unstructured":"Peter Henderson, Jieru Hu, Joshua Romoff, Emma Brunskill, Dan Jurafsky, and Joelle Pineau. 2020. Towards the systematic reporting of the energy and carbon footprints of machine learning. Journal of Machine Learning Research 21, 248 (2020), 1\u201343. Retrieved from http:\/\/jmlr.org\/papers\/v21\/20-312.html","journal-title":"Journal of Machine Learning Research"},{"key":"e_1_3_2_35_2","unstructured":"Geoffrey Hinton Oriol Vinyals and Jeff Dean. 2015. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531 (2015)."},{"key":"e_1_3_2_36_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2018.00286"},{"key":"e_1_3_2_37_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.ecolecon.2004.12.027"},{"key":"e_1_3_2_38_2","unstructured":"Alexandre Lacoste Alexandra Luccioni Victor Schmidt and Thomas Dandres. 2019. Quantifying the carbon emissions of machine learning. 
arXiv preprint arXiv:1910.09700 (2019)."},{"key":"e_1_3_2_39_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICCV48922.2021.00986"},{"key":"e_1_3_2_40_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.01167"},{"issue":"253","key":"e_1_3_2_41_2","first-page":"1","article-title":"Estimating the carbon footprint of bloom, a 176b parameter language model","volume":"24","author":"Luccioni Alexandra Sasha","year":"2023","unstructured":"Alexandra Sasha Luccioni, Sylvain Viguier, and Anne-Laure Ligozat. 2023. Estimating the carbon footprint of bloom, a 176b parameter language model. Journal of Machine Learning Research 24, 253 (2023), 1\u201315.","journal-title":"Journal of Machine Learning Research"},{"key":"e_1_3_2_42_2","doi-asserted-by":"publisher","DOI":"10.1145\/3341302.3342080"},{"key":"e_1_3_2_43_2","doi-asserted-by":"publisher","DOI":"10.1109\/JSAC.2016.2611964"},{"key":"e_1_3_2_44_2","volume-title":"Nonlinear Multiobjective Optimization","author":"Miettinen Kaisa","year":"1999","unstructured":"Kaisa Miettinen. 1999. Nonlinear Multiobjective Optimization. Springer Science and Business Media."},{"key":"e_1_3_2_45_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v37i12.26686"},{"key":"e_1_3_2_46_2","doi-asserted-by":"publisher","DOI":"10.1145\/3608112"},{"key":"e_1_3_2_47_2","doi-asserted-by":"publisher","DOI":"10.1109\/CLOUD62652.2024.00062"},{"key":"e_1_3_2_48_2","doi-asserted-by":"publisher","DOI":"10.1109\/MC.2022.3148714"},{"key":"e_1_3_2_49_2","unstructured":"David Patterson Joseph Gonzalez Quoc Le Chen Liang Lluis-Miquel Munguia Daniel Rothchild David So Maud Texier and Jeff Dean. 2021. Carbon emissions and large neural network training. 
arXiv preprint arXiv:2104.10350 (2021)."},{"key":"e_1_3_2_50_2","doi-asserted-by":"publisher","DOI":"10.1109\/TPWRS.2022.3173250"},{"key":"e_1_3_2_51_2","first-page":"8821","volume-title":"Proceedings of the International Conference on Machine Learning","author":"Ramesh Aditya","year":"2021","unstructured":"Aditya Ramesh, Mikhail Pavlov, Gabriel Goh, Scott Gray, Chelsea Voss, Alec Radford, Mark Chen, and Ilya Sutskever. 2021. Zero-shot text-to-image generation. In Proceedings of the International Conference on Machine Learning. Pmlr, 8821\u20138831."},{"key":"e_1_3_2_52_2","doi-asserted-by":"publisher","DOI":"10.1002\/9781118258415.ch8"},{"key":"e_1_3_2_53_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2018.00474"},{"key":"e_1_3_2_54_2","doi-asserted-by":"publisher","DOI":"10.1145\/3381831"},{"key":"e_1_3_2_55_2","unstructured":"Raghavendra Selvan Bob Pepin Christian Igel Gabrielle Samuel and Erik B. Dam. 2025. PePR: Performance Per Resource Unit as a Metric to Promote Small-scale Deep Learning. In Northern Lights Deep Learning Conference PMLR 220\u2013229."},{"key":"e_1_3_2_56_2","doi-asserted-by":"publisher","DOI":"10.2307\/2676187"},{"key":"e_1_3_2_57_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICIPTM54933.2022.9753944"},{"key":"e_1_3_2_58_2","doi-asserted-by":"crossref","unstructured":"Emma Strubell Ananya Ganesh and Andrew McCallum. 2020. Energy and policy considerations for modern deep learning research. In Proceedings of the AAAI Conference on Artificial Intelligence. 13693\u201313696.","DOI":"10.1609\/aaai.v34i09.7123"},{"key":"e_1_3_2_59_2","doi-asserted-by":"publisher","DOI":"10.1057\/9781137366290"},{"key":"e_1_3_2_60_2","first-page":"6105","volume-title":"Proceedings of the International Conference on Machine Learning","author":"Tan Mingxing","year":"2019","unstructured":"Mingxing Tan and Quoc Le. 2019. Efficientnet: Rethinking model scaling for convolutional neural networks. 
In Proceedings of the International Conference on Machine Learning. PMLR, 6105\u20136114."},{"key":"e_1_3_2_61_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICAC.2006.1662383"},{"key":"e_1_3_2_62_2","first-page":"10347","volume-title":"Proceedings of the International Conference on Machine Learning","author":"Touvron Hugo","year":"2021","unstructured":"Hugo Touvron, Matthieu Cord, Matthijs Douze, Francisco Massa, Alexandre Sablayrolles, and Herv\u00e9 J\u00e9gou. 2021. Training data-efficient image transformers and distillation through attention. In Proceedings of the International Conference on Machine Learning. PMLR, 10347\u201310357."},{"key":"e_1_3_2_63_2","doi-asserted-by":"publisher","DOI":"10.1007\/BF00122574"},{"key":"e_1_3_2_64_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPRW63382.2024.00545"},{"key":"e_1_3_2_65_2","doi-asserted-by":"publisher","DOI":"10.1145\/3604930.3605708"}],"container-title":["ACM Journal on Computing and Sustainable Societies"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3736646","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,3,15]],"date-time":"2026-03-15T13:41:27Z","timestamp":1773582087000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3736646"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,7,15]]},"references-count":64,"journal-issue":{"issue":"3","published-print":{"date-parts":[[2025,9,30]]}},"alternative-id":["10.1145\/3736646"],"URL":"https:\/\/doi.org\/10.1145\/3736646","relation":{},"ISSN":["2834-5533"],"issn-type":[{"value":"2834-5533","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,7,15]]},"assertion":[{"value":"2024-09-25","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2025-05-06","order":2,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2025-07-15","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}