{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,24]],"date-time":"2026-01-24T18:15:48Z","timestamp":1769278548229,"version":"3.49.0"},"reference-count":35,"publisher":"Wiley","license":[{"start":{"date-parts":[[2022,1,10]],"date-time":"2022-01-10T00:00:00Z","timestamp":1641772800000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Scientific Programming"],"published-print":{"date-parts":[[2022,1,10]]},"abstract":"<jats:p>Humans have long mastered the skill of creativity. Attempts to replicate this ability have recently been made using neural networks, which mimic the functioning of the human brain: each unit in the network represents a neuron that transmits messages to other neurons, carrying out subconscious tasks. Rendering an input image in the style of a famous artwork is a task commonly known as nonphotorealistic rendering. Previous approaches rely on directly manipulating the pixel representation of the image. By contrast, using deep neural networks built for image recognition, this paper operates in a feature space that represents the higher-level content of the image. Deep neural networks have previously been applied to object recognition and style recognition, for example, to categorize artworks by their period of creation. This paper uses the Visual Geometry Group (VGG16) neural network to replicate this latent human skill. 
Two images are given as input: a content image, which contains the features to be retained in the output, and a style reference image, which contains the patterns of a famous painting. The two are blended to produce a new image in which the content of the content image is preserved but \u201csketched\u201d in the manner of the style image.<\/jats:p>","DOI":"10.1155\/2022\/2038740","type":"journal-article","created":{"date-parts":[[2022,1,10]],"date-time":"2022-01-10T22:05:07Z","timestamp":1641852307000},"page":"1-9","source":"Crossref","is-referenced-by-count":10,"title":["Deep Convolutional Nets Learning Classification for Artistic Style Transfer"],"prefix":"10.1155","volume":"2022","author":[{"given":"R.","family":"Dinesh Kumar","sequence":"first","affiliation":[{"name":"CSE Department, Siddhartha Institute of Technology and Science, Hyderabad, India"}]},{"given":"E.","family":"Golden Julie","sequence":"additional","affiliation":[{"name":"Department of Computer Science and Engineering, Anna University Regional Campus, Tirunelveli, India"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4881-7103","authenticated-orcid":true,"given":"Y.","family":"Harold Robinson","sequence":"additional","affiliation":[{"name":"School of Information Technology and Engineering, Vellore Institute of Technology, Vellore, India"}]},{"given":"S.","family":"Vimal","sequence":"additional","affiliation":[{"name":"Department of Artificial Intelligence and Data Science, Ramco Institute of Technology, Rajapalayam, India"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-6343-5197","authenticated-orcid":true,"given":"Gaurav","family":"Dhiman","sequence":"additional","affiliation":[{"name":"Department of Computer Science, Government Bikram College of Commerce, Patiala, Punjab, 
India"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-8972-6366","authenticated-orcid":true,"given":"Murugesh","family":"Veerasamy","sequence":"additional","affiliation":[{"name":"Department of Computer Science, College of Computer Science, Bule Hora University, Bule Hora, Ethiopia"}]}],"member":"311","reference":[{"key":"1","doi-asserted-by":"publisher","DOI":"10.1109\/TPAMI.2017.2699184"},{"key":"2","first-page":"645","article-title":"Beautygan: instance-level facial makeup transfer with deep generative adversarial network","author":"T. Li"},{"key":"3","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-030-01246-5_37"},{"key":"4","doi-asserted-by":"publisher","DOI":"10.1109\/cvpr.2018.00012"},{"key":"5","doi-asserted-by":"publisher","DOI":"10.24963\/ijcai.2018\/485"},{"key":"6","doi-asserted-by":"publisher","DOI":"10.1109\/access.2018.2874203"},{"key":"7","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2017.11.008"},{"key":"8","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2016.07.071"},{"key":"9","doi-asserted-by":"publisher","DOI":"10.1109\/cvpr.2017.740"},{"key":"10","article-title":"Texture synthesis using convolutional neural networks","volume":"28","author":"L. A. Gatys","year":"2015","journal-title":"Advances in Neural Information Processing Systems"},{"key":"11","first-page":"1","article-title":"A neural algorithm of artistic style","author":"L. A. Gatys"},{"key":"12","first-page":"1","article-title":"CAN: creative adversarial networks generating \u2018art\u2019 by learning about styles and deviating from style norms","author":"E. 
Ahmed"},{"key":"13","doi-asserted-by":"publisher","DOI":"10.1016\/j.cag.2020.01.001"},{"key":"14","doi-asserted-by":"publisher","DOI":"10.1016\/j.cag.2020.01.002"},{"key":"15","doi-asserted-by":"publisher","DOI":"10.1016\/j.image.2019.08.006"},{"key":"16","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2019.08.075"},{"key":"17","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2019.05.094"},{"key":"18","doi-asserted-by":"publisher","DOI":"10.1016\/j.imavis.2019.04.001"},{"key":"19","doi-asserted-by":"publisher","DOI":"10.1016\/j.engappai.2019.02.011"},{"key":"20","doi-asserted-by":"publisher","DOI":"10.1016\/j.optlastec.2018.06.033"},{"key":"21","doi-asserted-by":"publisher","DOI":"10.1016\/j.cmpb.2019.105268"},{"key":"22","doi-asserted-by":"publisher","DOI":"10.1016\/j.ifacol.2019.12.299"},{"key":"23","doi-asserted-by":"publisher","DOI":"10.1016\/j.neuron.2015.09.003"},{"key":"24","doi-asserted-by":"crossref","article-title":"Interactive style transfer: towards styling user-specified object","author":"J. J. Virtusio","DOI":"10.1109\/VCIP.2018.8698689"},{"key":"25","article-title":"A simple way of multimodal and arbitrary style transfer","author":"A.-D. Nguyen"},{"key":"26","doi-asserted-by":"crossref","article-title":"Cross-modal style transfer","author":"S. 
Chelaramani","DOI":"10.1109\/ICIP.2018.8451734"},{"key":"27","doi-asserted-by":"publisher","DOI":"10.1155\/2014\/781950"},{"key":"28","doi-asserted-by":"publisher","DOI":"10.1155\/2018\/9893867"},{"key":"29","doi-asserted-by":"publisher","DOI":"10.1155\/2020\/8894309"},{"key":"30","doi-asserted-by":"publisher","DOI":"10.1155\/2021\/9312425"},{"key":"31","doi-asserted-by":"publisher","DOI":"10.1016\/j.jvcir.2021.103378"},{"key":"32","doi-asserted-by":"publisher","DOI":"10.1016\/j.jag.2021.102590"},{"key":"33","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2021.08.088"},{"key":"34","doi-asserted-by":"publisher","DOI":"10.1504\/IJBIC.2020.10027535"},{"key":"35","doi-asserted-by":"publisher","DOI":"10.32604\/cmc.2022.017789"}],"container-title":["Scientific Programming"],"original-title":[],"language":"en","link":[{"URL":"http:\/\/downloads.hindawi.com\/journals\/sp\/2022\/2038740.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/downloads.hindawi.com\/journals\/sp\/2022\/2038740.xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/downloads.hindawi.com\/journals\/sp\/2022\/2038740.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,1,10]],"date-time":"2022-01-10T22:05:10Z","timestamp":1641852310000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.hindawi.com\/journals\/sp\/2022\/2038740\/"}},"subtitle":[],"editor":[{"given":"Antonio 
J.","family":"Pe\u00f1a","sequence":"additional","affiliation":[]}],"short-title":[],"issued":{"date-parts":[[2022,1,10]]},"references-count":35,"alternative-id":["2038740","2038740"],"URL":"https:\/\/doi.org\/10.1155\/2022\/2038740","relation":{},"ISSN":["1875-919X","1058-9244"],"issn-type":[{"value":"1875-919X","type":"electronic"},{"value":"1058-9244","type":"print"}],"subject":[],"published":{"date-parts":[[2022,1,10]]}}}