{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,22]],"date-time":"2025-12-22T04:27:49Z","timestamp":1766377669003,"version":"3.38.0"},"reference-count":30,"publisher":"SAGE Publications","issue":"2","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["IDT"],"published-print":{"date-parts":[[2024,6,7]]},"abstract":"<jats:p>Multiple-choice questions are an effective and widely used format in standardized tests for evaluating a learner\u2019s skills and knowledge. Nonetheless, composing multiple-choice questions, and constructing the distractors in particular, is quite difficult. Distractors must be both plausible and incorrect, sufficient to confuse learners who have not mastered the material. Automatic distractor generation is therefore important, as it can support standardized tests across a wide range of domains. In this research, a question-answer generation system with a distractor model is developed by optimizing a T5 model. First, BERT tokenization is used to pre-process the passage\/context and question, which are given as input to train the approach. Then, question and answer generation is performed using the T5 model, trained with the proposed Serial Exponential-Slime Mould approach (SExpSMA). The exponential weighted moving average is extended to a serial exponential weighted moving average and incorporated into the Slime Mould Algorithm (SMA) to form SExpSMA. In addition, the proposed SExpSMA-based T5 model is employed to generate distractors for the questions. 
Finally, experimental analysis shows that the proposed SExpSMA-based T5 model achieves better outcomes on metrics such as ROUGE, BLEU, and METEOR, with values of 0.919, 0.918, and 0.488, respectively.<\/jats:p>","DOI":"10.3233\/idt-230629","type":"journal-article","created":{"date-parts":[[2024,3,19]],"date-time":"2024-03-19T15:53:23Z","timestamp":1710863603000},"page":"1447-1462","source":"Crossref","is-referenced-by-count":1,"title":["SExpSMA-based T5: Serial exponential-slime mould algorithm based T5 model for question answer and distractor generation"],"prefix":"10.1177","volume":"18","author":[{"given":"Nikhila","family":"T. Bhuvan","sequence":"first","affiliation":[{"name":"Information Technology, Kakkanad, Kochi, India"}]},{"given":"Jisha","family":"G","sequence":"additional","affiliation":[{"name":"Computer Science and Engineering, Rajagiri School of Engineering and Technology, Kakkanad, Kochi, India"}]},{"given":"Shamna","family":"N V","sequence":"additional","affiliation":[{"name":"Computer Science and Engineering, P.A. College of Engineering, Mangalore, India"}]}],"member":"179","reference":[{"key":"10.3233\/IDT-230629_ref1","doi-asserted-by":"crossref","first-page":"825","DOI":"10.3389\/fpsyg.2019.00825","article-title":"Multiple-choice item distractor development using topic modeling approaches","volume":"10","author":"Shin","year":"2019","journal-title":"Frontiers in Psychology"},{"key":"10.3233\/IDT-230629_ref2","doi-asserted-by":"crossref","unstructured":"Chung HL, Chan YH, Fan YC. A BERT-based distractor generation scheme with multi-tasking and negative answer training strategies. arXiv preprint arXiv: 2010.05384. 2020 Oct 12.","DOI":"10.18653\/v1\/2020.findings-emnlp.393"},{"key":"10.3233\/IDT-230629_ref3","unstructured":"Latcinnik V, Berant J. Explaining question answering models through text generation. arXiv preprint arXiv: 2004.05569. 
2020 Apr 12."},{"key":"10.3233\/IDT-230629_ref4","unstructured":"Offerijns J, Verberne S, Verhoef T. Better distractions: Transformer-based distractor generation and multiple choice question filtering. arXiv preprint arXiv: 2010.09598. 2020 Oct 19."},{"key":"10.3233\/IDT-230629_ref5","doi-asserted-by":"crossref","unstructured":"Pang RY, Parrish A, Joshi N, Nangia N, Phang J, Chen A, Padmakumar V, Ma J, Thompson J, He H, Bowman SR. QuALITY: Question answering with long input texts, yes! arXiv preprint arXiv: 2112.08608. 2021 Dec 16.","DOI":"10.18653\/v1\/2022.naacl-main.391"},{"key":"10.3233\/IDT-230629_ref6","doi-asserted-by":"crossref","unstructured":"Yeh YT, Chen YN. QAInfomax: Learning robust question answering system by mutual information maximization. arXiv preprint arXiv: 1909.00215. 2019 Aug 31.","DOI":"10.18653\/v1\/D19-1333"},{"key":"10.3233\/IDT-230629_ref7","first-page":"321","article-title":"Leaf: Multiple-choice question generation","author":"Vachev","year":"2022","journal-title":"European Conference on Information Retrieval"},{"key":"10.3233\/IDT-230629_ref8","doi-asserted-by":"crossref","first-page":"75816","DOI":"10.1109\/ACCESS.2022.3191678","article-title":"Building a question answering system for the manufacturing domain","volume":"10","author":"Xingguang","year":"2022","journal-title":"IEEE Access"},{"key":"10.3233\/IDT-230629_ref9","first-page":"1","article-title":"Distractor generation with generative adversarial nets for automatically creating fill-in-the-blank questions","author":"Liang","year":"2017","journal-title":"Proceedings of the Knowledge Capture Conference"},{"issue":"4","key":"10.3233\/IDT-230629_ref10","doi-asserted-by":"crossref","first-page":"203","DOI":"10.1080\/00224065.1986.11979014","article-title":"The exponentially weighted moving average","volume":"18","author":"Hunter","year":"1986","journal-title":"Journal of Quality Technology"},{"key":"10.3233\/IDT-230629_ref12","doi-asserted-by":"crossref","first-page":"300","DOI":"10.1016\/j.future.2020.03.055","article-title":"Slime mould algorithm: A new method for stochastic optimization","volume":"111","author":"Li","year":"2020","journal-title":"Future Generation Computer Systems"},{"key":"10.3233\/IDT-230629_ref13","doi-asserted-by":"crossref","first-page":"2032","DOI":"10.1145\/3366423.3380270","article-title":"Asking questions the human way: Scalable question-answer generation from text corpus","author":"Liu","year":"2020","journal-title":"Proceedings of The Web Conference 2020"},{"key":"10.3233\/IDT-230629_ref14","doi-asserted-by":"crossref","first-page":"866","DOI":"10.18653\/v1\/D17-1090","article-title":"Question generation for question answering","author":"Duan","year":"2017","journal-title":"Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing"},{"key":"10.3233\/IDT-230629_ref15","first-page":"255","article-title":"Opinion-aware answer generation for review-driven question answering in e-commerce","author":"Deng","year":"2020","journal-title":"Proceedings of the 29th ACM International Conference on Information & Knowledge Management"},{"key":"10.3233\/IDT-230629_ref17","unstructured":"Devlin J, Chang MW, Lee K, Toutanova K. Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv: 1810.04805. 
2018 Oct 11."},{"key":"10.3233\/IDT-230629_ref18","doi-asserted-by":"crossref","first-page":"621","DOI":"10.1007\/978-3-030-88113-9_50","article-title":"Arabic sentiment analysis using BERT model","author":"Chouikhi","year":"2021","journal-title":"Advances in Computational Collective Intelligence: 13th International Conference, ICCCI 2021, Kallithea, Rhodes, Greece"},{"issue":"1","key":"10.3233\/IDT-230629_ref19","first-page":"5485","article-title":"Exploring the limits of transfer learning with a unified text-to-text transformer","volume":"21","author":"Raffel","year":"2020","journal-title":"The Journal of Machine Learning Research"},{"key":"10.3233\/IDT-230629_ref20","first-page":"65","article-title":"Arabic tokenization system","author":"Attia","year":"2007","journal-title":"Proceedings of the 2007 Workshop on Computational Approaches to Semitic Languages: Common Issues and Resources"},{"key":"10.3233\/IDT-230629_ref21","doi-asserted-by":"crossref","unstructured":"Qiu Z, Wu X, Fan W. Automatic distractor generation for multiple choice questions in standard tests. arXiv preprint arXiv: 2011.13100. 2020 Nov 26.","DOI":"10.18653\/v1\/2020.coling-main.189"},{"key":"10.3233\/IDT-230629_ref22","first-page":"2501","article-title":"Quiz-style question generation for news stories","author":"Lelkes","year":"2021","journal-title":"Proceedings of the Web Conference"},{"key":"10.3233\/IDT-230629_ref23","first-page":"27","article-title":"Automatic distractor generation for domain specific texts","author":"Aldabe","year":"2010","journal-title":"International Conference on Natural Language Processing"},{"key":"10.3233\/IDT-230629_ref24","doi-asserted-by":"crossref","unstructured":"Chung HL, Chan YH, Fan YC. A BERT-based distractor generation scheme with multi-tasking and negative answer training strategies. arXiv preprint arXiv: 2010.05384. 
2020 Oct 12.","DOI":"10.18653\/v1\/2020.findings-emnlp.393"},{"key":"10.3233\/IDT-230629_ref25","doi-asserted-by":"crossref","first-page":"303","DOI":"10.18653\/v1\/W17-5034","article-title":"Multiple choice question generation utilizing an ontology","author":"Stasaski","year":"2017","journal-title":"Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications"},{"key":"10.3233\/IDT-230629_ref26","doi-asserted-by":"crossref","first-page":"284","DOI":"10.18653\/v1\/W18-0533","article-title":"Distractor generation for multiple choice questions using learning to rank","author":"Liang","year":"2018","journal-title":"Proceedings of the Thirteenth Workshop on Innovative use of NLP for Building Educational Applications"},{"key":"10.3233\/IDT-230629_ref27","first-page":"131","article-title":"An unsupervised query rewriting approach using N-gram co-occurrence statistics to find similar phrases in large text corpora","author":"Moen","year":"2019","journal-title":"Proceedings of the 22nd Nordic Conference on Computational Linguistics"},{"key":"10.3233\/IDT-230629_ref28","doi-asserted-by":"crossref","unstructured":"Gupta P, Pagliardini M, Jaggi M. Better word embeddings by disentangling contextual n-gram information. arXiv preprint arXiv: 1904.05033. 
2019 Apr 10.","DOI":"10.18653\/v1\/N19-1098"},{"key":"10.3233\/IDT-230629_ref29","first-page":"1","article-title":"Distractor generation with generative adversarial nets for automatically creating fill-in-the-blank questions","author":"Liang","year":"2017","journal-title":"Proceedings of the Knowledge Capture Conference"},{"key":"10.3233\/IDT-230629_ref30","first-page":"1","article-title":"Replacing out-of-vocabulary words with an appropriate synonym based on Word2VnCR","volume":"2021","author":"Kim","year":"2021","journal-title":"Mobile Information Systems"},{"key":"10.3233\/IDT-230629_ref31","doi-asserted-by":"crossref","first-page":"181","DOI":"10.3115\/v1\/W14-1619","article-title":"Probabilistic modeling of joint-context in distributional similarity","author":"Melamud","year":"2014","journal-title":"Proceedings of the Eighteenth Conference on Computational Natural Language Learning"},{"key":"10.3233\/IDT-230629_ref32","unstructured":"John V. A survey of neural network techniques for feature extraction from text. arXiv preprint arXiv: 1704.08531. 2017 Apr 27."}],"container-title":["Intelligent Decision Technologies"],"original-title":[],"link":[{"URL":"https:\/\/content.iospress.com\/download?id=10.3233\/IDT-230629","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,3,10]],"date-time":"2025-03-10T19:27:03Z","timestamp":1741634823000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/full\/10.3233\/IDT-230629"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,6,7]]},"references-count":30,"journal-issue":{"issue":"2"},"URL":"https:\/\/doi.org\/10.3233\/idt-230629","relation":{},"ISSN":["1872-4981","1875-8843"],"issn-type":[{"type":"print","value":"1872-4981"},{"type":"electronic","value":"1875-8843"}],"subject":[],"published":{"date-parts":[[2024,6,7]]}}}