{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,1]],"date-time":"2026-02-01T05:12:53Z","timestamp":1769922773638,"version":"3.49.0"},"reference-count":33,"publisher":"MDPI AG","issue":"1","license":[{"start":{"date-parts":[[2024,12,25]],"date-time":"2024-12-25T00:00:00Z","timestamp":1735084800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"European Union\u2019s Horizon 2020 WIDESPREAD-2018\u20132020 TEAMING Phase 2 Program","award":["857155"],"award-info":[{"award-number":["857155"]}]},{"name":"European Union\u2019s Horizon 2020 WIDESPREAD-2018\u20132020 TEAMING Phase 2 Program","award":["BG05M2OP001-1.003-0002-C01"],"award-info":[{"award-number":["BG05M2OP001-1.003-0002-C01"]}]},{"name":"European Union\u2019s Horizon 2020 WIDESPREAD-2018\u20132020 TEAMING Phase 2 Program","award":["35401"],"award-info":[{"award-number":["35401"]}]},{"name":"Operational Program Science and Education for Smart Growth","award":["857155"],"award-info":[{"award-number":["857155"]}]},{"name":"Operational Program Science and Education for Smart Growth","award":["BG05M2OP001-1.003-0002-C01"],"award-info":[{"award-number":["BG05M2OP001-1.003-0002-C01"]}]},{"name":"Operational Program Science and Education for Smart Growth","award":["35401"],"award-info":[{"award-number":["35401"]}]},{"name":"Remote Detection of Motor Paroxysms (REDEMP)","award":["857155"],"award-info":[{"award-number":["857155"]}]},{"name":"Remote Detection of Motor Paroxysms (REDEMP)","award":["BG05M2OP001-1.003-0002-C01"],"award-info":[{"award-number":["BG05M2OP001-1.003-0002-C01"]}]},{"name":"Remote Detection of Motor Paroxysms (REDEMP)","award":["35401"],"award-info":[{"award-number":["35401"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Information"],"abstract":"<jats:p>Motivation. 
The visual tracking of patients with specific adverse conditions such as epileptic seizures is an important task related to the prevention of unwanted medical situations and events. Previously, we have developed algorithms for contactless patient tracking based on optical flow analysis. In this work, we address some of the challenges faced by the single-camera tracking system and expand its functionalities by employing simultaneous input from multiple cameras. Methods. We propose a new approach to fusing multiple camera sensors. It uses a proprietary motion-group parameter reconstruction algorithm and covers scenarios with both overlapping and non-overlapping fields of view. In the first case, simultaneous tracking within the overlapping field evolves from independent tracking by each camera to synchronized tracking by a set of cameras. This is achieved through automated reinforcement learning and the simultaneous application of the interdependencies between the cameras. In addition, outside the overlapping areas, the algorithm can transfer tracking from one camera to another, provided a tree-type topology between the areas is present. Results. We demonstrate that synchronous, multi-camera tracking scenarios provide improvements in both real-world and simulated tests. This new approach improves the accuracy and robustness of the original methods, extends the tracking coverage, and provides other beneficial effects, such as more precise detection of fast-moving objects. 
The proposed method is compared with other algorithms used in the field.<\/jats:p>","DOI":"10.3390\/info16010004","type":"journal-article","created":{"date-parts":[[2024,12,25]],"date-time":"2024-12-25T19:29:52Z","timestamp":1735154992000},"page":"4","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["Multiple-Camera Patient Tracking Method Based on Motion-Group Parameter Reconstruction"],"prefix":"10.3390","volume":"16","author":[{"ORCID":"https:\/\/orcid.org\/0009-0003-5383-3030","authenticated-orcid":false,"given":"Simeon","family":"Karpuzov","sequence":"first","affiliation":[{"name":"GATE Institute, Sofia University, 1164 Sofia, Bulgaria"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-0205-585X","authenticated-orcid":false,"given":"George","family":"Petkov","sequence":"additional","affiliation":[{"name":"GATE Institute, Sofia University, 1164 Sofia, Bulgaria"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-7028-7778","authenticated-orcid":false,"given":"Stiliyan","family":"Kalitzin","sequence":"additional","affiliation":[{"name":"Stichting Epilepsie Instellingen Nederland (SEIN), Achterweg 5, 2103 SW Heemstede, The Netherlands"},{"name":"Image Sciences Institute, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX Utrecht, The Netherlands"}]}],"member":"1968","published-online":{"date-parts":[[2024,12,25]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"347","DOI":"10.1053\/seiz.1999.0306","article-title":"SUDEP: Overview of definitions and review of incidence data","volume":"8","author":"Annegers","year":"1999","journal-title":"Seizure"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"55","DOI":"10.1179\/016164104773026534","article-title":"A neural-network-based detection of epilepsy","volume":"26","author":"Nigam","year":"2004","journal-title":"Neurol. 
Res."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Sopic, D., Aminifar, A., and Atienza, D. (2018, January 27\u201330). e-glass: A wearable system for real-time detection of epileptic seizures. Proceedings of the 2018 IEEE International Symposium on Circuits and Systems (ISCAS), Florence, Italy.","DOI":"10.1109\/ISCAS.2018.8351728"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Ahammad, N., Fathima, T., and Joseph, P. (2014). Detection of epileptic seizure event and onset using EEG. BioMed Res. Int., 2014.","DOI":"10.1155\/2014\/450573"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1133","DOI":"10.1016\/j.cmpb.2012.08.005","article-title":"Vision-based motion detection, analysis and recognition of epileptic seizures\u2014A systematic review","volume":"108","author":"Pediaditis","year":"2012","journal-title":"Comput. Methods Programs Biomed."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Cuppens, K., Vanrumste, B., Ceulemans, B., Lagae, L., and Van Huffel, S. (2010, January 19\u201321). Detection of epileptic seizures using video data. Proceedings of the 2010 Sixth International Conference on Intelligent Environments, Kuala Lumpur, Malaysia.","DOI":"10.1109\/IE.2010.77"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"3379","DOI":"10.1109\/TBME.2012.2215609","article-title":"Automatic segmentation of episodes containing epileptic clonic seizures in video sequences","volume":"59","author":"Kalitzin","year":"2012","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Karpuzov, S., Petkov, G., Ilieva, S., Petkov, A., and Kalitzin, S. (2024). Object Tracking Based on Optical Flow Reconstruction of Motion-Group Parameters. Information, 15.","DOI":"10.20944\/preprints202403.0747.v1"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Aires, K.R., Santana, A.M., and Medeiros, A.A. (2008, January 16\u201320). 
Optical flow using color information: Preliminary results. Proceedings of the 2008 ACM Symposium on Applied Computing, Fortaleza, Brazil.","DOI":"10.1145\/1363686.1364064"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Maas, R., Romeny, B.M.T.H., and Viergever, M.A. (1999, January 26\u201327). A Multiscale Taylor Series Approaches to Optic Flow and Stereo: A Generalization of Optic Flow under the Aperture. Proceedings of the Scale-Space Theories in Computer Vision: Second International Conference, Scale-Space\u201999, Corfu, Greece. Proceedings 2.","DOI":"10.1007\/3-540-48236-9_53"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Niessen, W., and Maas, R. (1996). Multiscale optic flow and stereo. Gaussian Scale-Space Theory, Computational Imaging and Vision, Kluwer Academic Publishers.","DOI":"10.1007\/978-94-015-8802-7_3"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"433","DOI":"10.1145\/212094.212141","article-title":"The computation of optical flow","volume":"27","author":"Beauchemin","year":"1995","journal-title":"ACM Comput. Surv. (CSUR)"},{"key":"ref_13","unstructured":"Niessen, W.J., Duncan, J.S., Florack, L.M.J., and Viergever, M.A. (1995, January 18\u201319). Spatiotemporal operators and optic flow. Proceedings of the Workshop on Physics-Based Modeling in Computer Vision, Cambridge, MA, USA."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"161","DOI":"10.1016\/0042-6989(86)90078-7","article-title":"Optic flow","volume":"26","author":"Koenderink","year":"1986","journal-title":"Vis. Res."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"185","DOI":"10.1016\/0004-3702(81)90024-2","article-title":"Determining optical flow","volume":"17","author":"Horn","year":"1980","journal-title":"Artif. Intell."},{"key":"ref_16","unstructured":"Kalitzin, S., Geertsema, E.E., and Petkov, G. (2018). Optical Flow Group-Parameter Reconstruction from Multi-Channel Image Sequences. 
Applications of Intelligent Systems, IOS Press."},{"key":"ref_17","unstructured":"Figueira, D., Taiana, M., Nambiar, A., Nascimento, J., and Bernardino, A. (12, January 6\u20137). The HDA+ data set for research on fully automated re-identification systems. Proceedings of the Computer Vision-ECCV 2014 Workshops, Zurich, Switzerland. Proceedings, Part III 13."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"540","DOI":"10.1109\/TCSVT.2016.2556538","article-title":"From the lab to the real world: Re-identification in an airport camera network","volume":"27","author":"Camps","year":"2016","journal-title":"IEEE Trans. Circuits Syst. Video Technol."},{"key":"ref_19","unstructured":"Zhang, P., Tao, Z., Yang, W., Chen, M., Ding, S., Liu, X., Yang, R., and Zhang, H. (2021). Unveiling personnel movement in a larger indoor area with a non-overlapping multi-camera system. arXiv."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"43","DOI":"10.1016\/j.cviu.2007.06.005","article-title":"Incremental, scalable tracking of objects inter camera","volume":"111","author":"Gilbert","year":"2008","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Porikli, F., and Divakaran, A. (2003, January 6\u20139). Multi-camera calibration, object tracking and query generation. Proceedings of the 2003 International Conference on Multimedia and Expo. ICME\u201903, Baltimore, MD, USA. Proceedings (Cat. No. 03TH8698).","DOI":"10.1109\/ICME.2003.1221002"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Dick, A.R., and Brooks, M.J. (2004, January 4\u20136). A stochastic approach to tracking objects across multiple cameras. 
Proceedings of the Australasian Joint Conference on Artificial Intelligence, Cairns, Australia.","DOI":"10.1007\/978-3-540-30549-1_15"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"13386","DOI":"10.1109\/TCSVT.2024.3447670","article-title":"Enhancing Multi-Camera Gymnast Tracking Through Domain Knowledge Integration","volume":"34","author":"Yang","year":"2024","journal-title":"IEEE Trans. Circuits Syst. Video Technol."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"126558","DOI":"10.1016\/j.neucom.2023.126558","article-title":"Multi-camera multi-object tracking: A review of current trends and future advances","volume":"552","author":"Amosa","year":"2023","journal-title":"Neurocomputing"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Fei, L., and Han, B. (2023). Multi-object multi-camera tracking based on deep learning for intelligent transportation: A review. Sensors, 23.","DOI":"10.3390\/s23083852"},{"key":"ref_26","first-page":"32","article-title":"A Review on Object Tracking Across Real-World Multi Camera Environment","volume":"174","author":"Cherian","year":"2021","journal-title":"Int. J. Comput. Appl."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"137","DOI":"10.1007\/s41095-023-0334-8","article-title":"A unified multi-view multi-person tracking framework","volume":"10","author":"Yang","year":"2024","journal-title":"Comput. Vis. Media"},{"key":"ref_28","first-page":"1","article-title":"An introduction to sensor fusion","volume":"502","author":"Elmenreich","year":"2002","journal-title":"Vienna Univ. Technol. Austria"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Fung, M.L., Chen, M.Z., and Chen, Y.H. (2017, January 28\u201330). Sensor fusion: A review of methods and applications. 
Proceedings of the 2017 29th Chinese Control and Decision Conference (CCDC), Chongqing, China.","DOI":"10.1109\/CCDC.2017.7979175"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Yeong, D.J., Velasco-Hernandez, G., Barry, J., and Walsh, J. (2021). Sensor and sensor fusion technology in autonomous vehicles: A review. Sensors, 21.","DOI":"10.20944\/preprints202102.0459.v1"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Liu, H., Wu, C., and Wang, H. (2023). Real time object detection using LiDAR and camera fusion for autonomous driving. Sci. Rep., 13.","DOI":"10.1038\/s41598-023-35170-z"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Kumar, G.A., Lee, J.H., Hwang, J., Park, J., Youn, S.H., and Kwon, S. (2020). LiDAR and camera fusion approach for object distance estimation in self-driving vehicles. Symmetry, 12.","DOI":"10.3390\/sym12020324"},{"key":"ref_33","unstructured":"Farhadi, A., and Redmon, J. (2018). Yolov3: An incremental improvement. Computer Vision and Pattern Recognition, Springer."}],"container-title":["Information"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2078-2489\/16\/1\/4\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T17:00:08Z","timestamp":1760115608000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2078-2489\/16\/1\/4"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,12,25]]},"references-count":33,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2025,1]]}},"alternative-id":["info16010004"],"URL":"https:\/\/doi.org\/10.3390\/info16010004","relation":{},"ISSN":["2078-2489"],"issn-type":[{"value":"2078-2489","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,12,25]]}}}