{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,11,19]],"date-time":"2025-11-19T07:05:41Z","timestamp":1763535941314,"version":"3.37.3"},"reference-count":47,"publisher":"Springer Science and Business Media LLC","license":[{"start":{"date-parts":[[2022,2,22]],"date-time":"2022-02-22T00:00:00Z","timestamp":1645488000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,2,22]],"date-time":"2022-02-22T00:00:00Z","timestamp":1645488000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100001809","name":"national natural science foundation of china","doi-asserted-by":"publisher","award":["51975051"],"award-info":[{"award-number":["51975051"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Vis Comput"],"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Accurate and informative hand-object collision feedback is of vital importance for hand manipulation in virtual reality (VR). However, to our best knowledge, the hand movement performance in fully-occluded and confined VR spaces under visual collision feedback is still under investigation. In this paper, we firstly studied the effects of several popular visual feedback of hand-object collision on hand movement performance. To test the effects, we conducted a within-subject user study (<jats:italic>n<\/jats:italic>=18) using a target-reaching task in a confined box. Results indicated that users had the best task performance with see-through visualization, and the most accurate movement with the hybrid of proximity-based gradation and deformation. 
Further analysis indicated that integrating see-through visualization with a proximity-based visual cue could offer the best compromise between speed and accuracy for hand movement in enclosed VR spaces. On this basis, we designed a visual collision feedback technique based on projector decals, which incorporates the advantages of see-through visualization and color gradation. Finally, we present demos of potential uses of the proposed visual cue.<\/jats:p>","DOI":"10.1007\/s00371-022-02424-2","type":"journal-article","created":{"date-parts":[[2022,2,22]],"date-time":"2022-02-22T10:02:59Z","timestamp":1645524179000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":10,"title":["Using visual feedback to improve hand movement accuracy in confined-occluded spaces in virtual reality"],"prefix":"10.1007","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-6318-0446","authenticated-orcid":false,"given":"Yu","family":"Wang","sequence":"first","affiliation":[]},{"given":"Ziran","family":"Hu","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9947-6049","authenticated-orcid":false,"given":"Shouwen","family":"Yao","sequence":"additional","affiliation":[]},{"given":"Hui","family":"Liu","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,2,22]]},"reference":[{"issue":"3","key":"2424_CR1","doi-asserted-by":"publisher","first-page":"265","DOI":"10.1111\/j.1475-1313.2007.00476.x","volume":"27","author":"C Gonz\u00e1lez-Alvarez","year":"2007","unstructured":"Gonz\u00e1lez-Alvarez, C., Subramanian, A., Pardhan, S.: Reaching and grasping with restricted peripheral vision. Ophthalmic Physiol. Opt. 27(3), 265\u2013274 (2007)","journal-title":"Ophthalmic Physiol. 
Opt."},{"issue":"3","key":"2424_CR2","doi-asserted-by":"publisher","first-page":"305","DOI":"10.1162\/PRES_a_00115","volume":"21","author":"MJ Fu","year":"2012","unstructured":"Fu, M.J., Hershberger, A.D., Sano, K., \u00c7avu\u015fo\u011flu, M.C.: Effect of visuomotor colocation on 3d fitts\u2019 task performance in physical and virtual environments. Presence 21(3), 305\u2013320 (2012)","journal-title":"Presence"},{"key":"2424_CR3","unstructured":"Valentina, G.: The role of peripheral visual cues in planning and controlling movement:| ban investigation of which cues provided by different parts of the visual field influence the execution of movement and how they work to control upper and lower limb motion. PhD Thesis, University of Bradford. See also http:\/\/hdl.handle.net\/10454\/5715 (2013)"},{"key":"2424_CR4","doi-asserted-by":"crossref","unstructured":"Liu, L., van Liere, R., Nieuwenhuizen, C., Martens, J.-B.: Comparing aimed movements in the real world and in virtual reality. In 2009 IEEE Virtual Reality Conference, IEEE, pp.\u00a0219\u2013222 (2009)","DOI":"10.1109\/VR.2009.4811026"},{"key":"2424_CR5","doi-asserted-by":"crossref","unstructured":"Chapoulie, E., Tsandilas, T., Oehlberg, L., Mackay, W., Drettakis, G.: Finger-based manipulation in immersive spaces and the real world. In 2015 IEEE Symposium on 3D User Interfaces (3DUI), IEEE, pp.\u00a0109\u2013116 (2015)","DOI":"10.1109\/3DUI.2015.7131734"},{"key":"2424_CR6","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/j.humov.2014.11.009","volume":"40","author":"L Zhang","year":"2015","unstructured":"Zhang, L., Yang, J., Inai, Y., Huang, Q., Wu, J.: Effects of aging on pointing movements under restricted visual feedback conditions. Hum. Mov. Sci. 40, 1\u201313 (2015)","journal-title":"Hum. Mov. 
Sci."},{"key":"2424_CR7","doi-asserted-by":"publisher","first-page":"102515","DOI":"10.1016\/j.humov.2019.102515","volume":"67","author":"JD Bell","year":"2019","unstructured":"Bell, J.D., Macuga, K.L.: Goal-directed aiming under restricted viewing conditions with confirmatory sensory feedback. Hum. Mov. Sci. 67, 102515 (2019)","journal-title":"Hum. Mov. Sci."},{"key":"2424_CR8","doi-asserted-by":"crossref","unstructured":"Bloomfield, A., Badler, N.\u00a0I.: Collision awareness using vibrotactile arrays. In 2007 IEEE Virtual Reality Conference, IEEE, pp.\u00a0163\u2013170 (2007)","DOI":"10.1109\/VR.2007.352477"},{"issue":"11","key":"2424_CR9","doi-asserted-by":"publisher","first-page":"1015","DOI":"10.1080\/10447318.2017.1411665","volume":"34","author":"C Louison","year":"2018","unstructured":"Louison, C., Fabien, F., Daniel, R.M.: Spatialized vibrotactile feedback improves goal-directed movements in cluttered virtual environments. Int. J. Human-Comput. Int. 34(11), 1015\u20131031 (2018)","journal-title":"Int. J. Human-Comput. Int."},{"key":"2424_CR10","doi-asserted-by":"crossref","unstructured":"Sagardia, M., Hulin, T., Hertkorn, K., Kremer, P., Sch\u00e4tzle, S.: A platform for bimanual virtual assembly training with haptic feedback in large multi-object environments. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology, Association for Computing Machinery, pp.\u00a0153\u2013162 (2016)","DOI":"10.1145\/2993369.2993386"},{"issue":"1","key":"2424_CR11","doi-asserted-by":"publisher","first-page":"379","DOI":"10.1007\/s00170-011-3381-8","volume":"58","author":"P Xia","year":"2012","unstructured":"Xia, P., Lopes, A.M., Restivo, M.T., Yao, Y.: A new type haptics-based virtual environment system for assembly training of complex products. Int. J. Adv. Manuf. Technol 58(1), 379\u2013396 (2012)","journal-title":"Int. J. Adv. Manuf. 
Technol"},{"key":"2424_CR12","doi-asserted-by":"crossref","unstructured":"Lindeman, R.\u00a0W., Page, R., Yanagida, Y., Sibert, J.\u00a0L.: Towards full-body haptic feedback: the design and deployment of a spatialized vibrotactile feedback system. In Proceedings of the ACM symposium on Virtual reality software and technology, pp.\u00a0146\u2013149 (2004)","DOI":"10.1145\/1077534.1077562"},{"key":"2424_CR13","first-page":"241","volume-title":"Int. Conf. Virtual","author":"B Weber","year":"2013","unstructured":"Weber, B., Sagardia, M., Hulin, T., Preusche, C.: Visual, vibrotactile, and force feedback of collisions in virtual environments: effects on performance, mental workload and spatial orientation. In: Int. Conf. Virtual, pp. 241\u2013250. Springer, Augmented and Mixed Reality (2013)"},{"key":"2424_CR14","doi-asserted-by":"publisher","first-page":"323","DOI":"10.1016\/j.apergo.2015.06.024","volume":"53","author":"G Lawson","year":"2016","unstructured":"Lawson, G., Salanitri, D., Waterfield, B.: Future directions for the development of virtual reality within an automotive manufacturer. Appl. Ergon. 53, 323\u2013330 (2016)","journal-title":"Appl. Ergon."},{"key":"2424_CR15","doi-asserted-by":"crossref","unstructured":"Zimmermann, P.: Virtual reality aided design. a survey of the use of vr in automotive industry. Product Engineering, pp.\u00a0277\u2013296 (2008)","DOI":"10.1007\/978-1-4020-8200-9_13"},{"issue":"5","key":"2424_CR16","doi-asserted-by":"publisher","first-page":"1013","DOI":"10.1109\/TVCG.2006.189","volume":"12","author":"J Sreng","year":"2006","unstructured":"Sreng, J., L\u00e9cuyer, A., M\u00e9gard, C., Andriot, C.: Using visual cues of contact to improve interactive manipulation of virtual objects in industrial assembly\/maintenance simulations. IEEE Trans. Visual Comput. Graphics 12(5), 1013\u20131020 (2006)","journal-title":"IEEE Trans. Visual Comput. 
Graphics"},{"key":"2424_CR17","doi-asserted-by":"crossref","unstructured":"Prachyabrued, M., Borst, C.\u00a0W.: Visual feedback for virtual grasping. In 2014 IEEE Symposium on 3D User Interfaces (3DUI), IEEE, pp.\u00a019\u201326 (2014)","DOI":"10.1109\/3DUI.2014.6798835"},{"issue":"1","key":"2424_CR18","doi-asserted-by":"publisher","first-page":"47","DOI":"10.1007\/s10055-017-0313-4","volume":"22","author":"S Vosinakis","year":"2018","unstructured":"Vosinakis, S., Koutsabasis, P.: Evaluation of visual feedback techniques for virtual grasping with bare hands using leap motion and oculus rift. Virtual Reality 22(1), 47\u201362 (2018)","journal-title":"Virtual Reality"},{"key":"2424_CR19","doi-asserted-by":"crossref","unstructured":"Samad, M., Gatti, E., Hermes, A., Benko, H., Parise, C.:Pseudo-haptic weight: changing the perceived weight of virtual objects by manipulating control-display ratio. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp.\u00a01\u201313 (2019)","DOI":"10.1145\/3290605.3300550"},{"key":"2424_CR20","doi-asserted-by":"crossref","unstructured":"Vance, J.\u00a0M., Dumont, G.: A conceptual framework to support natural interaction for virtual assembly tasks. In World Conference on Innovative Virtual Reality, Vol.\u00a044328, American Society of Mechanical Engineers, pp.\u00a0273\u2013278 (2011)","DOI":"10.1115\/WINVR2011-5570"},{"key":"2424_CR21","doi-asserted-by":"crossref","unstructured":"Sarlegna, F.\u00a0R., Sainburg, R.\u00a0L.: The roles of vision and proprioception in the planning of reaching movements. Progress in motor control, pp.\u00a0317\u2013335 (2009)","DOI":"10.1007\/978-0-387-77064-2_16"},{"key":"2424_CR22","volume-title":"Motor control: translating research into clinical practice","author":"A Shumway-Cook","year":"2007","unstructured":"Shumway-Cook, A., Woollacott, M.H.: Motor control: translating research into clinical practice. 
Lippincott Williams & Wilkins, US (2007)"},{"key":"2424_CR23","doi-asserted-by":"crossref","unstructured":"Waller, D., Hodgson, E.: Sensory contributions to spatial knowledge of real and virtual environments. In Human walking in virtual environments. Springer, pp.\u00a03\u201326 (2013)","DOI":"10.1007\/978-1-4419-8432-6_1"},{"key":"2424_CR24","unstructured":"Ng, A.W., Chan, A.H.: Finger response times to visual, auditory and tactile modality stimuli. In Proceedings of the International Multiconference of Engineers and Computer Scientists 2, 1449\u20131454 (2012)"},{"key":"2424_CR25","doi-asserted-by":"crossref","unstructured":"Fan, H., Zhuo, T., Yu, X., Yang, Y., Kankanhalli, M.: Understanding atomic hand-object interaction with human intention. IEEE Trans. Circuits. Syst. Video Technol. (2021)","DOI":"10.1109\/TCSVT.2021.3058688"},{"issue":"5","key":"2424_CR26","doi-asserted-by":"publisher","first-page":"1095","DOI":"10.1109\/TVCG.2008.59","volume":"14","author":"N Elmqvist","year":"2008","unstructured":"Elmqvist, N., Tsigas, P.: A taxonomy of 3d occlusion management for visualization. IEEE Trans. Visual Comput. Graphics 14(5), 1095\u20131109 (2008)","journal-title":"IEEE Trans. Visual Comput. Graphics"},{"key":"2424_CR27","doi-asserted-by":"publisher","first-page":"107","DOI":"10.1016\/j.cognition.2017.03.024","volume":"166","author":"T Iachini","year":"2017","unstructured":"Iachini, T., Ruotolo, F., Vinciguerra, M., Ruggiero, G.: Manipulating time and space: collision prediction in peripersonal and extrapersonal space. Cognition 166, 107\u2013117 (2017)","journal-title":"Cognition"},{"key":"2424_CR28","doi-asserted-by":"crossref","unstructured":"Fricke, N., Th\u00fcring, M.: Complementary audio-visual collision warnings. 
In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol.\u00a053, SAGE Publications Sage CA: Los Angeles, CA, pp.\u00a01815\u20131819 (2009)","DOI":"10.1177\/154193120905302315"},{"key":"2424_CR29","doi-asserted-by":"crossref","unstructured":"Biocca, F., Kim, J., Choi, Y.: Visual touch in virtual environments: an exploratory study of presence, multimodal interfaces, and cross-modal sensory illusions. Presence: Teleoperators & Virtual Environments, 10(3), pp.\u00a0247\u2013265 (2001)","DOI":"10.1162\/105474601300343595"},{"key":"2424_CR30","doi-asserted-by":"crossref","unstructured":"Cortes, G., Argelaguet, F., Marchand, E., L\u00e9cuyer, A.: Virtual shadows for real humans in a cave: Influence on virtual embodiment and 3d interaction. In Proceedings of the 15th ACM Symposium on Applied Perception, pp.\u00a01\u20138 (2018)","DOI":"10.1145\/3225153.3225165"},{"issue":"3","key":"2424_CR31","doi-asserted-by":"publisher","first-page":"405","DOI":"10.1007\/s00371-016-1346-5","volume":"34","author":"MT Eren","year":"2018","unstructured":"Eren, M.T., Balcisoy, S.: Evaluation of x-ray visualization techniques for vertical depth judgments in underground exploration. Vis. Comput. 34(3), 405\u2013416 (2018)","journal-title":"Vis. Comput."},{"key":"2424_CR32","doi-asserted-by":"crossref","unstructured":"Ariza, O., Bruder, G., Katzakis, N., Steinicke, F.: Analysis of proximity-based multimodal feedback for 3d selection in immersive virtual environments. In 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), IEEE, pp.\u00a0327\u2013334 (2018)","DOI":"10.1109\/VR.2018.8446317"},{"issue":"4","key":"2424_CR33","first-page":"148","volume":"4","author":"S Dey","year":"2015","unstructured":"Dey, S., Kapoor, A.: Hand length and hand breadth: a study of correlation statistics among human population. Int. J. Sci. Res. 4(4), 148\u2013150 (2015)","journal-title":"Int. J. Sci. 
Res."},{"key":"2424_CR34","doi-asserted-by":"crossref","unstructured":"Andreas, B., L, N.\u00a0S., Mikkel, G., D, L.\u00a0B., J\u00f8rgen, C.\u00a0N.: Single-pass wireframe rendering. In ACM SIGGRAPH 2006 Sketches. pp.\u00a0149\u2013es (2006)","DOI":"10.1145\/1179849.1180035"},{"key":"2424_CR35","unstructured":"UltraLeap, 2017. Hand \u2014 leap motion c# sdk v2.3 documentation. On the WWW, December. https:\/\/developer-archive.leapmotion.com\/documentation\/v2\/csharp\/api\/Leap.Hand.html?proglang=csharp"},{"key":"2424_CR36","unstructured":"UltraLeap, 2010. Unity modules of leap motion. On the WWW. URL https:\/\/leapmotion.github.io\/UnityModules\/"},{"key":"2424_CR37","unstructured":"UltraLeap, 2020. Unity modules leap motion\u2019s unity sdk 4.5.0: Interaction engine, leap motion. On the WWW, May. URL https:\/\/leapmotion.github.io\/UnityModules\/interaction-engine.html"},{"key":"2424_CR38","unstructured":"F, B., Yasuhiro, I., Andy, L., Heather, P., Arthur, T.: Visual cues and virtual touch: Role of visual stimuli and intersensory integration in cross-modal haptic illusions and the sense of presence. Proceedings of presence, pp.\u00a0410\u2013428 (2002)"},{"issue":"2","key":"2424_CR39","doi-asserted-by":"publisher","first-page":"113","DOI":"10.1162\/1054746041382393","volume":"13","author":"S Zhai","year":"2004","unstructured":"Zhai, S., Accot, J., Woltjer, R.: Human action laws in electronic virtual worlds: an empirical study of path steering performance in vr. Presence 13(2), 113\u2013127 (2004)","journal-title":"Presence"},{"issue":"1","key":"2424_CR40","doi-asserted-by":"publisher","first-page":"91","DOI":"10.1207\/s15327051hci0701_3","volume":"7","author":"IS MacKenzie","year":"1992","unstructured":"MacKenzie, I.S.: Fitts\u2019 law as a research and design tool in human-computer interaction. Human-Comput. Interact. 7(1), 91\u2013139 (1992)","journal-title":"Human-Comput. 
Interact."},{"issue":"1","key":"2424_CR41","doi-asserted-by":"publisher","first-page":"18","DOI":"10.1109\/TOH.2019.2961883","volume":"13","author":"T Kawabe","year":"2019","unstructured":"Kawabe, T.: Mid-air action contributes to pseudo-haptic stiffness effects. IEEE Trans. Haptics 13(1), 18\u201324 (2019)","journal-title":"IEEE Trans. Haptics"},{"issue":"11","key":"2424_CR42","doi-asserted-by":"publisher","first-page":"1177","DOI":"10.1007\/s11517-015-1309-4","volume":"53","author":"M Li","year":"2015","unstructured":"Li, M., Konstantinova, J., Secco, E.L., Jiang, A., Liu, H., Nanayakkara, T., Seneviratne, L.D., Dasgupta, P., Althoefer, K., Wurdemann, H.A.: Using visual cues to enhance haptic feedback for palpation on virtual model of soft tissue. Med. Biol. Eng. Comput. 53(11), 1177\u20131186 (2015)","journal-title":"Med. Biol. Eng. Comput."},{"key":"2424_CR43","doi-asserted-by":"crossref","unstructured":"Argelaguet, F., Hoyet, L., Trico, M., L\u00e9cuyer, A.: The role of interaction in virtual embodiment: Effects of the virtual hand representation. In 2016 IEEE Virtual Reality (VR), IEEE, pp.\u00a03\u201310 (2016)","DOI":"10.1109\/VR.2016.7504682"},{"key":"2424_CR44","doi-asserted-by":"crossref","unstructured":"Schwind, V., Lin, L., Di\u00a0Luca, M., J\u00f6rg, S., Hillis, J.: Touch with foreign hands: The effect of virtual hand appearance on visual-haptic integration. In Proceedings of the 15th ACM Symposium on Applied Perception, pp.\u00a01\u20138 (2018)","DOI":"10.1145\/3225153.3225158"},{"issue":"5","key":"2424_CR45","doi-asserted-by":"publisher","first-page":"735","DOI":"10.1017\/S0140525X99002186","volume":"22","author":"Z Dienes","year":"1999","unstructured":"Dienes, Z., Perner, J.: A theory of implicit and explicit knowledge. Behav. Brain Sci. 22(5), 735\u2013808 (1999)","journal-title":"Behav. Brain Sci."},{"key":"2424_CR46","unstructured":"Greene, E.\u00a0D.: Augmenting visual feedback using sensory substitution. 
Master\u2019s thesis, University of Waterloo (2011)"},{"key":"2424_CR47","unstructured":"Reddy, G.\u00a0R., Rompapas, D.\u00a0C.: Visuotouch: Enabling haptic feedback in augmented reality through visual cues. In IEEE International Symposium on Mixed and Augmented Reality (ISMAR) (2020)"}],"container-title":["The Visual Computer"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s00371-022-02424-2.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s00371-022-02424-2\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s00371-022-02424-2.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,9,19]],"date-time":"2024-09-19T00:30:54Z","timestamp":1726705854000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s00371-022-02424-2"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,2,22]]},"references-count":47,"alternative-id":["2424"],"URL":"https:\/\/doi.org\/10.1007\/s00371-022-02424-2","relation":{},"ISSN":["0178-2789","1432-2315"],"issn-type":[{"type":"print","value":"0178-2789"},{"type":"electronic","value":"1432-2315"}],"subject":[],"published":{"date-parts":[[2022,2,22]]},"assertion":[{"value":"20 January 2022","order":1,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"22 February 2022","order":2,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declaration"}},{"value":"The authors have no conflicts of interest to declare that are relevant to the content of this 
article.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"Written informed consent was obtained from individual or guardian participants.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent to participate"}}]}}