{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T19:13:10Z","timestamp":1776107590890,"version":"3.50.1"},"reference-count":66,"publisher":"Association for Computing Machinery (ACM)","issue":"5","funder":[{"name":"Key Research and Development Program of Ningbo City","award":["2023Z062"],"award-info":[{"award-number":["2023Z062"]}]},{"name":"National Defense Science and Engineering Graduate (NDSEG) Fellowship Program"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["Proc. ACM Hum.-Comput. Interact."],"published-print":{"date-parts":[[2025,9,30]]},"abstract":"<jats:p>As Extended Reality (XR) advances, a device has the potential to be used across contexts from immersive productivity at a desk to on-the-go, public scenarios. Existing input solutions lack the versatility to provide both high-throughput, mouse-grade input and subtle, ergonomic interaction. We introduce FlowRing, a novel ring-form device that combines microgestures with precise 2D mouse-like input on surfaces. FlowRing supports five microgestures for discreet interaction and 2D input for richer tasks, using an optical flow sensor, skin-contact microphone, and IMU at the base of the finger. In a study with 11 participants, FlowRing achieved 93.6% microgesture recognition accuracy across sessions and 85.2% across unseen users, rising to 90.1% with just four gesture set examples from a new user. A separate 2D Fitts\u2019 law study demonstrated its effectiveness for continuous input on various surfaces. 
FlowRing emerges as a versatile, user-friendly solution for the future of interactive technology.<\/jats:p>","DOI":"10.1145\/3743706","type":"journal-article","created":{"date-parts":[[2025,9,9]],"date-time":"2025-09-09T14:28:48Z","timestamp":1757428128000},"page":"1-28","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":3,"title":["FlowRing: Integrated Microgesture and Surface Interaction Ring for Versatile XR Input"],"prefix":"10.1145","volume":"9","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-2123-6392","authenticated-orcid":false,"given":"Ishan","family":"Chatterjee","sequence":"first","affiliation":[{"name":"Paul G. Allen School of Computer Science & Engineering","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]},{"ORCID":"https:\/\/orcid.org\/0009-0008-9599-1586","authenticated-orcid":false,"given":"Jiexin","family":"Ding","sequence":"additional","affiliation":[{"name":"Global Innovation Exchange","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3022-071X","authenticated-orcid":false,"given":"Anandghan","family":"Waghmare","sequence":"additional","affiliation":[{"name":"Paul G. Allen School for Computer Science & Engineering","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2795-7334","authenticated-orcid":false,"given":"Joseph","family":"Breda","sequence":"additional","affiliation":[{"name":"Paul G. Allen School for Computer Science & Engineering","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]},{"ORCID":"https:\/\/orcid.org\/0009-0000-0861-6000","authenticated-orcid":false,"given":"Yuquan","family":"Deng","sequence":"additional","affiliation":[{"name":"Paul G. 
Allen School for Computer Science & Engineering","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4331-1541","authenticated-orcid":false,"given":"Bo","family":"Liu","sequence":"additional","affiliation":[{"name":"Global Innovation Exchange","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4249-8893","authenticated-orcid":false,"given":"Yuntao","family":"Wang","sequence":"additional","affiliation":[{"name":"Department of Computer Science and Technology","place":["Beijing, China"]},{"name":"Tsinghua University","place":["Beijing, China"]}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-6300-4389","authenticated-orcid":false,"given":"Shwetak","family":"Patel","sequence":"additional","affiliation":[{"name":"Paul G. Allen School for Computer Science & Engineering","place":["Seattle, USA"]},{"name":"University of Washington","place":["Seattle, USA"]}]}],"member":"320","published-online":{"date-parts":[[2025,9,9]]},"reference":[{"key":"e_1_3_3_2_2","volume-title":"Enabling mobile microinteractions","author":"Ashbrook Daniel\u00a0L","year":"2010","unstructured":"Daniel\u00a0L Ashbrook. 2010. Enabling mobile microinteractions. Georgia Institute of Technology."},{"key":"e_1_3_3_3_2","unstructured":"Tomislav Bezmalinovic. 2024. Meta CTO confirms work on lightweight mixed reality glasses. https:\/\/mixed-news.com\/en\/meta-cto-puffin\/"},{"key":"e_1_3_3_4_2","first-page":"449","volume-title":"Human-computer interaction-INTERACT","author":"Buxton William","year":"1990","unstructured":"William Buxton et\u00a0al. 1990. A three-state model of graphical input. In Human-computer interaction-INTERACT , Vol.\u00a090. Citeseer, 449\u2013456."},{"key":"e_1_3_3_5_2","doi-asserted-by":"publisher","DOI":"10.1145\/108844.108874"},{"key":"e_1_3_3_6_2","unstructured":"Ashley Carman. 2019. 
North Focals glasses review: a 600 dollar smartwatch for your face. https:\/\/www.theverge.com\/2019\/2\/14\/18223593\/focals-smart-glasses-north-review-specs-features-price"},{"key":"e_1_3_3_7_2","doi-asserted-by":"publisher","DOI":"10.1145\/3544548.3580693"},{"key":"e_1_3_3_8_2","doi-asserted-by":"publisher","DOI":"10.1145\/2858036.2858589"},{"key":"e_1_3_3_9_2","doi-asserted-by":"publisher","DOI":"10.1145\/2807442.2807450"},{"key":"e_1_3_3_10_2","doi-asserted-by":"publisher","unstructured":"Taizhou Chen Tianpei Li Xingyu Yang and Kening Zhu. 2023. EFRing: Enabling Thumb-to-Index-Finger Microgesture Interaction through Electric Field Sensing Using Single Smart Ring. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 6 4 (jan 2023) 31\u00a0pages. 10.1145\/3569478","DOI":"10.1145\/3569478"},{"key":"e_1_3_3_11_2","doi-asserted-by":"publisher","DOI":"10.1145\/502348.502389"},{"key":"e_1_3_3_12_2","doi-asserted-by":"publisher","DOI":"10.1145\/3544548.3581037"},{"key":"e_1_3_3_13_2","doi-asserted-by":"publisher","DOI":"10.1145\/3411764.3445349"},{"key":"e_1_3_3_14_2","doi-asserted-by":"publisher","DOI":"10.1145\/3536221.3556589"},{"key":"e_1_3_3_15_2","doi-asserted-by":"publisher","DOI":"10.1145\/3379337.3415901"},{"key":"e_1_3_3_16_2","doi-asserted-by":"publisher","DOI":"10.1145\/3126594.3126615"},{"key":"e_1_3_3_17_2","doi-asserted-by":"publisher","DOI":"10.1145\/3332165.3347947"},{"key":"e_1_3_3_18_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-319-57987-0_41"},{"key":"e_1_3_3_19_2","volume-title":"Meta\u2019s \u2019Orion\u2019 Prototype AR Glasses Have 70 Degree FOV And A Wireless Compute Puck","author":"Heaney David","year":"2024","unstructured":"David Heaney. 2024. Meta\u2019s \u2019Orion\u2019 Prototype AR Glasses Have 70 Degree FOV And A Wireless Compute Puck. 
https:\/\/www.uploadvr.com\/meta-connect-2024-orion-prototype-ar-glasses\/"},{"key":"e_1_3_3_20_2","doi-asserted-by":"publisher","DOI":"10.1145\/1294211.1294258"},{"key":"e_1_3_3_21_2","doi-asserted-by":"publisher","DOI":"10.1145\/2858036.2858483"},{"key":"e_1_3_3_22_2","unstructured":"Apple Inc.[n. d.]. Apple Vision Pro. https:\/\/www.apple.com\/apple-vision-pro\/"},{"key":"e_1_3_3_23_2","doi-asserted-by":"publisher","DOI":"10.1145\/3544548.3580991"},{"key":"e_1_3_3_24_2","doi-asserted-by":"publisher","DOI":"10.1145\/3491101.3516396"},{"key":"e_1_3_3_25_2","unstructured":"Arjun Kharpal. 2024. Qualcomm says it\u2019s working on mixed reality smart glasses with Samsung and Google. https:\/\/www.cnbc.com\/2024\/09\/05\/qualcomm-working-on-mixed-reality-smart-glasses-with-google-samsung.html"},{"key":"e_1_3_3_26_2","doi-asserted-by":"publisher","DOI":"10.1145\/2642918.2647376"},{"key":"e_1_3_3_27_2","doi-asserted-by":"publisher","DOI":"10.1145\/3411764.3445094"},{"key":"e_1_3_3_28_2","doi-asserted-by":"publisher","DOI":"10.1145\/2380116.2380139"},{"key":"e_1_3_3_29_2","doi-asserted-by":"publisher","DOI":"10.1109\/VRW58643.2023.00147"},{"key":"e_1_3_3_30_2","doi-asserted-by":"publisher","DOI":"10.1145\/3613904.3642702"},{"key":"e_1_3_3_31_2","doi-asserted-by":"publisher","DOI":"10.1145\/3338286.3340142"},{"key":"e_1_3_3_32_2","doi-asserted-by":"crossref","unstructured":"Chen Liang Chi Hsia Chun Yu Yukang Yan Yuntao Wang and Yuanchun Shi. 2023. DRG-Keyboard: Enabling Subtle Gesture Typing on the Fingertip with Dual IMU Rings. Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies 6 4 (2023) 1\u201330.","DOI":"10.1145\/3569463"},{"key":"e_1_3_3_33_2","doi-asserted-by":"crossref","unstructured":"Chen Liang Chun Yu Yue Qin Yuntao Wang and Yuanchun Shi. 2021. DualRing: Enabling subtle and expressive hand interaction with dual IMU rings. 
Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies 5 3 (2021) 1\u201327.","DOI":"10.1145\/3478114"},{"key":"e_1_3_3_34_2","volume-title":"Proc. MobileHCI","author":"Loclair Christian","year":"2010","unstructured":"Christian Loclair, Sean Gustafson, and Patrick Baudisch. 2010. PinchWatch: a wearable device for one-handed microinteractions. In Proc. MobileHCI , Vol.\u00a010. Citeseer."},{"key":"e_1_3_3_35_2","unstructured":"Wayne Ma and Qianer Liu. 2024. https:\/\/www.theinformation.com\/articles\/apple-suspends-work-on-next-high-end-headset-focused-on-releasing-cheaper-model-in-late-2025?rc=o8rewz"},{"key":"e_1_3_3_36_2","unstructured":"Meta. [n. d.]. Meta Quest Pro. https:\/\/www.meta.com\/quest\/quest-pro\/"},{"key":"e_1_3_3_37_2","doi-asserted-by":"publisher","DOI":"10.1145\/1476589.1476628"},{"key":"e_1_3_3_38_2","doi-asserted-by":"publisher","DOI":"10.1109\/VR.2015.7223451"},{"key":"e_1_3_3_39_2","doi-asserted-by":"publisher","DOI":"10.1145\/3334480.3383098"},{"key":"e_1_3_3_40_2","doi-asserted-by":"publisher","DOI":"10.1145\/3625008.3625046"},{"key":"e_1_3_3_41_2","doi-asserted-by":"publisher","DOI":"10.1145\/3242587.3242664"},{"key":"e_1_3_3_42_2","unstructured":"Rokid. [n. d.]. Rokid AR JOY. https:\/\/global.rokid.com\/pages\/rokid-ar-joy"},{"key":"e_1_3_3_43_2","doi-asserted-by":"publisher","DOI":"10.1145\/3613904.3642225"},{"key":"e_1_3_3_44_2","doi-asserted-by":"crossref","unstructured":"Yilei Shi Haimo Zhang Kaixing Zhao Jiashuo Cao Mengmeng Sun and Suranga Nanayakkara. 2020. Ready steady touch! sensing physical contact with a finger-mounted IMU. Proceedings of the ACM on Interactive Mobile Wearable and Ubiquitous Technologies 4 2 (2020) 1\u201325.","DOI":"10.1145\/3397309"},{"key":"e_1_3_3_45_2","doi-asserted-by":"publisher","unstructured":"Roy Shilkrot Jochen Huber J\u00fcrgen Steimle Suranga Nanayakkara and Pattie Maes. 2015. Digital Digits: A Comprehensive Survey of Finger Augmentation Devices. ACM Comput. Surv. 
48 2 (nov 2015) 29\u00a0pages. 10.1145\/2828993","DOI":"10.1145\/2828993"},{"key":"e_1_3_3_46_2","unstructured":"Snapchat. [n. d.]. Spectacles by Snap Inc. - The Next Generation of Spectacles. https:\/\/www.spectacles.com\/"},{"key":"e_1_3_3_47_2","doi-asserted-by":"publisher","unstructured":"R.\u00a0William Soukoreff and I.\u00a0Scott MacKenzie. 2004. Towards a Standard for Pointing Device Evaluation Perspectives on 27 Years of Fitts\u2019 Law Research in HCI. Int. J. Hum.-Comput. Stud. 61 6 (dec 2004) 751\u2013789. 10.1016\/j.ijhcs.2004.09.001","DOI":"10.1016\/j.ijhcs.2004.09.001"},{"key":"e_1_3_3_48_2","first-page":"506","volume-title":"Proceedings of the IFIP Congress","author":"Sutherland Ivan\u00a0E","year":"1965","unstructured":"Ivan\u00a0E Sutherland et\u00a0al. 1965. The ultimate display. In Proceedings of the IFIP Congress , Vol.\u00a02. New York, 506\u2013508."},{"key":"e_1_3_3_49_2","doi-asserted-by":"publisher","DOI":"10.1145\/2957265.2961860"},{"key":"e_1_3_3_50_2","doi-asserted-by":"publisher","DOI":"10.1145\/2957265.2961859"},{"key":"e_1_3_3_51_2","doi-asserted-by":"publisher","DOI":"10.1145\/3472749.3474780"},{"key":"e_1_3_3_52_2","doi-asserted-by":"publisher","DOI":"10.1145\/3544548.3581422"},{"key":"e_1_3_3_53_2","doi-asserted-by":"publisher","DOI":"10.1145\/3607822.3614538"},{"key":"e_1_3_3_54_2","doi-asserted-by":"crossref","unstructured":"Mark Weiser. 1999. The computer for the 21st century. ACM SIGMOBILE mobile computing and communications review 3 3 (1999) 3\u201311.","DOI":"10.1145\/329124.329126"},{"key":"e_1_3_3_55_2","doi-asserted-by":"publisher","DOI":"10.1145\/3313831.3376249"},{"key":"e_1_3_3_56_2","doi-asserted-by":"publisher","DOI":"10.1145\/1978942.1979181"},{"key":"e_1_3_3_57_2","doi-asserted-by":"crossref","unstructured":"Katrin Wolf. 2016. Microgestures\u2014enabling gesture input with busy hands. 
Peripheral Interaction: Challenges and Opportunities for HCI in the Periphery of Attention (2016) 95\u2013116.","DOI":"10.1007\/978-3-319-29523-7_5"},{"key":"e_1_3_3_58_2","doi-asserted-by":"publisher","unstructured":"Robert Xiao Julia Schwarz Nick Throm Andrew\u00a0D. Wilson and Hrvoje Benko. 2018. MRTouch: Adding Touch Input to Head-Mounted Mixed Reality. IEEE Transactions on Visualization and Computer Graphics 24 4 (2018) 1653\u20131660. 10.1109\/TVCG.2018.2794222","DOI":"10.1109\/TVCG.2018.2794222"},{"key":"e_1_3_3_59_2","unstructured":"XReal. [n. d.]. XReal Air. https:\/\/www.xreal.com\/air\/"},{"key":"e_1_3_3_60_2","unstructured":"XREAL Inc. [n. d.]. XREAL Air 2 Pro. https:\/\/us.shop.xreal.com\/products\/xreal-air-2-pro."},{"key":"e_1_3_3_61_2","doi-asserted-by":"publisher","DOI":"10.1145\/3544548.3581264"},{"key":"e_1_3_3_62_2","doi-asserted-by":"publisher","DOI":"10.1145\/3332165.3347865"},{"key":"e_1_3_3_63_2","doi-asserted-by":"publisher","DOI":"10.1145\/2380116.2380137"},{"key":"e_1_3_3_64_2","doi-asserted-by":"publisher","unstructured":"Cheng Zhang Anandghan Waghmare Pranav Kundra Yiming Pu Scott Gilliland Thomas Ploetz Thad\u00a0E. Starner Omer\u00a0T. Inan and Gregory\u00a0D. Abowd. 2017. FingerSound: Recognizing Unistroke Thumb Gestures Using a Ring. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1 3 (sep 2017) 19\u00a0pages. 10.1145\/3130985","DOI":"10.1145\/3130985"},{"key":"e_1_3_3_65_2","doi-asserted-by":"publisher","DOI":"10.1145\/3173574.3174011"},{"key":"e_1_3_3_66_2","doi-asserted-by":"publisher","DOI":"10.1145\/3025453.3025842"},{"key":"e_1_3_3_67_2","first-page":"1","volume-title":"Proceedings of the 2018 chi conference on human factors in computing systems","author":"Zhang Yang","year":"2018","unstructured":"Yang Zhang, Chouchang Yang, Scott\u00a0E Hudson, Chris Harrison, and Alanson Sample. 2018. Wall++ room-scale interactive and context-aware sensing. 
In Proceedings of the 2018 chi conference on human factors in computing systems. 1\u201315."}],"container-title":["Proceedings of the ACM on Human-Computer Interaction"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3743706","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,9,10]],"date-time":"2025-09-10T16:09:32Z","timestamp":1757520572000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3743706"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,9,9]]},"references-count":66,"journal-issue":{"issue":"5","published-print":{"date-parts":[[2025,9,30]]}},"alternative-id":["10.1145\/3743706"],"URL":"https:\/\/doi.org\/10.1145\/3743706","relation":{},"ISSN":["2573-0142"],"issn-type":[{"value":"2573-0142","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,9,9]]},"assertion":[{"value":"2025-09-09","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}