{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T16:47:53Z","timestamp":1776098873252,"version":"3.50.1"},"reference-count":42,"publisher":"ASME International","issue":"1","license":[{"start":{"date-parts":[[2020,7,24]],"date-time":"2020-07-24T00:00:00Z","timestamp":1595548800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.asme.org\/publications-submissions\/publishing-information\/legal-policies"}],"content-domain":{"domain":["asmedigitalcollection.asme.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2021,2,1]]},"abstract":"<jats:title>Abstract<\/jats:title>\n               <jats:p>Drawing curves is a fundamental task in mid-air interactive applications such as 3D sketching, geometric modeling, handwriting recognition, and authentication. Existing research in mid-air drawing is solely focused on determining what the user drew, assuming that the intended curve is segmented from the continuous user-generated trajectory. In this work, our aim is to address the complementary problem: to determine when the user actually intended to draw without the use of any prescribed gestures or hand-held controllers (e.g., Wii remote, HTC Vive). In our previously published work, we demonstrated that in mid-air drawing tasks, not only is it possible to statistically learn drawing intent from hand motion, but it is also perceived to be more natural by users. Our idea was to simply classify each instance of hand trajectories as either a stroke or a hover. Our current work investigates new representations of the users\u2019 motion beyond a single point (such as a tracked palm) to richer multi-point trajectories obtained with other skeletal joints such as the wrist and elbow. We trained several binary classifiers on five such trajectory representations obtained from 3D drawing data from 25 users using a hand tracking device. 
We compare these representations and the corresponding classifiers for predicting user intent for mid-air drawing. Our extended approach resulted in improved prediction accuracy (mean: 80.17%, min: 79.92%, max: 91.30%) with respect to our earlier work (mean: 76.75%, min: 74.23%, max: 84.01%).<\/jats:p>","DOI":"10.1115\/1.4047558","type":"journal-article","created":{"date-parts":[[2020,6,19]],"date-time":"2020-06-19T14:49:34Z","timestamp":1592578174000},"update-policy":"https:\/\/doi.org\/10.1115\/crossmarkpolicy-asme","source":"Crossref","is-referenced-by-count":4,"title":["Stroke-Hover Intent Recognition for Mid-Air Curve Drawing Using Multi-Point Skeletal Trajectories"],"prefix":"10.1115","volume":"21","author":[{"given":"Umema H.","family":"Bohari","sequence":"first","affiliation":[{"name":"Schlumberger Corporation, 14910 Airline Road, Rosharon, TX 77583"}]},{"given":"Ryan","family":"Alli","sequence":"additional","affiliation":[{"name":"J. Mike Walker \u201866 Department of Mechanical Engineering, Texas A&M University, College Station, TX 77843"}]},{"given":"Alejandra","family":"Garcia","sequence":"additional","affiliation":[{"name":"J. Mike Walker \u201866 Department of Mechanical Engineering, Texas A&M University, College Station, TX 77843"}]},{"given":"Vinayak R.","family":"Krishnamurthy","sequence":"additional","affiliation":[{"name":"J. 
Mike Walker \u201866 Department of Mechanical Engineering, Texas A&M University, College Station, TX 77843"}]}],"member":"33","published-online":{"date-parts":[[2020,7,24]]},"reference":[{"key":"2021011101554230700_CIT0001","first-page":"1","article-title":"Sketching in the Air: A Vision-Based System for 3D Object Design","author":"Chen","year":"2008"},{"key":"2021011101554230700_CIT0002","first-page":"5643","article-title":"Experimental Evaluation of Sketching on Surfaces in VR","author":"Arora","year":"2017"},{"key":"2021011101554230700_CIT0003","first-page":"339","article-title":"Intelligent Sketching Interfaces for Richer Mid-Air Drawing Interactions","author":"Taele","year":"2014"},{"key":"2021011101554230700_CIT0004","first-page":"311","article-title":"Mid-Air Authentication Gestures: An Exploration of Authentication Based on Palm and Finger Motions","author":"Aslan","year":"2014"},{"key":"2021011101554230700_CIT0005","first-page":"217","article-title":"Vision-Based Handwriting Recognition for Unrestricted Text Input in Mid-Air","author":"Schick","year":"2012"},{"key":"2021011101554230700_CIT0006","first-page":"1179","article-title":"Writing and Sketching in the Air, Recognizing and Controlling on the Fly","author":"Vikram","year":"2013"},{"key":"2021011101554230700_CIT0007","first-page":"539","article-title":"Segmentation and Recognition of Text Written in 3d Using Leap Motion Interface","author":"Agarwal","year":"2015"},{"key":"2021011101554230700_CIT0008","first-page":"5850","article-title":"Jackknife: A Reliable Recognizer With Few Samples and Many Modalities","author":"Taranta II","year":"2017"},{"key":"2021011101554230700_CIT0009","first-page":"177","article-title":"To Draw Or Not to Draw: Recognizing Stroke-Hover Intent in Non-Instrumented Gesture-Free Mid-Air 
Sketching","author":"Bohari","year":"2018"},{"issue":"8","key":"2021011101554230700_CIT0010","doi-asserted-by":"crossref","first-page":"2516","DOI":"10.1111\/j.1467-8659.2012.03224.x","article-title":"Smart Scribbles for Sketch Segmentation","volume":"31","author":"Noris","year":"2012","journal-title":"Comput. Graphics Forum"},{"key":"2021011101554230700_CIT0011","first-page":"1","article-title":"Paleosketch: Accurate Primitive Sketch Recognition and Beautification","author":"Paulson","year":"2008"},{"issue":"5","key":"2021011101554230700_CIT0012","doi-asserted-by":"crossref","first-page":"499","DOI":"10.1016\/j.cag.2010.07.001","article-title":"The Effect of Task on Classification Accuracy: Using Gesture Recognition Techniques in Free-Sketch Recognition","volume":"34","author":"Field","year":"2010","journal-title":"Comput. Graph."},{"key":"2021011101554230700_CIT0013","first-page":"151","article-title":"Ilovesketch: As-Natural-as-Possible Sketching System for Creating 3d Curve Models","author":"Bae","year":"2008"},{"key":"2021011101554230700_CIT0014","first-page":"245","article-title":"A Lightweight Multistroke Recognizer for User Interface Prototypes","author":"Anthony","year":"2010"},{"key":"2021011101554230700_CIT0015","first-page":"117","article-title":"$ N-Protractor: A Fast and Accurate Multistroke Recognizer","author":"Anthony","year":"2012"},{"key":"2021011101554230700_CIT0016","first-page":"159","article-title":"Gestures Without Libraries, Toolkits Or Training: A $1 Recognizer for User Interface Prototypes","author":"Wobbrock","year":"2007"},{"key":"2021011101554230700_CIT0017","first-page":"273","article-title":"Gestures As Point Clouds: A $p Recognizer for User Interface Prototypes","author":"Vatavu","year":"2012"},{"key":"2021011101554230700_CIT0018","first-page":"3105","article-title":"Dynamic Hand Pose Recognition Using Depth 
Data","author":"Suryanarayan","year":"2010"},{"issue":"6","key":"2021011101554230700_CIT0019","doi-asserted-by":"crossref","first-page":"1225","DOI":"10.1016\/j.patcog.2010.11.006","article-title":"Sketch Recognition by Fusion of Temporal and Image-Based Features","volume":"44","author":"Arandjelovi\u0107","year":"2011","journal-title":"Pattern Recognit."},{"issue":"12","key":"2021011101554230700_CIT0020","doi-asserted-by":"crossref","first-page":"3303","DOI":"10.1016\/j.patcog.2009.01.030","article-title":"Iconic and Multi-Stroke Gesture Recognition","volume":"42","author":"Willems","year":"2009","journal-title":"Pattern Recognit."},{"key":"2021011101554230700_CIT0021","doi-asserted-by":"crossref","DOI":"10.1115\/DETC2018-85867","article-title":"Virtual Reality Applications: Guidelines to Design Natural User Interface","author":"Regazzoni","year":"2018"},{"key":"2021011101554230700_CIT0022","first-page":"63","article-title":"Dynamics Based 3D Skeletal Hand Tracking","author":"Melax","year":"2013"},{"key":"2021011101554230700_CIT0023","first-page":"3633","article-title":"Accurate, Robust, and Flexible Real-Time Hand Tracking","author":"Sharp","year":"2015"},{"issue":"5","key":"2021011101554230700_CIT0024","doi-asserted-by":"crossref","first-page":"101","DOI":"10.1111\/cgf.12700","article-title":"Robust Articulated-ICP for Real-Time Hand Tracking","volume":"34","author":"Tagliasacchi","year":"2015","journal-title":"Computer Graphics Forum"},{"key":"2021011101554230700_CIT0025","first-page":"1","article-title":"Depth Camera Based Hand Gesture Recognition and Its Applications in Human-Computer-Interaction","author":"Ren","year":"2011"},{"key":"2021011101554230700_CIT0026","first-page":"223","article-title":"A Survey of Large High-Resolution Display Technologies, Techniques, and Applications","author":"Ni","year":"2006"},{"key":"2021011101554230700_CIT0027","first-page":"69","article-title":"Large Display Research 
Overview","author":"Czerwinski","year":"2006"},{"key":"2021011101554230700_CIT0028","first-page":"501","article-title":"Interaction Techniques for Wall-Sized Screens","author":"Lischke","year":"2015"},{"key":"2021011101554230700_CIT0029","first-page":"121","article-title":"Creating Principal 3d Curves With Digital Tape Drawing","author":"Grossman","year":"2002"},{"key":"2021011101554230700_CIT0030","first-page":"17","article-title":"Interaction Techniques for 3d Modeling on Large Displays","author":"Grossman","year":"2001"},{"key":"2021011101554230700_CIT0031","first-page":"49","article-title":"Interaction With 3D Models on Large Displays Using 3d Input Techniques","author":"Laundry","year":"2010"},{"key":"2021011101554230700_CIT0032","first-page":"1","article-title":"An Augmented Reality-Based Training System With a Natural User Interface for Manual Milling Operations","author":"Yang","year":"2019","journal-title":"Virtual Reality"},{"issue":"C","key":"2021011101554230700_CIT0033","doi-asserted-by":"crossref","first-page":"101","DOI":"10.1016\/j.patrec.2013.10.010","article-title":"Combining Multiple Depth-Based Descriptors for Hand Gesture Recognition","volume":"50","author":"Dominio","year":"2014","journal-title":"Pattern Recogn. Lett."},{"issue":"Part B","key":"2021011101554230700_CIT0034","doi-asserted-by":"crossref","first-page":"138","DOI":"10.1016\/j.pmcj.2012.07.003","article-title":"Activity Recognition on Streaming Sensor Data","volume":"10","author":"Krishnan","year":"2014","journal-title":"Pervasive Mob. 
Comput."},{"key":"2021011101554230700_CIT0035","first-page":"811","article-title":"Data Miming: Inferring Spatial Object Descriptions From Human Gesture","author":"Holz","year":"2011"},{"key":"2021011101554230700_CIT0036","doi-asserted-by":"crossref","first-page":"11","DOI":"10.1016\/j.cad.2015.06.006","article-title":"A Gesture-Free Geometric Approach for Mid-Air Expression of Design Intent in 3d Virtual Pottery","volume":"69","author":"Vinayak","year":"2015","journal-title":"Comput.-Aided Des."},{"issue":"5","key":"2021011101554230700_CIT0037","doi-asserted-by":"crossref","first-page":"6380","DOI":"10.3390\/s130506380","article-title":"Analysis of the Accuracy and Robustness of the Leap Motion Controller","volume":"13","author":"Weichert","year":"2013","journal-title":"Sensors"},{"issue":"1","key":"2021011101554230700_CIT0038","first-page":"32","article-title":"Validity and Reliability of Leap Motion Controller for Assessing Grasping and Releasing Finger Movements","volume":"17","author":"Okazaki","year":"2017","journal-title":"J. Ergon. 
Technol."},{"issue":"3","key":"2021011101554230700_CIT0039","doi-asserted-by":"crossref","first-page":"e0193639","DOI":"10.1371\/journal.pone.0193639","article-title":"Evaluation of the Leap Motion Controller During the Performance of Visually-Guided Upper Limb Movements","volume":"13","author":"Niechwiej-Szwedo","year":"2018","journal-title":"PLoS One"},{"key":"2021011101554230700_CIT0040","first-page":"1663","article-title":"Free-Hand Interaction With Leap Motion Controller for Stroke Rehabilitation","author":"Khademi","year":"2014"},{"key":"2021011101554230700_CIT0041","first-page":"142","article-title":"Implementations of the Leap Motion Device in Sound Synthesis and Interactive Live Performance","author":"Hantrakul","year":"2014"},{"key":"2021011101554230700_CIT0042","first-page":"1","article-title":"Air Painting With Corel Painter Freestyle and the Leap Motion Controller: a Revolutionary New Way to Paint!","author":"Sutton","year":"2013"}],"container-title":["Journal of Computing and Information Science in 
Engineering"],"original-title":[],"language":"en","link":[{"URL":"http:\/\/asmedigitalcollection.asme.org\/computingengineering\/article-pdf\/doi\/10.1115\/1.4047558\/6613543\/jcise_21_1_011006.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"http:\/\/asmedigitalcollection.asme.org\/computingengineering\/article-pdf\/doi\/10.1115\/1.4047558\/6613543\/jcise_21_1_011006.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2021,1,14]],"date-time":"2021-01-14T04:13:21Z","timestamp":1610597601000},"score":1,"resource":{"primary":{"URL":"https:\/\/asmedigitalcollection.asme.org\/computingengineering\/article\/doi\/10.1115\/1.4047558\/1084666\/Stroke-Hover-Intent-Recognition-for-Mid-Air-Curve"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,7,24]]},"references-count":42,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2021,2,1]]}},"URL":"https:\/\/doi.org\/10.1115\/1.4047558","relation":{},"ISSN":["1530-9827","1944-7078"],"issn-type":[{"value":"1530-9827","type":"print"},{"value":"1944-7078","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,7,24]]},"article-number":"011006"}}