{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,25]],"date-time":"2026-03-25T05:31:03Z","timestamp":1774416663888,"version":"3.50.1"},"reference-count":39,"publisher":"American Association for the Advancement of Science (AAAS)","issue":"103","license":[{"start":{"date-parts":[[2026,6,25]],"date-time":"2026-06-25T00:00:00Z","timestamp":1782345600000},"content-version":"vor","delay-in-days":365,"URL":"https:\/\/www.science.org\/content\/page\/science-licenses-journal-article-reuse"}],"content-domain":{"domain":["www.science.org"],"crossmark-restriction":true},"short-container-title":["Sci. Robot."],"published-print":{"date-parts":[[2025,6,25]]},"abstract":"<jats:p>Force-sensing capabilities are essential for robot manipulation systems. However, commonly used wrist-mounted force\/torque sensors are heavy, fragile, and expensive, and tactile sensors require adding fragile circuitry to the robot fingers while only providing force information local to the contact. Here, we present a vision-based contact force estimator that serves as a more cost-effective and easier-to-implement alternative to existing force sensors by leveraging the deformations of compliant hands upon contact. Our approach uses an estimator that visually observes a specialized compliant robot hand (available open source with easy fabrication through 3D printing) and predicts the contact force on the basis of its elastic deformation upon external forces. Because using wrist-mounted cameras to observe the gripper is common for robot manipulation systems, our method can obtain additional force information provided that the gripper is compliant. We optimized our compliant hand to minimize friction and avoid singularities in finger configurations, and we introduced memory to the estimator to combat the partial observability of the contact forces from the remaining friction and hysteresis. 
In addition, the estimator was made robust to background distractions and finger occlusions using vision foundation models to segment out the fingers. Although it is less accurate and slower than commercial force\/torque sensors, we experimentally demonstrated the accuracy and robustness of our estimator (achieving between 0.2 newton and 0.4 newton error) and its utility during a variety of manipulation tasks using the gripper in the presence of noisy backgrounds and occlusions.<\/jats:p>","DOI":"10.1126\/scirobotics.adq5046","type":"journal-article","created":{"date-parts":[[2025,6,25]],"date-time":"2025-06-25T17:59:24Z","timestamp":1750874364000},"update-policy":"https:\/\/doi.org\/10.34133\/aaas_crossmark","source":"Crossref","is-referenced-by-count":5,"title":["Forces for free: Vision-based contact force estimation with a compliant hand"],"prefix":"10.1126","volume":"10","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-4587-4305","authenticated-orcid":true,"given":"Yifan","family":"Zhu","sequence":"first","affiliation":[{"name":"Department of Mechanical Engineering and Materials Science, Yale University, New Haven, CT, USA."}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4287-6794","authenticated-orcid":true,"given":"Mei","family":"Hao","sequence":"additional","affiliation":[{"name":"Department of Mechanical Engineering and Materials Science, Yale University, New Haven, CT, USA."}]},{"ORCID":"https:\/\/orcid.org\/0009-0001-7604-3395","authenticated-orcid":true,"given":"Xupeng","family":"Zhu","sequence":"additional","affiliation":[{"name":"Khoury College of Computer Sciences, Northeastern University, Boston, MA, USA."}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6052-4170","authenticated-orcid":true,"given":"Quentin","family":"Bateux","sequence":"additional","affiliation":[{"name":"Department of Mechanical Engineering and Materials Science, Yale University, New Haven, CT, 
USA."}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3157-6016","authenticated-orcid":true,"given":"Alex","family":"Wong","sequence":"additional","affiliation":[{"name":"Department of Computer Science, Yale University, New Haven, CT, USA."}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2409-4668","authenticated-orcid":true,"given":"Aaron M.","family":"Dollar","sequence":"additional","affiliation":[{"name":"Department of Mechanical Engineering and Materials Science, Yale University, New Haven, CT, USA."}]}],"member":"221","reference":[{"key":"e_1_3_2_2_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2022.3152244"},{"key":"e_1_3_2_3_2","doi-asserted-by":"crossref","unstructured":"N. Doshi O. Taylor A. Rodriguez \u201cManipulation of unknown objects via contact configuration regulation\u201d in Proceedings of IEEE International Conference on Robotics and Automation (IEEE 2022) pp. 2693\u20132699.","DOI":"10.1109\/ICRA46639.2022.9811713"},{"key":"e_1_3_2_4_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2020.3010462"},{"key":"e_1_3_2_5_2","doi-asserted-by":"publisher","DOI":"10.1177\/0278364919880257"},{"key":"e_1_3_2_6_2","doi-asserted-by":"crossref","unstructured":"S. Kim D. K. Jha D. Romeres P. Patre A. Rodriguez \u201cSimultaneous tactile estimation and control of extrinsic contact\u201d in Proceedings of IEEE International Conference on Robotics and Automation (IEEE 2023) pp. 12563\u201312569.","DOI":"10.1109\/ICRA48891.2023.10161158"},{"key":"e_1_3_2_7_2","doi-asserted-by":"crossref","unstructured":"R. Ouyang R. Howe \u201cLow-cost fiducial-based 6-axis force-torque sensor\u201d in IEEE International Conference on Robotics and Automation (IEEE 2020) pp. 
1653\u20131659.","DOI":"10.1109\/ICRA40945.2020.9196925"},{"key":"e_1_3_2_8_2","doi-asserted-by":"publisher","DOI":"10.1109\/TMECH.2014.2300333"},{"key":"e_1_3_2_9_2","doi-asserted-by":"publisher","DOI":"10.1038\/s41598-021-97003-1"},{"key":"e_1_3_2_10_2","doi-asserted-by":"publisher","DOI":"10.3389\/frobt.2021.631371"},{"key":"e_1_3_2_11_2","doi-asserted-by":"publisher","DOI":"10.1016\/0094-114X(78)90059-9"},{"key":"e_1_3_2_12_2","doi-asserted-by":"publisher","DOI":"10.1002\/adma.201707035"},{"key":"e_1_3_2_13_2","doi-asserted-by":"publisher","DOI":"10.1109\/TMECH.2006.871090"},{"key":"e_1_3_2_14_2","doi-asserted-by":"publisher","DOI":"10.3389\/frobt.2016.00070"},{"key":"e_1_3_2_15_2","doi-asserted-by":"publisher","DOI":"10.1002\/anie.201006464"},{"key":"e_1_3_2_16_2","doi-asserted-by":"publisher","DOI":"10.1109\/TRO.2022.3156806"},{"key":"e_1_3_2_17_2","doi-asserted-by":"crossref","unstructured":"A. M. Dollar R. D. Howe \u201cThe SDM hand: A highly adaptive compliant grasper for unstructured environments\u201d in Experimental Robotics. Springer Tracts in Advanced Robotics vol. 54 O. Khatib V. Kumar G. J. Pappas Eds. (Springer 2009) pp. 3\u201311.","DOI":"10.1007\/978-3-642-00196-3_2"},{"key":"e_1_3_2_18_2","doi-asserted-by":"publisher","DOI":"10.1177\/0278364913514466"},{"key":"e_1_3_2_19_2","doi-asserted-by":"publisher","DOI":"10.1109\/TASE.2013.2240298"},{"key":"e_1_3_2_20_2","doi-asserted-by":"crossref","unstructured":"T. Z. Zhao V. Kumar S. Levine C. Finn \u201cLearning fine-grained bimanual manipulation with low-cost hardware\u201d in Proceedings of Robotics: Science and Systems (RSS Foundation 2023).","DOI":"10.15607\/RSS.2023.XIX.016"},{"key":"e_1_3_2_21_2","doi-asserted-by":"crossref","unstructured":"C. Chi Z. Xu C. Pan E. Cousineau B. Burchfiel S. Feng R. Tedrake S. 
Song \u201cUniversal manipulation interface: In-the-wild robot teaching without in-the-wild robots\u201d in Proceedings of Robotics: Science and Systems (RSS Foundation 2024).","DOI":"10.15607\/RSS.2024.XX.045"},{"key":"e_1_3_2_22_2","doi-asserted-by":"crossref","unstructured":"S. Cremer L. Mastromoro D. O. Popa \u201cOn the performance of the Baxter research robot\u201d in Proceedings of IEEE International Symposium on Assembly and Manufacturing (ISAM) (IEEE 2016) pp. 106\u2013111.","DOI":"10.1109\/ISAM.2016.7750722"},{"key":"e_1_3_2_23_2","doi-asserted-by":"crossref","unstructured":"N. Elangovan A. Dwivedi L. Gerez C.-M. Chang M. Liarokapis \u201cEmploying IMU and ArUco marker based tracking to decode the contact forces exerted by adaptive hands\u201d in Proceedings of 2019 IEEE-RAS 19th International Conference on Humanoid Robots (Humanoids) (IEEE 2019) pp. 525\u2013530.","DOI":"10.1109\/Humanoids43949.2019.9035051"},{"key":"e_1_3_2_24_2","doi-asserted-by":"crossref","unstructured":"J. A. Collins P. Grady C. C. Kemp \u201cForce\/torque sensing for soft grippers using an external camera\u201d in Proceedings of IEEE International Conference on Robotics and Automation (ICRA) (IEEE 2023) pp. 2620\u20132626.","DOI":"10.1109\/ICRA48891.2023.10161257"},{"key":"e_1_3_2_25_2","doi-asserted-by":"crossref","unstructured":"J. A. Collins C. Houff P. Grady C. C. Kemp \u201cVisual contact pressure estimation for grippers in the wild\u201d in Proceedings of IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE 2023) pp. 10947\u201310954.","DOI":"10.1109\/IROS55552.2023.10342124"},{"key":"e_1_3_2_26_2","unstructured":"H. H. Nguyen A. Baisero D. Klee D. Wang R. Platt C. Amato \u201cEquivariant reinforcement learning under partial observability\u201d in Proceedings of the 7th Conference on Robot Learning (PMLR 2023) pp. 3309\u20133320."},{"key":"e_1_3_2_27_2","unstructured":"A. Baisero C. 
Amato Unbiased asymmetric reinforcement learning under partial observability. arXiv: 2105.11674 (2021)."},{"key":"e_1_3_2_28_2","doi-asserted-by":"crossref","unstructured":"Y. Xiao S. Katt A. T. Pas S. Chen C. Amato \u201cOnline planning for target object search in clutter under partial observability\u201d in Proceedings of International Conference on Robotics and Automation (ICRA) (IEEE 2019) pp. 8241\u20138247.","DOI":"10.1109\/ICRA.2019.8793494"},{"key":"e_1_3_2_29_2","doi-asserted-by":"crossref","unstructured":"A. Kirillov E. Mintun N. Ravi H. Mao C. Rolland L. Gustafson T. Xiao S. Whitehead A. C. Berg W.-Y. Lo P. Doll\u00e1r R. Girshick Segment anything. arXiv: 2304.02643 [cs.CV] (2023).","DOI":"10.1109\/ICCV51070.2023.00371"},{"key":"e_1_3_2_30_2","doi-asserted-by":"crossref","unstructured":"A. S. Morgan Q. Bateux M. Hao A. M. Dollar \u201cTowards generalized robot assembly through compliance-enabled contact formations\u201d in Proceedings of IEEE International Conference on Robotics and Automation (IEEE 2023) pp. 8010\u20138016.","DOI":"10.1109\/ICRA48891.2023.10161073"},{"key":"e_1_3_2_31_2","doi-asserted-by":"publisher","DOI":"10.1109\/JSEN.2021.3123638"},{"key":"e_1_3_2_32_2","unstructured":"L. Kim Y. Li M. Posa D. Jayaraman \u201cIm2Contact: Vision-based contact localization without touch or force sensing\u201d in Proceedings of the 7th Conference on Robot Learning (PMLR 2023)."},{"key":"e_1_3_2_33_2","doi-asserted-by":"publisher","DOI":"10.3390\/app11094303"},{"key":"e_1_3_2_34_2","doi-asserted-by":"crossref","unstructured":"K. M. Lynch F. C. Park Modern Robotics (Cambridge Univ. Press. 2017).","DOI":"10.1017\/9781316661239"},{"key":"e_1_3_2_35_2","doi-asserted-by":"crossref","unstructured":"K. He X. Zhang S. Ren J. Sun \u201cDeep residual learning for image recognition\u201d in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (IEEE 2016) pp. 
770\u2013778.","DOI":"10.1109\/CVPR.2016.90"},{"key":"e_1_3_2_36_2","unstructured":"A. Vaswani N. Shazeer N. Parmar J. Uszkoreit L. Jones A. N. Gomez \u0141. Kaiser I. Polosukhin \u201cAttention is all you need\u201d in Proceedings of Advances in Neural Information Processing Systems (ACM 2017) pp. 6000\u20136010."},{"key":"e_1_3_2_37_2","unstructured":"I. Loshchilov F. Hutter \u201cDecoupled weight decay regularization\u201d in Proceedings of the International Conference on Learning Representations (ICLR 2019) pp. 1\u20138."},{"key":"e_1_3_2_38_2","doi-asserted-by":"crossref","unstructured":"A. Quattoni A. Torralba \u201cRecognizing indoor scenes\u201d in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE 2009) pp. 413\u2013420.","DOI":"10.1109\/CVPR.2009.5206537"},{"key":"e_1_3_2_39_2","doi-asserted-by":"publisher","DOI":"10.1109\/MRA.2015.2448951"},{"key":"e_1_3_2_40_2","doi-asserted-by":"publisher","DOI":"10.1109\/MRA.2016.2639034"}],"container-title":["Science 
Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.science.org\/doi\/pdf\/10.1126\/scirobotics.adq5046","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/www.science.org\/doi\/pdf\/10.1126\/scirobotics.adq5046","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,25]],"date-time":"2025-06-25T17:59:31Z","timestamp":1750874371000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.science.org\/doi\/10.1126\/scirobotics.adq5046"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,6,25]]},"references-count":39,"journal-issue":{"issue":"103","published-print":{"date-parts":[[2025,6,25]]}},"alternative-id":["10.1126\/scirobotics.adq5046"],"URL":"https:\/\/doi.org\/10.1126\/scirobotics.adq5046","relation":{},"ISSN":["2470-9476"],"issn-type":[{"value":"2470-9476","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,6,25]]},"assertion":[{"value":"2024-05-17","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2025-05-28","order":2,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2025-06-25","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}],"article-number":"eadq5046"}}