{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,26]],"date-time":"2026-02-26T15:28:07Z","timestamp":1772119687165,"version":"3.50.1"},"reference-count":38,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2023,7,12]],"date-time":"2023-07-12T00:00:00Z","timestamp":1689120000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Robot. AI"],"abstract":"<jats:p><jats:bold>Introduction:<\/jats:bold>In Interactive Task Learning (ITL), an agent learns a new task through natural interaction with a human instructor. Behavior Trees (BTs) offer a reactive, modular, and interpretable way of encoding task descriptions but have so far seen little use in robotic ITL settings. Most existing approaches that learn a BT from human demonstrations require the user to specify each action step-by-step or do not allow adapting a learned BT without repeating the entire teaching process from scratch.<\/jats:p><jats:p><jats:bold>Method:<\/jats:bold>We propose a new framework to directly learn a BT from only a few human task demonstrations recorded as RGB-D video streams. We automatically extract continuous pre- and post-conditions for BT action nodes from visual features and use a Backchaining approach to build a reactive BT. In a user study on how non-experts provide and vary demonstrations, we identify three common failure cases of a BT learned from potentially imperfect initial human demonstrations. We offer a way to interactively resolve these failure cases by refining the existing BT through interaction with a user over a web interface. 
Specifically, failure cases or unknown states are detected automatically during the execution of a learned BT and the initial BT is adjusted or extended according to the provided user input.<\/jats:p><jats:p><jats:bold>Evaluation and results:<\/jats:bold>We evaluate our approach on a robotic trash disposal task with 20 human participants and demonstrate that our method is capable of learning reactive BTs from only a few human demonstrations and interactively resolving possible failure cases at runtime.<\/jats:p>","DOI":"10.3389\/frobt.2023.1152595","type":"journal-article","created":{"date-parts":[[2023,7,12]],"date-time":"2023-07-12T19:23:22Z","timestamp":1689189802000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":13,"title":["Interactively learning behavior trees from imperfect human demonstrations"],"prefix":"10.3389","volume":"10","author":[{"given":"Lisa","family":"Scherf","sequence":"first","affiliation":[]},{"given":"Aljoscha","family":"Schmidt","sequence":"additional","affiliation":[]},{"given":"Suman","family":"Pal","sequence":"additional","affiliation":[]},{"given":"Dorothea","family":"Koert","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2023,7,12]]},"reference":[{"key":"B1","first-page":"1268","article-title":"Learning manipulation actions from a few demonstrations","author":"Abdo","year":"2013"},{"key":"B2","first-page":"3460","article-title":"Autonomous acquisition of behavior trees for robot control","author":"Banerjee","year":"2018"},{"key":"B3","doi-asserted-by":"publisher","first-page":"372","DOI":"10.1109\/tro.2016.2633567","article-title":"How behavior trees modularize hybrid control systems and generalize sequential behavior compositions, the subsumption architecture, and decision trees","volume":"33","author":"Colledanchise","year":"2016","journal-title":"IEEE Trans. 
robotics"},{"key":"B4","doi-asserted-by":"crossref","DOI":"10.1201\/9780429489105","volume-title":"Behavior trees in robotics and ai","author":"Colledanchise","year":"2018"},{"key":"B5","doi-asserted-by":"publisher","first-page":"183","DOI":"10.1109\/tg.2018.2816806","article-title":"Learning of behavior trees for autonomous agents","volume":"11","author":"Colledanchise","year":"2018","journal-title":"IEEE Trans. Games"},{"key":"B6","first-page":"8839","article-title":"Towards blended reactive planning and acting using behavior trees","author":"Colledanchise","year":"2019"},{"key":"B7","first-page":"461","article-title":"Asking follow-up clarifications to resolve ambiguities in human-robot conversation","author":"Do\u011fan","year":"2022"},{"key":"B8","unstructured":"BehaviorTree.CPP FacontiD. 2018"},{"key":"B38","article-title":"Mood2be: Models and tools to design robotic behaviors","volume":"4","author":"Faconti","year":"2019","journal-title":"Tech. Rep"},{"key":"B9","first-page":"7791","article-title":"Learning behavior trees from demonstration","author":"French","year":"2019"},{"key":"B10","first-page":"522","article-title":"A human-aware method to plan complex cooperative and autonomous tasks using behavior trees","author":"Fusaro","year":"2021"},{"key":"B11","doi-asserted-by":"crossref","DOI":"10.1109\/RO-MAN57019.2023.10309435","article-title":"I3: Interactive iterative improvement for few-shot action segmentation","author":"Gassen","year":"2023"},{"key":"B12","doi-asserted-by":"crossref","first-page":"196","DOI":"10.1145\/3426425.3426942","article-title":"Behavior trees in action: A study of robotics applications","volume-title":"Proceedings of the 13th ACM SIGPLAN international conference on software language engineering","author":"Ghzouli","year":"2020"},{"key":"B13","article-title":"Combining context awareness and planning to learn behavior trees from 
demonstration","author":"Gustavsson","year":"2021"},{"key":"B14","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3457185","article-title":"Building the foundation of robot explanation generation using behavior trees","volume":"10","author":"Han","year":"2021","journal-title":"ACM Trans. Human-Robot Interact. (THRI)"},{"key":"B15","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1145\/3471985.3472385","article-title":"Cognitive architecture for intuitive and interactive task learning in industrial collaborative robotics","volume-title":"2021 the 5th international conference on robotics, control and automation","author":"Helenon","year":"2021"},{"key":"B16","doi-asserted-by":"crossref","DOI":"10.1109\/Humanoids53995.2022.10000088","article-title":"Interactive disambiguation for behavior tree execution","author":"Iovino","year":""},{"key":"B17","doi-asserted-by":"publisher","first-page":"104096","DOI":"10.1016\/j.robot.2022.104096","article-title":"A survey of behavior trees in robotics and ai","volume":"154","author":"Iovino","year":"","journal-title":"Robotics Aut. Syst."},{"key":"B18","first-page":"4591","article-title":"Learning behavior trees with genetic programming in unpredictable environments","author":"Iovino","year":"2021"},{"key":"B19","first-page":"514","article-title":"Guided robot skill learning: A user-study on learning probabilistic movement primitives with non-experts","author":"Knaust","year":"2021"},{"key":"B20","doi-asserted-by":"publisher","first-page":"6","DOI":"10.1109\/mis.2017.3121552","article-title":"Interactive task learning","volume":"32","author":"Laird","year":"2017","journal-title":"IEEE Intell. 
Syst."},{"key":"B21","first-page":"480","article-title":"Icub knows where you look: Exploiting social cues for interactive object detection learning","author":"Lombardi","year":"2022"},{"key":"B22","first-page":"5420","article-title":"Towards a unified behavior trees framework for robot control","author":"Marzinotto","year":"2014"},{"key":"B23","doi-asserted-by":"publisher","first-page":"39","DOI":"10.1109\/mis.2002.1024751","article-title":"A behavior language for story-based believable agents","volume":"17","author":"Mateas","year":"2002","journal-title":"IEEE Intell. Syst."},{"key":"B24","doi-asserted-by":"crossref","DOI":"10.1201\/9781315375229","volume-title":"Artificial intelligence for games","author":"Millington","year":"2018"},{"key":"B25","doi-asserted-by":"crossref","DOI":"10.1109\/CVPR.2015.7298895","article-title":"Clustering of Static-Adaptive correspondences for deformable object tracking","author":"Nebehay","year":"2015"},{"key":"B26","first-page":"564","article-title":"Costar: Instructing collaborative robots with behavior trees and vision","author":"Paxton","year":"2017"},{"key":"B27","first-page":"5966","article-title":"Semantic segmentation with active semi-supervised learning","author":"Rangnekar","year":"2023"},{"key":"B28","doi-asserted-by":"publisher","first-page":"297","DOI":"10.1146\/annurev-control-100819-063206","article-title":"Recent advances in robot learning from demonstration","volume":"3","author":"Ravichandar","year":"2020","journal-title":"Annu. Rev. control, robotics, Aut. 
Syst."},{"key":"B29","first-page":"1","article-title":"Building behavior trees from observations in real-time strategy games","author":"Robertson","year":"2015"},{"key":"B30","first-page":"6870","article-title":"Task planning with belief behavior trees","author":"Safronov","year":"2020"},{"key":"B31","doi-asserted-by":"publisher","first-page":"5","DOI":"10.1109\/tg.2017.2771831","article-title":"Trained behavior trees: Programming by demonstration to support ai game designers","volume":"11","author":"Sagredo-Olivenza","year":"2017","journal-title":"IEEE Trans. Games"},{"key":"B32","doi-asserted-by":"publisher","first-page":"23","DOI":"10.1162\/artl_a_00192","article-title":"Behavior trees for evolutionary robotics","volume":"22","author":"Scheper","year":"2016","journal-title":"Artif. life"},{"key":"B33","doi-asserted-by":"publisher","first-page":"40","DOI":"10.9781\/ijimai.2017.445","article-title":"Construction of a benchmark for the user experience questionnaire (ueq)","volume":"4","author":"Schrepp","year":"2017","journal-title":"Int. J. Interact. Multimedia Artif. 
Intell."},{"key":"B34","first-page":"979","article-title":"Graph-structured visual imitation","author":"Sieb","year":"2020"},{"key":"B35","first-page":"11511","article-title":"Combining planning and learning of behavior trees for robotic assembly","author":"Styrud","year":"2022"},{"key":"B36","doi-asserted-by":"publisher","first-page":"10643","DOI":"10.1109\/lra.2022.3194681","article-title":"Learning and executing re-useable behaviour trees from natural language instruction","volume":"7","author":"Suddrey","year":"2022","journal-title":"IEEE Robotics Automation Lett."},{"key":"B37","article-title":"Mediapipe hands: On-device real-time hand tracking","author":"Zhang","year":"2020"}],"container-title":["Frontiers in Robotics and AI"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2023.1152595\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,12,16]],"date-time":"2023-12-16T23:33:37Z","timestamp":1702769617000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2023.1152595\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,7,12]]},"references-count":38,"alternative-id":["10.3389\/frobt.2023.1152595"],"URL":"https:\/\/doi.org\/10.3389\/frobt.2023.1152595","relation":{},"ISSN":["2296-9144"],"issn-type":[{"value":"2296-9144","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,7,12]]},"article-number":"1152595"}}