{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,30]],"date-time":"2026-04-30T17:00:17Z","timestamp":1777568417831,"version":"3.51.4"},"reference-count":17,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2022,9,30]],"date-time":"2022-09-30T00:00:00Z","timestamp":1664496000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,9,30]],"date-time":"2022-09-30T00:00:00Z","timestamp":1664496000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100008530","name":"EC | European Regional Development Fund","doi-asserted-by":"publisher","award":["POCI-01-0247-FEDER-39479"],"award-info":[{"award-number":["POCI-01-0247-FEDER-39479"]}],"id":[{"id":"10.13039\/501100008530","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100003381","name":"Minist\u00e9rio da Educa\u00e7\u00e3o e Ci\u00eancia","doi-asserted-by":"publisher","award":["SFRH\/BD\/151382\/2021"],"award-info":[{"award-number":["SFRH\/BD\/151382\/2021"]}],"id":[{"id":"10.13039\/501100003381","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Sci Data"],"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Wearable technology is expanding for motion monitoring. However, open challenges still limit its widespread use, especially in low-cost systems. Most solutions are either expensive commercial products or lower-performance ad-hoc systems. Moreover, few datasets are available for the development of complete and general solutions. This work presents two datasets with low-cost and high-end Magnetic, Angular Rate, and Gravity (MARG) sensor data. They provide data for analysis of the complete inertial pose pipeline, from raw data, sensor-to-segment calibration, multi-sensor fusion, and skeleton kinematics to complete human pose. They contain data from 21 and 10 participants, respectively, performing 6 types of sequences with high variability and complex dynamics over an almost complete range of motion. The datasets amount to 3.5\u2009M samples, synchronized with a ground-truth inertial motion capture system. A method to evaluate data quality is also presented. This database may contribute to the development of novel algorithms for each of the pipeline\u2019s processing steps, with applications in inertial pose estimation, human movement forecasting, and motion assessment in industrial or rehabilitation settings. 
All data and code to process and analyze the complete pipeline are freely available.<\/jats:p>","DOI":"10.1038\/s41597-022-01690-y","type":"journal-article","created":{"date-parts":[[2022,9,30]],"date-time":"2022-09-30T08:04:20Z","timestamp":1664525060000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":15,"title":["From raw measurements to human pose - a dataset with low-cost and high-end inertial-magnetic sensor data"],"prefix":"10.1038","volume":"9","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-3501-9217","authenticated-orcid":false,"given":"Manuel","family":"Palermo","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8097-5507","authenticated-orcid":false,"given":"Sara M.","family":"Cerqueira","sequence":"additional","affiliation":[]},{"given":"Jo\u00e3o","family":"Andr\u00e9","sequence":"additional","affiliation":[]},{"given":"Ant\u00f3nio","family":"Pereira","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-0023-7203","authenticated-orcid":false,"given":"Cristina P.","family":"Santos","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,9,30]]},"reference":[{"key":"1690_CR1","doi-asserted-by":"crossref","unstructured":"Camomilla, V., Bergamini, E., Fantozzi, S. & Vannozzi, G. Trends supporting the in-field use of wearable inertial sensors for sport performance evaluation: A systematic review. Sensors (2018).","DOI":"10.3390\/s18030873"},{"key":"1690_CR2","doi-asserted-by":"crossref","unstructured":"Lopez-Nava, I. H. & Munoz-Melendez, A. Wearable inertial sensors for human motion analysis: A review. IEEE Sensors Journal (2016).","DOI":"10.1109\/JSEN.2016.2609392"},{"key":"1690_CR3","doi-asserted-by":"crossref","unstructured":"Huang, Y. et al. Deep inertial poser: Learning to reconstruct human pose from sparse inertial measurements in real time. ACM Transactions on Graphics, (Proc. 
SIGGRAPH Asia) (2018).","DOI":"10.1145\/3272127.3275108"},{"key":"1690_CR4","doi-asserted-by":"crossref","unstructured":"Trumble, M., Gilbert, A., Malleson, C., Hilton, A. & Collomosse, J. Total capture: 3d human pose estimation fusing video and inertial sensors. In 2017 British Machine Vision Conference (BMVC) (2017).","DOI":"10.5244\/C.31.14"},{"key":"1690_CR5","doi-asserted-by":"publisher","DOI":"10.1038\/s41597-020-0563-y","author":"Y Luo","year":"2020","unstructured":"Luo, Y. et al. A database of human gait performance on irregular and uneven surfaces collected by wearable sensors. Scientific Data https:\/\/doi.org\/10.1038\/s41597-020-0563-y (2020)."},{"key":"1690_CR6","unstructured":"Roetenberg, D., Luinge, H. & Slycke, P. Xsens mvn: Full 6dof human motion tracking using miniature inertial sensors. Xsens Motion Technologies BV, Tech. Rep (2009)."},{"key":"1690_CR7","doi-asserted-by":"publisher","unstructured":"Choe, N., Zhao, H., Qiu, S. & So, Y. A sensor-to-segment calibration method for motion capture system based on low cost mimu. Measurement https:\/\/doi.org\/10.1016\/j.measurement.2018.07.078 (2019).","DOI":"10.1016\/j.measurement.2018.07.078"},{"key":"1690_CR8","doi-asserted-by":"crossref","unstructured":"Mahmood, N., Ghorbani, N., Troje, N. F., Pons-Moll, G. & Black, M. J. AMASS: Archive of motion capture as surface shapes. In International Conference on Computer Vision (2019).","DOI":"10.1109\/ICCV.2019.00554"},{"key":"1690_CR9","doi-asserted-by":"publisher","unstructured":"Resende, A. et al. Ergowear: an ambulatory, non-intrusive, and interoperable system towards a human-aware human-robot collaborative framework. 
In 2021 IEEE International Conference on Autonomous Robot Systems and Competitions (ICARSC), https:\/\/doi.org\/10.1109\/ICARSC52212.2021.9429796 (2021).","DOI":"10.1109\/ICARSC52212.2021.9429796"},{"key":"1690_CR10","doi-asserted-by":"publisher","DOI":"10.13140\/RG.2.2.23576.49929","author":"M Paulich","year":"2018","unstructured":"Paulich, M., Schepers, M., Rudigkeit, N. & Bellusci, G. Xsens mtw awinda: Miniature wireless inertial-magnetic motion tracker for highly accurate 3d kinematic applications https:\/\/doi.org\/10.13140\/RG.2.2.23576.49929 (2018).","journal-title":"Xsens mtw awinda: Miniature wireless inertial-magnetic motion tracker for highly accurate 3d kinematic applications"},{"key":"1690_CR11","doi-asserted-by":"crossref","unstructured":"Al-Amri, M. et al. Inertial measurement units for clinical movement analysis: reliability and concurrent validity. Sensors (2018).","DOI":"10.3390\/s18030719"},{"key":"1690_CR12","unstructured":"Ribeiro, N. IMUs: validation, gait analysis and system\u2019s implementation. Master\u2019s thesis, University of Minho (2017)."},{"key":"1690_CR13","doi-asserted-by":"publisher","unstructured":"Madgwick, S. O. H., Harrison, A. J. L. & Vaidyanathan, R. Estimation of imu and marg orientation using a gradient descent algorithm. In 2011 IEEE International Conference on Rehabilitation Robotics, https:\/\/doi.org\/10.1109\/ICORR.2011.5975346 (2011).","DOI":"10.1109\/ICORR.2011.5975346"},{"key":"1690_CR14","unstructured":"Hansen, N., Ostermeier, A. & Gawelczyk, A. On the adaptation of arbitrary normal mutation distributions in evolution strategies: The generating set adaptation. In ICGA, 57\u201364 (Citeseer, 1995)."},{"key":"1690_CR15","doi-asserted-by":"publisher","DOI":"10.5281\/zenodo.5801927","author":"M Palermo","year":"2022","unstructured":"Palermo, M., Cerqueira, S., Andr\u00e9, J. & Santos, CP. 
Complete Inertial Pose (CIP) Dataset, Zenodo, https:\/\/doi.org\/10.5281\/zenodo.5801927 (2022)."},{"key":"1690_CR16","doi-asserted-by":"crossref","unstructured":"Huynh, D. Q. Metrics for 3d rotations: Comparison and analysis. Journal of Mathematical Imaging and Vision (2009).","DOI":"10.1007\/s10851-009-0161-2"},{"key":"1690_CR17","doi-asserted-by":"publisher","DOI":"10.3390\/s18030719","author":"M Al-Amri","year":"2018","unstructured":"Al-Amri, M. et al. Inertial measurement units for clinical movement analysis: Reliability and concurrent validity. Sensors https:\/\/doi.org\/10.3390\/s18030719 (2018)."}],"container-title":["Scientific Data"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.nature.com\/articles\/s41597-022-01690-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.nature.com\/articles\/s41597-022-01690-y","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.nature.com\/articles\/s41597-022-01690-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,9,30]],"date-time":"2022-09-30T08:27:09Z","timestamp":1664526429000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.nature.com\/articles\/s41597-022-01690-y"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,9,30]]},"references-count":17,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2022,12]]}},"alternative-id":["1690"],"URL":"https:\/\/doi.org\/10.1038\/s41597-022-01690-y","relation":{"references":[{"id-type":"doi","id":"10.1038\/s41597-020-0563-y","asserted-by":"subject"},{"id-type":"doi","id":"10.5281\/zenodo.5801927","asserted-by":"subject"},{"id-type":"doi","id":"10.3390\/s18030719","asserted-by":"subject"}]},"ISSN":["2052-4463"],"issn-type":[{"value":"2052-4463","type":"electronic"}],"subject":[],"published":{"date-parts":[[20
22,9,30]]},"assertion":[{"value":"16 February 2022","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"4 September 2022","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"30 September 2022","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"The authors declare no competing interests.","order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"591"}}