{"id":1414,"date":"2020-05-04T13:59:22","date_gmt":"2020-05-04T13:59:22","guid":{"rendered":"https:\/\/beta.research.ece.ncsu.edu\/aros\/?page_id=1414"},"modified":"2020-06-18T12:43:56","modified_gmt":"2020-06-18T12:43:56","slug":"paper-tase2020-lowerlimb","status":"publish","type":"page","link":"https:\/\/research.ece.ncsu.edu\/aros\/paper-tase2020-lowerlimb\/","title":{"rendered":"Environmental Context Prediction for Lower Limb Prostheses"},"content":{"rendered":"<p>[et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;section&#8221; _builder_version=&#8221;3.22&#8243;][et_pb_row admin_label=&#8221;row&#8221; _builder_version=&#8221;3.25&#8243; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221;][et_pb_column type=&#8221;4_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; custom_padding__hover=&#8221;|||&#8221;][et_pb_image src=&#8221;https:\/\/research.ece.ncsu.edu\/wp-content\/uploads\/sites\/7\/2020\/05\/LowerLimb_2020-1.png&#8221; _builder_version=&#8221;4.4.0&#8243;][\/et_pb_image][\/et_pb_column][\/et_pb_row][et_pb_row _builder_version=&#8221;4.4.0&#8243;][et_pb_column type=&#8221;4_4&#8243; _builder_version=&#8221;4.4.0&#8243;][et_pb_text admin_label=&#8221;Title&#8221; _builder_version=&#8221;4.4.0&#8243;]<\/p>\n<h1>Environmental Context Prediction for Lower Limb Prostheses<\/h1>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][et_pb_row column_structure=&#8221;3_5,2_5&#8243; _builder_version=&#8221;4.4.0&#8243;][et_pb_column type=&#8221;3_5&#8243; _builder_version=&#8221;4.4.0&#8243;][et_pb_text admin_label=&#8221;Abstract&#8221; _builder_version=&#8221;4.4.0&#8243;]<\/p>\n<p>Reliable environmental context prediction is critical for wearable robots (e.g. prostheses and exoskeletons) to assist terrain-adaptive locomotion. 
This paper proposes a novel vision-based context prediction framework for lower limb prostheses that simultaneously predicts the wearer&#8217;s environmental context over multiple forecast windows. By leveraging <em>Bayesian Neural Networks<\/em> (BNN), our framework quantifies the uncertainty caused by different factors (e.g. observation noise and insufficient or biased training data) and produces calibrated predicted probabilities for online decision making. We compared two wearable camera locations (a pair of glasses and a lower limb device), both independently and conjointly, and used the calibrated predicted probabilities for online decision making and sensor fusion. We demonstrated how to interpret deep neural networks through their uncertainty measures and how to improve the algorithms based on the uncertainty analysis. The inference time of our framework on a portable embedded system was less than 80 ms per frame. The results of this study may lead to novel context recognition strategies for reliable decision making, efficient sensor fusion, and improved intelligent system design in various applications.<\/p>\n<p>This work was a collaboration between the ARoS Lab and the Neuromuscular Rehabilitation Engineering Laboratory (<a href=\"https:\/\/nrel.web.unc.edu\/\">NREL<\/a>) at UNC Chapel Hill \/ NC State. This work was supported by the National Science Foundation under awards 1552828, 1563454, and 1926998.<\/p>\n<p>[\/et_pb_text][\/et_pb_column][et_pb_column type=&#8221;2_5&#8243; _builder_version=&#8221;4.4.0&#8243;][et_pb_text admin_label=&#8221;Resources&#8221; _builder_version=&#8221;4.4.0&#8243;]<\/p>\n<h2 style=\"text-align: left;\">Resources:<\/h2>\n<ul>\n<li><a href=\"https:\/\/ieeexplore.ieee.org\/document\/9098903\">IEEE <span class=\"il\">Trans. 
<\/span>on Automation Science and Engineering Paper<\/a><\/li>\n<li><a href=\"https:\/\/github.com\/ARoS-NCSU\/Reliable-Wearable-Robotics\/tree\/master\/Paper%20-%20TASE2020-lowerlimb\">Source Code in GitHub for Model<\/a><\/li>\n<li><a href=\"https:\/\/github.com\/ARoS-NCSU\/Reliable-Wearable-Robotics\/tree\/master\/RPI%20Data%20Aggregator%20%20-%20lowerlimb\">Source Code and Design Files in GitHub for Aggregator<\/a><\/li>\n<li><a href=\"https:\/\/ieee-dataport.org\/open-access\/lower-limb-prostheses-environmental-context-dataset\">Data in IEEE DataPort<\/a><\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/Cly0PJx9Gz4\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Environmental Context Prediction for Lower Limb ProsthesesReliable environmental context prediction is critical for wearable robots (e.g. prostheses and exoskeletons) to assist terrain-adaptive locomotion. 
This paper proposed a novel vision-based context [&hellip;]<\/p>\n","protected":false},"author":45,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"_et_pb_use_builder":"on","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"class_list":["post-1414","page","type-page","status-publish","hentry"],"acf":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/pages\/1414","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/users\/45"}],"replies":[{"embeddable":true,"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/comments?post=1414"}],"version-history":[{"count":17,"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/pages\/1414\/revisions"}],"predecessor-version":[{"id":1516,"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/pages\/1414\/revisions\/1516"}],"wp:attachment":[{"href":"https:\/\/research.ece.ncsu.edu\/aros\/wp-json\/wp\/v2\/media?parent=1414"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}