Feb 12: Researchers at Seoul National University (SNU) and KAIST have developed a soft wearable hand robot that helps people with disabilities use their hands, combining a machine learning algorithm with sensory hardware.
Wearable Hand Robot
Professor Sungho Jo (KAIST) and Professor Kyu-Jin Cho (Seoul National University), leading a collaborative research team at the Soft Robotics Research Center (SRRC) in Seoul, Korea, have proposed a new intention-detection paradigm for soft wearable hand robots. The paradigm predicts grasping/releasing intentions from user behaviors, enabling spinal cord injury (SCI) patients with lost hand mobility to pick and place objects. (The researchers include Daekyum Kim and Jeesoo Ha of KAIST, and Brian Byunghyun Kang, Kyu Bum Kim, and Hyungmin Choi of Seoul National University.)
They developed a method based on a machine learning algorithm that predicts user intentions for wearable hand robots using a first-person-view camera. The approach rests on the hypothesis that user intentions can be inferred from user arm behaviors and hand-object interactions.
The machine learning model used in this study, the Vision-based Intention Detection network from an EgOcentric view (VIDEO-Net), is designed around this hypothesis. VIDEO-Net is composed of spatial and temporal sub-networks: the temporal sub-network recognizes user arm behaviors, and the spatial sub-network recognizes hand-object interactions.
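The two-stream idea above can be sketched in miniature: each sub-network produces per-class evidence, and the streams are fused into a single intention prediction. This is a minimal illustrative sketch, not VIDEO-Net itself; the class list, scores, and fusion weight are hypothetical placeholders, since the real system is a deep network trained on egocentric video.

```python
import math
from typing import List

# Hypothetical two-stream intention classifier, loosely inspired by the
# spatial/temporal split described above. The feature scores and fusion
# weight here are placeholders for illustration only.

INTENTIONS = ["grasp", "release", "none"]

def softmax(scores: List[float]) -> List[float]:
    """Convert raw class scores to a probability distribution."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(spatial_scores: List[float], temporal_scores: List[float],
         alpha: float = 0.5) -> List[float]:
    """Late fusion: weighted average of the two streams' class scores.

    spatial_scores  -- per-class evidence from hand-object interaction cues
    temporal_scores -- per-class evidence from arm-behavior cues
    """
    fused = [alpha * s + (1 - alpha) * t
             for s, t in zip(spatial_scores, temporal_scores)]
    return softmax(fused)

# Example: both streams favor "grasp" (index 0).
probs = fuse([2.0, 0.1, 0.3], [1.5, 0.2, 0.1])
intention = INTENTIONS[probs.index(max(probs))]
```

Late fusion is just one common way to combine two streams; the sketch only shows why separating "what the hand is near" from "how the arm is moving" gives two independent sources of evidence for the same intention.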
An SCI patient wearing Exo-Glove Poly II, a soft wearable hand robot, successfully picked and placed various objects and performed essential activities of daily living, such as drinking coffee, without any additional help.
Their approach is advantageous in that it detects user intentions without requiring person-to-person calibration or additional user actions, enabling the wearable hand robot to interact with its wearer seamlessly.
The research was published as a focus article in the 26th issue of Science Robotics on January 30, 2019.
Source: Seoul National University