Vision-based interface for grasping intention detection and grip selection: towards intuitive upper-limb assistive devices
Assistive devices for individuals with upper-limb movement impairments often lack controllability and intuitiveness, in particular for the grasping function. In this work, we introduce a novel user interface for grasping movement control in which the user delegates grasping decisions to the device and only moves their (potentially prosthetic) hand toward the targeted object.
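The control idea described above can be sketched in a few lines: the device watches the hand approach, declares grasping intent once the hand closes in on a detected object, and then picks a grip suited to that object. This is a minimal illustrative sketch; the function names, the distance-based intent rule, and the object-to-grip mapping are all assumptions for illustration, not the authors' implementation.

```python
# Hypothetical object -> grip mapping; the paper's actual grip taxonomy
# and selection policy may differ.
GRIP_FOR_OBJECT = {
    "mug": "power",       # wrap fingers around the body or handle
    "pen": "precision",   # thumb-finger pinch for thin objects
    "plate": "lateral",   # flat, wide objects
}

def intent_detected(distances, threshold=0.15):
    """Assumed intent rule: the hand-object distance (metres) decreases
    monotonically over recent frames and drops below a threshold."""
    approaching = all(a > b for a, b in zip(distances, distances[1:]))
    return approaching and distances[-1] < threshold

def select_grip(object_label, distances):
    """Return a grip command once intent is detected, else None."""
    if intent_detected(distances):
        return GRIP_FOR_OBJECT.get(object_label, "power")  # fallback grip
    return None

print(select_grip("mug", [0.50, 0.32, 0.10]))  # approaching mug -> power
print(select_grip("pen", [0.50, 0.60, 0.10]))  # hand retreats -> None
```

The point of the sketch is the division of labour the abstract describes: the user supplies only the reaching motion, while grip choice is delegated to the device.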