Abstract
Bionic hands can replicate many movements of the human hand, but our ability to intuitively control these bionic hands is limited. Humans’ manual dexterity arises partly from control loops driven by sensory feedback. Here, we describe the integration of proximity and pressure sensors into a commercial prosthesis to enable autonomous grasping and show that continuously sharing control between the autonomous hand and the user improves dexterity and user experience. Artificial intelligence moved each finger to the point of contact while the user controlled the grasp with surface electromyography. A bioinspired, dynamically weighted sum merged machine and user intent. Shared control resulted in greater grip security, greater grip precision, and less cognitive burden. Demonstrations include intact and amputee participants using the modified prosthesis to perform real-world tasks with different grip patterns. Thus, granting some autonomy to bionic hands presents a translatable and generalizable approach toward more dexterous and intuitive control.
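The dynamically weighted sum mentioned above can be sketched as follows. This is a minimal illustration, not the published control law: the function name, the use of a scalar proximity signal in [0, 1] as the weighting variable, and the linear weighting itself are all assumptions chosen for clarity. The intuition it encodes is that the autonomous controller dominates while the fingers approach the object, and authority shifts to the user's electromyography command as contact nears.

```python
def blend_commands(user_cmd: float, auto_cmd: float, proximity: float) -> float:
    """Dynamically weighted sum of user and autonomous finger commands.

    proximity is a normalized sensor reading in [0, 1]:
    0.0 = finger far from the object, 1.0 = finger at contact.

    Far from the object the autonomous command dominates (it drives the
    finger to the point of contact); at contact the user's EMG-derived
    command dominates (the user modulates the grasp). This linear
    weighting is an illustrative assumption, not the paper's exact rule.
    """
    proximity = min(max(proximity, 0.0), 1.0)  # clamp to valid range
    w_user = proximity        # user authority grows as contact nears
    w_auto = 1.0 - proximity  # machine authority shrinks correspondingly
    return w_user * user_cmd + w_auto * auto_cmd
```

For example, with the finger far from the object (`proximity=0.0`) the blended command equals the autonomous command, and at contact (`proximity=1.0`) it equals the user's command, with a smooth trade-off in between.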