  • SPS Members: Free
  • IEEE Members: $11.00
  • Non-members: $15.00
  • Length: 0:09:59
27 Jun 2022

Translating input modalities such as hand interactions, speech, and eye tracking into virtual reality offers an immersive user experience. Tracking the user's hand gestures is especially crucial, since gestures help translate user intentions into actions in a virtual environment. In this work, we developed a virtual reality application that incorporates electromyography-based deep learning methods for recognizing and estimating grasp movements in real time, unlike previous works, which were mostly evaluated offline. Our application automates all user controls, so it can be used for rehabilitation purposes.
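As a rough illustration of the kind of pipeline the abstract describes, the sketch below classifies a sliding window of multi-channel EMG samples into a grasp label with a small 1D CNN. The channel count, window size, gesture labels, network layout, and function names are all assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a real-time EMG grasp-recognition step, assuming a
# multi-channel EMG stream and a small 1D-CNN classifier. All constants and
# labels below are illustrative placeholders.
import torch
import torch.nn as nn

N_CHANNELS = 8        # assumed number of EMG channels (e.g., a forearm armband)
WINDOW = 200          # assumed samples per sliding window
GESTURES = ["rest", "power_grasp", "pinch", "open_hand"]  # illustrative labels


class GraspNet(nn.Module):
    """Minimal 1D CNN mapping one EMG window to grasp-class logits."""

    def __init__(self, n_channels: int, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))


def classify_window(model: nn.Module, window: torch.Tensor) -> str:
    """Classify a single EMG window and return the predicted gesture label."""
    with torch.no_grad():
        logits = model(window.unsqueeze(0))      # add a batch dimension
        return GESTURES[int(logits.argmax(dim=1))]


if __name__ == "__main__":
    model = GraspNet(N_CHANNELS, len(GESTURES)).eval()
    fake_window = torch.randn(N_CHANNELS, WINDOW)  # stand-in for live EMG data
    print(classify_window(model, fake_window))     # e.g. "pinch"
```

In a real-time setting, a loop of this kind would run on each new window streamed from the EMG device, and the predicted label would drive the corresponding grasp animation or control action in the virtual environment.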
