Developing Digital Prosthetics Using Avatars, VR, and AR Environments

The Augmented Cognition Lab (ACLab) primarily researches digital prosthetics: a class of cognitive and neurological assistive devices that can be used for rehabilitation and to support Parkinson's patients, diabetics, and individuals on the autism spectrum. This research requires expertise from a variety of fields, and the lab collaborates closely with members of the psychology, neuroscience, and rehabilitation departments. The key enabling technologies for this research are computer vision and augmented/virtual reality (AR/VR).

Examples of digital prosthetics include: (a) haptic and auditory feedback that helps visually impaired individuals navigate their environment rapidly, (b) real-time emotion processing that provides cueing and other assistance to patients with autism and helps us better understand their interactions, and (c) AR overlays that draw short-range walking targets to prevent gait freezing in Parkinson's patients. In each of these cases, lower-tech assistive approaches already exist; in (c), for instance, caregivers use physical objects to accomplish the same thing. As with other prosthetic devices, digital prosthetics can increase independence by reducing reliance on others for basic needs.

Digital prosthetics have only recently become feasible because of the substantial technological hurdles involved. All of these applications require two elements: (1) real-time understanding of the cognitive/neurological state of the user, and (2) real-time understanding of the state of the world. Professor Ostadabbas calls these twin problems the "pose of the user" and the "pose of the world," where a pose is a low-dimensional subspace that stores the relevant state of the user or of the world. Roughly 75-90% of her lab's research effort is invested in solving these problems. Ostadabbas' lab uses deep learning, physics-based generative models, and subspace modeling/factorization to develop solutions in this area.
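To make the "pose as a low-dimensional subspace" idea concrete, here is a minimal, illustrative sketch of subspace modeling via a PCA-style factorization of body-joint keypoints. The array shapes, function names, and the choice of plain linear PCA are assumptions for illustration only, not ACLab's actual pipeline.

```python
# Illustrative sketch only: representing "pose" as a low-dimensional subspace
# fitted to joint-keypoint data. All names and dimensions here are assumed.
import numpy as np

def fit_pose_subspace(keypoints, n_components=8):
    """Fit a linear subspace to flattened pose keypoints.

    keypoints: array of shape (n_frames, n_joints * 2), e.g. 2D joint
               coordinates per frame from a vision-based pose estimator.
    Returns the mean pose and the top principal directions (the subspace basis).
    """
    mean_pose = keypoints.mean(axis=0)
    centered = keypoints - mean_pose
    # SVD of the centered data gives the principal directions of variation.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]            # (n_components, n_joints * 2)
    return mean_pose, basis

def encode(pose, mean_pose, basis):
    """Project a full pose into the compact low-dimensional state vector."""
    return basis @ (pose - mean_pose)

def decode(coeffs, mean_pose, basis):
    """Reconstruct an approximate full pose from the low-dimensional state."""
    return mean_pose + basis.T @ coeffs

# Toy usage with synthetic data standing in for real keypoint tracks.
frames = np.random.rand(200, 17 * 2)     # 200 frames, 17 joints, (x, y) each
mu, B = fit_pose_subspace(frames, n_components=8)
state = encode(frames[0], mu, B)          # compact real-time "pose of the user"
approx = decode(state, mu, B)             # reconstruction from the subspace
```

The same factorization view applies whether the subspace describes the user's body state or the relevant state of the surrounding scene; in practice a deep or physics-based model would replace the linear basis used here.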

Here are some snapshots of the avatars we use in our projects:

[Images: emotion avatars, idea graphs, and a project screenshot]

Selected Publications:

  • “Analysis of Multimodal Physiological Signals Within and Across Individuals to Predict Psychological Threat vs. Challenge,” 2017. [Available at PsyArXiv.]
  • “A Biosignal-Specific Processing Tool for Machine Learning and Pattern Recognition,” HI-POCT 2017.
  • “Decoding Emotional Experiences through Physiological Signal Processing,” ICASSP 2017.
  • “The Effect of Gender in Virtual Reality Affective Interaction,” Annual Conference of the Society for Affective Science (SAS), 2017.
  • “Using Virtual Humans to Understand Real Ones,” Northeastern University, 2016.
  • “Decoding Emotional Experiences through Physiological Signal Processing,” Northeastern University, 2016.
