Investigating Trust and Nonverbal Behavior through Virtual Worlds
One problem with existing research on age differences and trust is its reliance on judgments of faces, which may or may not display veridical cues to trustworthiness. That is, there is consensus about which faces appear trustworthy, but the validity of static facial cues in predicting actual behavior has been called into question. Because it is difficult to manipulate such a behavioral stream when it is produced by other individuals for the purposes of lab research, researchers have turned to alternative methods of manipulating and studying trust-related behavior.
In this project, we use a virtual environment that allows us to explore and shape how individuals come to trust others in decisions involving financial transactions. Building on an existing interactive virtual platform developed by Magy Seif El-Nasr and her students, we are developing a virtual environment in which participants of different ages will make financial decisions. The environment will consist of a multiplayer, fully rendered 3D virtual world inhabited by non-player characters controlled by Artificial Intelligence algorithms and by player characters puppeteered through a Kinect-based interface (a gestural, vision-based system). Players will be able to control their puppeteered characters to exhibit a range of nonverbal behaviors. Both the puppeteered and AI characters will build on prior work on nonverbal behavior and animation systems. We are using the SmartBody character animation system, developed by Stacy Marsella et al. at ICT, USC, which provides many critical capabilities for the representation, interaction, and visualization of 3D characters in virtual environments.
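To make the puppeteering idea concrete, the sketch below shows one simple way a tracked skeleton frame could be mapped to a coarse nonverbal-behavior label before being passed to a character animation system. The frame fields, thresholds, and gesture labels are illustrative assumptions, not the project's actual interface or the SmartBody API.

```python
def classify_gesture(frame):
    """Map a skeleton frame (a dict of joint coordinates, in meters)
    to a coarse nonverbal-behavior label.

    All field names and thresholds here are hypothetical, chosen only
    to illustrate the mapping from tracked joints to gesture labels.
    """
    # Both hands above the head -> expansive, open posture
    if (frame["left_hand_y"] > frame["head_y"]
            and frame["right_hand_y"] > frame["head_y"]):
        return "arms_raised"
    # Hands close together horizontally -> closed posture (e.g., arms crossed)
    if abs(frame["left_hand_x"] - frame["right_hand_x"]) < 0.1:
        return "arms_crossed"
    return "neutral"

# Example: a frame with both hands raised above head height
frame = {"head_y": 1.6, "left_hand_y": 1.8, "right_hand_y": 1.9,
         "left_hand_x": -0.3, "right_hand_x": 0.3}
print(classify_gesture(frame))
```

In a full system, labels like these would be translated into animation commands for the puppeteered character, while AI-controlled characters would generate comparable behaviors autonomously.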
PhD Students: Elin Carstensdottir
Masters Students: Peili Cao, Rhea Xu, Karthik Chandrakanth
Post Doctoral Fellow: Truong Huy Nguyen Dinh
Faculty Members: Magy Seif El-Nasr (PI), David DeSteno (Co-PI), Derek Isaacowitz (Co-PI), and Matt Gray (Co-PI).
Funding: The project is currently funded by a Northeastern University Interdisciplinary Tier 1 grant.