Inspiration
It would be hard to find someone who hasn't taken a personality test (e.g., Myers-Briggs, the DISC assessment, the Big Five). The problem is that the vast majority of these tests are not evidence-based and lack predictive power. People also often take them in high-stakes settings, such as a job interview, and are incentivized to tailor their responses to fit a desired profile. To cut through the noise, we wanted to build personality profiles based on initial, gut reactions to simulated situations in VR.
What it does
Jump into a series of curated VR situations designed to capture your initial, gut reactions to difficult and ethical situations. While you are immersed, we measure a multitude of performance metrics! Our goal is a set of roughly 10 fast-paced experiences covering stress management, adaptability, creativity, problem solving, and philosophy. Over the past 24 hours, we built two highly polished scenarios to test the concept.
Target audience
While we believe this tool provides broad value, the initial investment in a VR setup is high. As a result, our initial target audience is organizations (businesses, non-profits, the military, law enforcement, and government agencies).
How we built it
- Game engine: Unreal Engine 4, with C++ components attached to the project to record performance metrics during each experience (a minimal sketch follows this list)
- 3D modeling: 3DS Max
- Texturing: Substance Painter
- VR headset: HTC Vive
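The metric recording was done with C++ attached to the Unreal project. The component below is a minimal sketch of how that could look, not the actual ApperceptionVR code: the class name UMetricsRecorderComponent, the event names, and the CSV output path are placeholder assumptions. Gameplay code calls RecordEvent() whenever something measurable happens, and the rows are flushed to a CSV when the session ends.

```cpp
// MetricsRecorderComponent.h -- hypothetical component name; a minimal sketch,
// not the actual ApperceptionVR implementation.
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"
#include "MetricsRecorderComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UMetricsRecorderComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Call from gameplay code or Blueprints whenever something worth measuring
    // happens (e.g. "GrabbedExtinguisher", "HesitatedAtDoor" -- placeholder names).
    UFUNCTION(BlueprintCallable, Category="Metrics")
    void RecordEvent(const FString& EventName)
    {
        const float Timestamp = GetWorld()->GetTimeSeconds();
        Rows.Add(FString::Printf(TEXT("%.3f,%s"), Timestamp, *EventName));
    }

protected:
    // Flush everything to a CSV in the project's Saved directory when play ends.
    virtual void EndPlay(const EEndPlayReason::Type EndPlayReason) override
    {
        Super::EndPlay(EndPlayReason);
        const FString Path = FPaths::ProjectSavedDir() / TEXT("Metrics.csv");
        FFileHelper::SaveStringToFile(FString::Join(Rows, TEXT("\n")), *Path);
    }

private:
    TArray<FString> Rows;
};
```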
Challenges we ran into
We wanted to collect biofeedback such as respiratory rate and heart rate in addition to in-game performance metrics. We did a deep dive on heart rate, trying optical sensors and EKG breakout boards, and even attempting to reverse engineer a pulse oximeter. In the end, we ran out of time to integrate biofeedback, but we learned a lot in the process!
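A minimal sketch of the kind of threshold-based beat detection we experimented with for the optical pulse sensor might look like the following Arduino-style code (the pin, threshold, and debounce window are illustrative assumptions, not our final wiring): read the sensor's analog output, count a beat on each rising edge above a threshold, and derive BPM from the inter-beat interval.

```cpp
// Hypothetical Arduino sketch for the optical pulse sensor experiments.
// Pin number, threshold, and debounce window are placeholder assumptions.
const int PULSE_PIN = A0;                   // analog output of the optical sensor
const int THRESHOLD = 550;                  // raw ADC value separating "beat" from baseline
const unsigned long MIN_BEAT_GAP_MS = 300;  // ignore re-triggers faster than ~200 BPM

unsigned long lastBeatMs = 0;
bool aboveThreshold = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(PULSE_PIN);
  unsigned long now = millis();

  // Rising edge: the signal just crossed the threshold, outside the debounce window.
  if (raw > THRESHOLD && !aboveThreshold && (now - lastBeatMs) > MIN_BEAT_GAP_MS) {
    if (lastBeatMs > 0) {
      float bpm = 60000.0f / (now - lastBeatMs);
      Serial.print("BPM: ");
      Serial.println(bpm);  // estimate degrades quickly when the hand moves (motion artifacts)
    }
    lastBeatMs = now;
  }
  aboveThreshold = (raw > THRESHOLD);

  delay(2);  // ~500 Hz sampling
}
```

In practice this kind of estimate degraded quickly with hand motion, which is the motion-artifact problem noted under "What we learned" below.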
Accomplishments that we are proud of
- The amount of data we were able to collect
- The insights we gathered from that data
- Examining data from 40 individuals
- The amount of work we were able to complete
- Collaborating effectively to develop a strong idea
- Using VR as a predictive tool for personality assessment
What we learned
VR
- How to correct for visual anomalies in VR that disrupt spatial awareness
- How to record performance in real-time in Unreal Engine
- That VR can be a powerful tool for organizations that goes beyond its initial novelty and hype-factor
Biosensing
- That optical pulse sensors are susceptible to motion artifacts
- That typical 3-lead EKG sensors do not allow freedom of motion either
Usability testing
- People's reactions to physiological and psychological tasks vary greatly
- Which measurements would be valuable feedback for the next generation of scenarios
What's next for ApperceptionVR
- Adding biofeedback (heart rate and/or respiratory rate)
- Building out a set of validated and curated situations (~10)
- Measuring interpersonal metrics by enabling multiplayer support
- Recording verbal reactions to tasks
- Tracking eye movement
- Recording extraneous limb movements
- Recording task completion speed