Inspiration

Large food, drink and snack brands spend billions of dollars each year to launch new products. The packaging of these products is often decided by a handful of marketers and designers, because it would be too expensive to run a large consumer A/B test with different packaging variants in real stores. With virtual reality we can offer a way to do exactly that. Our software displays a new product on a virtual shelf viewed through an Oculus Rift worn by the consumer. Different groups of consumers see the same product with different packaging variants (an A/B test). By combining eye tracking with a brain-wave measuring device, we find out which packaging leads to the most consumers noticing the product and becoming interested or even excited about it. With this information the company behind the product can significantly increase its revenue potential and reduce the risk that it becomes a flop.

What it does

We show the consumer an entire shelf of products, one of which is the product being tested. (In this demo, we show only the product itself.) We quantify the user's behaviour by tracking brain waves with an EEG device worn on the head, as well as eye movement, which we approximate from the direction in which the user turns her head.
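As a rough sketch, the per-frame record we end up working with could look like the following; the field names here are illustrative, not our exact schema:

    // Illustrative sketch only: one combined sample per frame, assuming the EEG
    // software exports engagement/excitement scores and Unity supplies head rotation.
    public struct BehaviourSample
    {
        public double TimestampSeconds;  // time since the session started
        public float HeadYawDegrees;     // horizontal head direction (gaze proxy)
        public float HeadPitchDegrees;   // vertical head direction (gaze proxy)
        public float Engagement;         // EEG-derived attention score
        public float Excitement;         // EEG-derived excitement score
    }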

How we built it

As hardware we are using an Emotiv Insight EEG headset, an Oculus Rift headset and a Leap Motion hand-tracking sensor. The official graphing software for the EEG headset is built in JavaScript. The VR visualization runs in the Unity game engine, with the animation written in C#. For hand-movement tracking we integrated the Leap Motion SDK into the Unity VR environment. We recreated an original product in 3D and designed alternative packagings for it in Cinema 4D. We do the analysis in R.
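To give an idea of the Unity side, a minimal head-direction logger could look roughly like this; the class name and output file are illustrative, not our exact implementation:

    using System.IO;
    using UnityEngine;

    // Minimal sketch: log the VR camera's orientation every frame to a CSV
    // file so it can be analysed later (e.g. in R) alongside the EEG data.
    public class HeadDirectionLogger : MonoBehaviour
    {
        private StreamWriter writer;

        void Start()
        {
            writer = new StreamWriter(Path.Combine(Application.persistentDataPath, "head_log.csv"));
            writer.WriteLine("time,yaw,pitch");
        }

        void Update()
        {
            // The VR camera's orientation approximates where the user is looking.
            Vector3 angles = Camera.main.transform.eulerAngles;
            writer.WriteLine($"{Time.time:F3},{angles.y:F1},{angles.x:F1}");
        }

        void OnDestroy()
        {
            writer?.Close();
        }
    }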

Challenges we ran into

The EEG headset's Bluetooth connection caused us some issues, which we eventually fixed. The JavaScript software we wrote established a connection and worked in parts, but other parts did not; we tried coding it in Java and ran into the same problem, so we circumvented it by using the official software instead. Another challenge was synchronizing the data from the EEG headset and the VR headset: due to limited computing power we had to run the two headsets on two separate computers. We solved this by capturing VR and EEG data on the separate machines and recording the screens on video so the streams could be synchronized afterwards.
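The alignment step afterwards amounts to shifting one stream's timestamps by the clock offset read off the screen recordings (for example, the moment the product appears on both screens). A minimal sketch, with illustrative names:

    // Sketch of the post-hoc alignment idea: once the offset between the two
    // machines' clocks is known, shift the EEG timestamps onto the VR time axis
    // so both streams can be merged for analysis.
    public static class StreamAlignment
    {
        // offsetSeconds = (event time on the VR machine) - (event time on the EEG machine)
        public static double[] AlignEegTimestamps(double[] eegTimestamps, double offsetSeconds)
        {
            var aligned = new double[eegTimestamps.Length];
            for (int i = 0; i < eegTimestamps.Length; i++)
                aligned[i] = eegTimestamps[i] + offsetSeconds;
            return aligned;
        }
    }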

Accomplishments that we're proud of

The hand tracking was very challenging, and we managed to implement it (code: https://github.com/paulkunze/VR-ify). Through our use of EEG tracking and hand-motion tracking we are able to capture unbiased user reactions. Thus our test is better than a typical survey, in which consumers are simply asked which packaging design they prefer.
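For context, reading hand positions through the Leap Motion C# API looks roughly like the sketch below; the exact types depend on the SDK version, and the "reach" threshold is an assumption for illustration:

    using Leap;

    // Rough sketch: detect whether a tracked palm has moved forward towards the
    // virtual shelf, which we can treat as the user reaching for the product.
    public class HandReachDetector
    {
        private readonly Controller controller = new Controller();

        public bool IsReachingForward(float thresholdMillimeters = 150f)
        {
            Frame frame = controller.Frame();
            foreach (Hand hand in frame.Hands)
            {
                // In the Leap coordinate system the z axis points towards the user,
                // so a strongly negative z means the palm is stretched out forward.
                if (hand.PalmPosition.z < -thresholdMillimeters)
                    return true;
            }
            return false;
        }
    }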

What we learned

We were positively surprised by how well these technologies fit together for this use case. This is totally possible!

What's next for VR-ify

Develop a single piece of software that consolidates and synchronizes the entire technology stack we use. On the hardware side, integrate the EEG scanner into the VR headset so it is easier to mount on the subject. Identify the first company to run a pilot test with.

Built With

c#, cinema-4d, emotiv-insight, javascript, leap-motion, oculus-rift, r, unity