API: We reworked how the API waits for the Photon's response before passing data to the video player; the wait is now streamlined and non-blocking. No other major changes have been needed beyond adding new Photon functions as they've been developed.
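The non-blocking wait can be sketched roughly as below. This is an illustrative Python sketch, not our actual API code: the function and class names (`fetch_photon_result`, `FakePhoton`) are hypothetical, and the real API talks to the Photon over the Particle cloud rather than a local callable.

```python
import asyncio

async def fetch_photon_result(poll, timeout=5.0, interval=0.01):
    """Poll a Photon result source until it returns a value, without
    blocking the event loop that drives the video player."""
    elapsed = 0.0
    while elapsed < timeout:
        result = poll()  # non-blocking check for the Photon's reply
        if result is not None:
            return result
        await asyncio.sleep(interval)  # yield so playback keeps running
        elapsed += interval
    return None  # timed out; caller can fall back to a default path

# Hypothetical stand-in for the Photon: "responds" after a few polls.
class FakePhoton:
    def __init__(self, ready_after):
        self.calls = 0
        self.ready_after = ready_after

    def poll(self):
        self.calls += 1
        return "shaken" if self.calls >= self.ready_after else None

result = asyncio.run(fetch_photon_result(FakePhoton(3).poll))
```

The point of the loop is that waiting on the toy never stalls the player: each pass hands control back to the event loop via `await`.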
Photon: We integrated the IMU and a servo. The IMU currently lets us detect whether the toy has been shaken or left idle, and we also took advantage of its magnetometer to detect whether an accessory part is attached to the toy. The servo replaces the shoulder joint of the left arm, which required us to 3D print an adapter; the arm can now move on command. Other hardware integrated: a photoresistor and a vibration motor.
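The shake/idle and accessory checks boil down to magnitude thresholds on the IMU readings. The firmware itself runs on the Photon, but the logic can be sketched in Python; all the threshold constants here are assumed values for illustration, not our calibrated numbers.

```python
import math

SHAKE_G = 2.5        # assumed threshold (in g) above which we call it a shake
IDLE_G_DELTA = 0.05  # assumed allowed deviation from 1 g while "idle"
MAG_BASELINE = 45.0  # assumed ambient field magnitude (uT)
MAG_ACCESSORY = 20.0 # assumed extra field from the accessory's magnet (uT)

def magnitude(x, y, z):
    return math.sqrt(x * x + y * y + z * z)

def is_shaken(ax, ay, az):
    # A shake shows up as an acceleration spike well above gravity.
    return magnitude(ax, ay, az) > SHAKE_G

def is_idle(ax, ay, az):
    # At rest the accelerometer reads roughly 1 g (gravity only).
    return abs(magnitude(ax, ay, az) - 1.0) < IDLE_G_DELTA

def accessory_attached(mx, my, mz):
    # The accessory carries a magnet, so its presence raises the
    # field magnitude well above the ambient baseline.
    return magnitude(mx, my, mz) > MAG_BASELINE + MAG_ACCESSORY
```

In practice the firmware would average several samples before deciding, so a single noisy reading doesn't trigger a branch.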
Video player: The original player was a linear script. We overhauled it to be extensible: it now runs a loop that works through an arbitrarily expandable list of video objects. Our video class contains all the parameters the player needs to know when to trigger the toy and what to do with the data the toy returns. We built out a 3-layer demo tree, with 2 decision points and 4 possible paths.
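The video-object idea can be sketched as follows. This is an illustrative Python sketch, not our player's actual code: the `Video` class fields, the `play` loop, and the clip names are all hypothetical stand-ins for the real implementation.

```python
class Video:
    def __init__(self, clip, toy_action=None, next_by_result=None):
        self.clip = clip                 # clip identifier / filename
        self.toy_action = toy_action     # command sent to the toy during this clip
        self.next_by_result = next_by_result or {}  # toy result -> next Video

def play(video, read_toy):
    """Walk the video tree: play each clip, trigger the toy, and pick
    the next clip from the data the toy returns."""
    path = []
    while video is not None:
        path.append(video.clip)          # stand-in for actually playing the clip
        result = read_toy(video.toy_action) if video.toy_action else None
        video = video.next_by_result.get(result)  # None ends the loop
    return path

# A small demo tree with branch points keyed on (hypothetical) toy results.
leaves = {k: Video(f"ending_{k}") for k in ("a", "b", "c", "d")}
mid1 = Video("branch_1", toy_action="shake?",
             next_by_result={"yes": leaves["a"], "no": leaves["b"]})
mid2 = Video("branch_2", toy_action="shield?",
             next_by_result={"yes": leaves["c"], "no": leaves["d"]})
root = Video("intro", toy_action="accessory?",
             next_by_result={"yes": mid1, "no": mid2})

responses = {"accessory?": "yes", "shake?": "no"}
path = play(root, lambda action: responses.get(action))
```

Because each video object carries its own branching table, extending the tree is just a matter of appending new `Video` instances; the player loop never changes.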
Toy: Hardware has been integrated into the toy. The vibration motor mounts in the torso piece and is strong enough to be felt even through the table the toy stands on. The photocell sits on the toy's forehead (one branch of our tree has you shield the hero's face from a blast). The IMU is secured in the waist area. We built a regulated 5 V supply into one leg and put the Photon in the other.