Inspiration

A space mission cannot be deemed successful if it fails at landing, the final stage. Even if a SpaceX Starship succeeds at every other phase of a mission, a failure at this last stage renders it useless for future lunar or Mars exploration. Worse, it could damage the launch pad and surrounding public property, incurring huge economic losses.

What it does

ARENA is a last failsafe measure that lets ground operators manually adjust the landing position if all previous steps have failed. Using augmented reality, it visualizes the real-time position of the spacecraft in front of the user, who in turn can control the landing spacecraft with hand gestures, increasing the success rate of space missions. ARENA is not only usable for unmanned vehicles and return missions; it can also assist with landings on the Moon or Mars.

How I built it

I started prototyping the landing scene by composing assets in Reality Composer. I then programmed the Arduino Nano 33 BLE Sense, which features a Nordic BLE chip and can transmit accelerometer readings over Bluetooth Low Energy (BLE). Finally, I wove everything together, including the iOS client application, while continuing to adjust the landing scene.

Challenges I ran into

Transmitting data from the Arduino Nano 33 BLE Sense board to the phone is a little tricky. Since the acceleration measurements along the three axes are floating-point numbers, each measurement takes four bytes, which arrive at the destination as a packet of four unsigned 8-bit integers and must be reassembled into IEEE-754 floating-point numbers. Endianness is easy to get wrong here, and I later realized that the board uses little-endian rather than big-endian byte order. The next challenge I encountered was caused by the asynchronous nature of RealityKit's notification mechanism, which fired hundreds of POST requests and caused some serious bugs. I used a semaphore to deal with that issue.
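The byte-level decoding can be sketched as follows (Python here for illustration; the actual client is an iOS app, and the 12-byte payload layout is an assumption based on the description: three little-endian IEEE-754 single-precision floats, one per axis):

```python
import struct

def decode_accel(packet: bytes):
    """Decode a hypothetical 12-byte BLE payload: three little-endian
    IEEE-754 single-precision floats (x, y, z acceleration)."""
    if len(packet) != 12:
        raise ValueError("expected 12 bytes (3 floats x 4 bytes each)")
    # '<fff' = little-endian ('<'), three 32-bit floats ('fff')
    return struct.unpack('<fff', packet)

# 1.0 encoded as little-endian IEEE-754 is the byte sequence 00 00 80 3F;
# reading it big-endian instead would give a garbage value, which is
# exactly the kind of bug the endianness mix-up produced.
payload = bytes.fromhex('0000803f') * 3
print(decode_accel(payload))  # -> (1.0, 1.0, 1.0)
```

The key detail is the `<` prefix in the format string: dropping it (or using `>`) silently reinterprets the same bytes in the wrong byte order.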

Accomplishments that I'm proud of

I was able to prototype the application and demonstrate a proof of concept (PoC) before the submission deadline, and to try out Bluetooth Low Energy on iOS along the way.

What I learned

I should plan ahead and plan earlier instead of attempting to finish everything at the last minute.

What's next for ARENA

The accuracy of gesture control could be improved with machine learning and predictive analysis, as well as common filtering techniques. It would also be better to have richer scenes for the various types of landing missions. The coordinate-display feature provided by EchoAR could be used in the future as well.
