Before the hack we had a little wander around London Zoo and found that signage around the enclosures was sometimes lacking, sporadic, widely scattered, and not always up to date. Without a Google search we couldn't quickly donate to a specific animal's cause, and the signs didn't always show information on conservation efforts and threats.
What it does
Zoo or aquarium attendees approach an enclosure and receive a notification on their smartphones linking to a webpage for the animal inside. That webpage is a factsheet with two cards. The first shows generic facts, with links to further information. The second, which changes colour depending on the animal's conservation risk, shows details about the conservation effort, along with a relevant donation link, maybe even a wild coin link (Zoohackathon LND 2016 winners). The pages are kept up to date through their own calls to databases like the Red List for current conservation data. We also have a very nicely modelled elephant to add immersion and give a quick visual reference. The result is a set of geodata-aware webpages, deployed to aid ecological awareness, knowledge, and the zoo experience without being intrusive and with the lowest possible barrier to entry: no app ever needs to be downloaded for this to work.
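The conservation card's colour follows the animal's Red List status. A minimal sketch of that mapping in JavaScript; the category codes are the real IUCN Red List ones, but the colour values here are illustrative assumptions, not the palette we actually shipped:

```javascript
// Map an IUCN Red List category code to a card colour.
// Category codes are the real IUCN ones; the hex colours are
// illustrative placeholders, not our shipped palette.
const RISK_COLOURS = {
  LC: "#4caf50", // Least Concern
  NT: "#8bc34a", // Near Threatened
  VU: "#ffc107", // Vulnerable
  EN: "#ff9800", // Endangered
  CR: "#f44336", // Critically Endangered
  EW: "#9c27b0", // Extinct in the Wild
  EX: "#000000", // Extinct
};

function conservationCardColour(category) {
  // Fall back to a neutral grey for Data Deficient / unknown categories.
  return RISK_COLOURS[category] || "#9e9e9e";
}

console.log(conservationCardColour("EN")); // "#ff9800"
```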
How we built it
The service works using Physical Web BLE beacons (currently a micro:bit) that use the Eddystone standard to broadcast a relevant URL. Most modern smartphones surface the broadcast as a notification when received, creating a simple proximity-based geofencing technique. Tapping the notification launches the phone's browser on that URL, which hits a Laravel-powered server; that server calls our API, built with Express.js and MongoDB on a DigitalOcean droplet, and serves a React web app populated with the returned data.
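For context, an Eddystone-URL frame is only a few bytes: a frame type, the TX power, a scheme prefix code, and a compressed URL. A sketch of that encoding in JavaScript; the byte layout and expansion codes follow the public Eddystone-URL spec, but the helper itself is illustrative, not the code we flashed to the micro:bit:

```javascript
// Sketch of Eddystone-URL frame encoding per the public spec.
// Frame layout: [0x10 frame type, TX power, scheme prefix, encoded URL <= 17 bytes].
const SCHEMES = ["http://www.", "https://www.", "http://", "https://"];
const EXPANSIONS = [
  ".com/", ".org/", ".edu/", ".net/", ".info/", ".biz/", ".gov/",
  ".com", ".org", ".edu", ".net", ".info", ".biz", ".gov",
];

function encodeEddystoneUrl(url, txPower = 0xee) {
  // Longest-match the scheme prefix ("https://www." before "https://").
  const scheme = SCHEMES
    .map((s, i) => [s, i])
    .filter(([s]) => url.startsWith(s))
    .sort((a, b) => b[0].length - a[0].length)[0];
  if (!scheme) throw new Error("URL must start with http(s)://");
  let rest = url.slice(scheme[0].length);

  const bytes = [0x10, txPower, scheme[1]];
  while (rest.length > 0) {
    // Substitute well-known suffixes with their one-byte expansion codes.
    const exp = EXPANSIONS.findIndex((e) => rest.startsWith(e));
    if (exp >= 0) {
      bytes.push(exp);
      rest = rest.slice(EXPANSIONS[exp].length);
    } else {
      bytes.push(rest.charCodeAt(0));
      rest = rest.slice(1);
    }
  }
  if (bytes.length - 3 > 17) {
    throw new Error("Encoded URL exceeds the 17-byte Eddystone limit");
  }
  return bytes;
}
```

The 17-byte cap on the encoded URL is what forced the short-URL mapping described under Challenges.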
Challenges we ran into
- We had some issues with the Eddystone URL length limit, so we had to create a short-URL mapping and keep the API consistent with it.
- Annoying CORS issues.
- Exporting from Blender to Three.js JSON was a pain, especially with textures.
- Flashing the micro:bit with apps built on the C++ framework instead of the default
- The Red List API was down, so we couldn't fetch the conservation information dynamically
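The URL length issue above is why the beacon can't broadcast a full API URL: the encoded URL has to fit in 17 bytes. A sketch of the kind of short-path mapping we mean, with hypothetical route names rather than our actual ones:

```javascript
// The beacon broadcasts a short path like "/e/1"; the server expands it
// to the real animal endpoint. Route names here are hypothetical.
const beaconRoutes = new Map([
  ["/e/1", "/api/animals/asian-elephant"],
  ["/e/2", "/api/animals/sumatran-tiger"],
]);

function expandBeaconPath(shortPath) {
  const full = beaconRoutes.get(shortPath);
  if (!full) throw new Error(`Unknown beacon path: ${shortPath}`);
  return full;
}
```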
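The CORS errors came from the browser calling the Express API from a page served on a different origin. A minimal connect-style middleware of the kind that resolves them; this is a sketch, not our exact configuration (in production you would whitelist specific origins rather than use `*`):

```javascript
// Minimal CORS middleware sketch for an Express-style (req, res, next) chain.
// Allows any origin; a real deployment should whitelist known origins.
function cors(req, res, next) {
  res.setHeader("Access-Control-Allow-Origin", "*");
  res.setHeader("Access-Control-Allow-Methods", "GET, OPTIONS");
  res.setHeader("Access-Control-Allow-Headers", "Content-Type");
  if (req.method === "OPTIONS") {
    res.statusCode = 204; // answer the preflight with no body
    return res.end();
  }
  next(); // normal requests continue down the chain
}
```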
Accomplishments that we're proud of
- We created a fully working web app that communicated with a working API, all triggered by BLE beacons, and got it to a point where, without much more effort, it could scale and have a real effect. We were impressed we managed this in 24 hours.
- The detail of the animal model we were able to create in only about 12 hours
- Creating a starting point for an extensible API
- It looks pretty good and is in keeping with ZSL's current branding
What we learned
- We learned how the Physical Web works
- We learned the benefits and shortcomings of Express and MongoDB as an API framework and database
- We learned continuous integration on DigitalOcean with Laravel Forge and Git
What's next for zooBeacon
We would love to see funding so we can deploy this in a zoo or aquarium, work with them, and really improve it. Many of the problems we skipped over could be fixed in weeks, not months, and a full deployment could happen in under half a year with enough people. It's not hard tech, just underutilised tech, and it could do something very important for zoos and aquariums: bring them into the 21st century unobtrusively, without breaking immersion.
wildlabs.net checked it out while at the event. Here it is close to the end of the hackathon: https://twitter.com/WILDLABSNET/status/917037837849161728