Inspiration

One of our team just went through Hurricane Irma, and we were able to integrate his experience into the design of our concept.

Coming from a technology background, our inspiration stemmed primarily from the concepts of Agile, Lean, and Just-In-Time solutions that are prevalent in the industry. Disaster response and management are fields that place a premium on resource management and the efficient use and distribution of personnel. The goal of our project is not just a product, but to effectively combine the strengths of emerging technologies, technology best practices, and disaster response.

Performing interviews with dispatchers, recent hurricane victims, emergency response personnel, and crisis management personnel gave not only more complete user stories, but actionable experience-based design insights.

Applying technology best practices across domains allowed us to make the most of the sponsor's challenge, our interviews, and our personal experience.

What it does

Our application helps people communicate individually, at scale, to coordinate rescue and relief efforts – improving the lives of those affected by catastrophic disasters. It creates an ad-hoc communication and coordination platform for disaster-response organizations, with a secondary use for disaster victims to help themselves and others and to effectively triage and prioritize objectives for first responders. By facilitating communication and coordination, our application helps collaboratively gather, process, and disseminate information – shortening the time from disaster to response.

Accenture, the sponsor, noted that technology could be used to fill a critical information and communication gap when responding to catastrophic disasters. In such a crisis, the information gap leads to coordination failures that prevent the community from improving the lives of victims.

The needs identified in rescue and relief efforts are:

Victims: need a way to request support and indicate the urgency of their situation ("Empower the Victims")

Rescuers: need an integrated communications channel to locate victims ("Enable the Able")

Dispatchers: need to triage victims and assign appropriate resources

Shelter Operators: need to direct victims to open capacity and request volunteers and supplies

Our application solves the pain points of the following personas by allowing them to:

Victims: use the application to share their location and status and to communicate.

Rescuers: use sensor tech (LiDAR modules and GPS data) and dispatch's communications to update and coordinate rescue efforts in real time.

Dispatchers: use the application to coordinate victims and rescue operations (maintaining real-time collaborative situational awareness).

Shelter Ops: are looped in with info from dispatch to be aware of the needs they will have to meet.

How I built it

We walked into this hackathon expecting to make an IoT close-space mapping tool using LiDAR because we wanted to, but the sponsors' challenge and the experience one of our teammates had going through Hurricane Irma inspired us to pivot to a communication and collaboration application for response and recovery at about 6:30–7:00 PM on Saturday.

We use ThingSpace's GPS location and device messaging as well as Nexmo's SMS capability for multiple layers of redundancy, treating every channel from low-bandwidth SMS to full Internet connectivity as a failover so that connectivity doesn't limit communication.
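The failover idea can be sketched in a few lines. The channel names and send functions below are illustrative stand-ins for the real ThingSpace and Nexmo calls, not their actual APIs:

```python
def send_alert(message, channels):
    """Try each channel in priority order; fall back on failure.

    `channels` is an ordered list of (name, send_fn) pairs, e.g. an
    Internet push first, then low-bandwidth SMS via a gateway such as
    Nexmo. Returns the name of the channel that succeeded.
    """
    for name, send_fn in channels:
        try:
            send_fn(message)
            return name
        except ConnectionError:
            continue  # channel down; fall through to the next layer
    raise RuntimeError("all communication channels failed")
```

The point is that a victim's message degrades gracefully through the redundancy layers rather than being lost when one network goes down.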

Our end product showcases the positive ways that a number of sponsors' technologies integrate into our product plan and create reliability through redundancy. It's interesting that effective strategies for building a hackathon product are also effective strategies for building a disaster-response application. Below is a brief walkthrough of the sponsor technologies we used, from the perspective of the user personas provided.

Disaster Victim Persona: communicate with the outside world (rescuers/dispatchers/shelters) by checking in and indicating their status. Communicate with other victims (potentially offering help if needed and capable) using ThingSpace to find their location and ThingSpace Device Messaging to facilitate simple communication.

Primary Technologies used for this Persona:
- ThingSpace (mapping, Device Messaging)
- Nexmo (SMS receiving)
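To illustrate how compact a victim check-in can be, here is a hypothetical payload small enough to fit in a single 160-character SMS. The field names are our own shorthand for this sketch, not a real wire format:

```python
import json

def checkin_payload(lat, lon, status, needs=None):
    """Build a compact victim check-in message.

    Coordinates are rounded to 5 decimal places (~1 m precision) and
    the JSON is emitted without whitespace to keep the message short
    enough for a single SMS.
    """
    return json.dumps(
        {"lat": round(lat, 5), "lon": round(lon, 5),
         "st": status, "nd": needs or []},
        separators=(",", ":"))
```

Keeping the check-in under one SMS matters because, in the failover scheme above, SMS is the lowest-bandwidth channel the message may have to travel over.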

Disaster Response (Rescuer) Persona: with precise lat/long coordinates from ThingSpace and backup SMS pulses from Nexmo, rescuers can stay on top of disaster situations and receive efficient communications from dispatch. They can prioritize resources such as device battery power (SMS rather than Internet, low-precision GPS) or speed (high-precision GPS, quicker updates), and can map disaster zones automatically with the Helpmet, which automates searches and integrates the data into available databases.

Primary Technologies:
- LiDAR (Sweep, Qbotics Labs) integrated helmet ("Helpmet")
- ThingSpace (mapping, Device Messaging)

Secondary Technologies:
- Nexmo (SMS receiving)
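The core of the Helpmet's mapping step is converting the spinning LiDAR unit's polar samples into points in the world frame. A minimal sketch of that conversion follows; the sample format and the pose input are assumptions for illustration, not the actual Sweep SDK:

```python
import math

def scan_to_points(samples, pose=(0.0, 0.0, 0.0)):
    """Convert polar LiDAR samples to world-frame 2D points.

    samples: iterable of (angle_deg, distance_cm) pairs, as a spinning
             unit like the Scanse Sweep reports them (illustrative).
    pose:    rescuer (x, y, heading_rad) from GPS/odometry.
    """
    x0, y0, heading = pose
    points = []
    for angle_deg, dist_cm in samples:
        theta = heading + math.radians(angle_deg)
        d = dist_cm / 100.0  # centimetres -> metres
        points.append((x0 + d * math.cos(theta),
                       y0 + d * math.sin(theta)))
    return points
```

With points in a shared world frame, scans from multiple rescuers can be merged into one map, which is what makes the automated-search idea work.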

Dispatcher Persona: ThingSpace (through tools like Freeboard) allows for real-time interactive dashboards and visualization with flexible input from devices and other data sources. Dispatchers are free to work from any Internet-connected device if necessary, which potentially allows merging of the rescuer/dispatcher roles for maximum flexibility. Stretch goals include implementing machine learning to assist dispatch by sorting the increased traffic by severity, using the same logic and standard practices that dispatchers currently use.

Primary Technologies:
- ThingSpace (Freeboard, mapping)
- Nexmo (SMS sending)
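The severity-sorting stretch goal could start as something as simple as a keyword heuristic before any machine learning is involved. The keywords and weights below are placeholders, not an actual dispatch protocol:

```python
# Illustrative severity weights; a real system would encode the triage
# protocols that dispatchers actually use.
SEVERITY_KEYWORDS = {
    "trapped": 3, "injured": 3, "medical": 3,
    "flooding": 2,
    "no power": 1, "supplies": 1,
}

def triage(messages):
    """Order incoming check-ins by rough severity score, highest first."""
    def score(msg):
        text = msg.lower()
        return max((w for k, w in SEVERITY_KEYWORDS.items() if k in text),
                   default=0)
    return sorted(messages, key=score, reverse=True)
```

A learned model could later replace `score` without changing how the dashboard consumes the sorted queue.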

Shelter Persona: shelter operators receive information passed on from dispatchers, but also need to communicate with logistics to report their supply situation and the status of people in the shelter, and to receive direction on how to house or move those in refuge. This role requires a combination of the dispatcher and victim views.

Primary Technologies:
- ThingSpace (mapping, Device Messaging)
- Nexmo (SMS sending)

Secondary Technologies:
- ThingSpace (Freeboard)

Challenges I ran into

Working with SLAM, odometry, and other real-time mapping technology presented significant hardware complexity.

3D printing took several iterations to account for shrinkage, learning how to use a borrowed 3D printer, and integrating 3D printed parts into the overall design.

Much of the pertinent machine learning and SLAM material we needed is predominantly academic, and often obscured by jargon and language barriers.

Our pivot took up a fair amount of time, but it was better to wind up in the right place than to have time to complete a short-sighted product.

Our initial plan for this hackathon was to make an IoT close-space mapping tool using LiDAR, but we lacked more than a general use case for the concept – we struggled to effectively repurpose the Simultaneous Localization and Mapping (SLAM) algorithms and mapping integration from our prior plans for emergency rescue in a way that made sense. Thanks to the sponsors' challenge, we were able to focus on a specific use case and context for our hardware hack to work within. SLAM's primary use case is in sensors for self-driving, autonomously navigating vehicles, but we repurposed it for a human-scale, emergency-response context.
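To make the repurposing concrete, here is a toy occupancy-grid update that assumes the pose is already known. This is only the mapping half of SLAM; the hard part, which the academic literature covers, is estimating the pose and the map jointly:

```python
def occupancy_grid(points, cell=0.5, size=8):
    """Mark LiDAR return points into a coarse 2D occupancy grid.

    points: (x, y) returns in metres, in the world frame.
    cell:   grid resolution in metres per cell.
    size:   grid is size x size cells; points outside are discarded.
    """
    grid = [[0] * size for _ in range(size)]
    for x, y in points:
        col, row = int(x / cell), int(y / cell)
        if 0 <= row < size and 0 <= col < size:
            grid[row][col] = 1  # cell contains an obstacle return
    return grid
```

In a vehicle, such grids feed path planning; in our context, the same grid marks searched versus unsearched space around a rescuer.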

Accomplishments that I'm proud of

We pivoted late, overcame hardware shortcomings to display how some emergent technologies can be used, worked through an intense time crunch, and performed in-depth UX research for product development and maturation.

What I learned

This was our first time working with LiDAR and IoT.

We learned how to 3D print parts to mock up field use for our hardware.

We discovered a lot of "slick" development tools and services to integrate into a larger scale vision than what we would normally try at a hackathon.

We really learned that algorithms are hard and that probabilistic computation is much more widespread than we thought.

Dealing with noise in machine learning is difficult.

Websockets are cool, computers are stupid, etc.

What's next for Huckleberry

We want to work with matrix-based, solid-state 3D LiDAR systems to put LiDAR mapping on par with computer-vision helmet-cam footage in both feature set and hardware capability.

We would like to continue to mature the project – 24 hours really isn't enough time to build a project of this scale, and 12 hours absolutely isn't enough time to do much more than whet one's appetite.

Integrating mobile-based Bluetooth Low Energy beaconless-beacon tech would enhance search and rescue operations and extend our IoT capability.
