Disasters, natural or man-made, can be terrifying for the people caught in them. Helplessness creeps in, along with the need to be rescued and brought out of the horrifying experience. This motivated us to develop an application that places a remote node at a disaster location, identifies the people trapped on the premises, and sends back a live stream so that responders can be better prepared for disaster management and rescue operations. It combines unmanned-vehicle technology with the radio technology provided by Freewave to address disaster-management use cases such as the California forest fire incidents.

What it does

To support better disaster-management strategies, we connect an RPi Camera and a FLIR camera to a Raspberry Pi: the RPi Camera supplies a still image for facial recognition, and the FLIR camera supplies a video stream of the premises, both served by a web application running on a Flask server. The web application displays essential characteristics derived from the still image, such as ethnicity, gender, and age, and it identifies the temperature profile of the location from the live video stream.
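As an illustration of the attribute-extraction step, the sketch below uses the open-source DeepFace library; the choice of library and the file name latest_rpi_frame.jpg are assumptions for illustration, not details fixed by the prototype.

    # Sketch of the attribute-extraction step, assuming the open-source
    # DeepFace library (the writeup does not fix the recognition stack,
    # so this is an illustrative choice).
    from deepface import DeepFace

    def analyze_still(image_path):
        # Estimates age, gender, and ethnicity for every face in the frame.
        return DeepFace.analyze(img_path=image_path,
                                actions=["age", "gender", "race"])

    if __name__ == "__main__":
        print(analyze_still("latest_rpi_frame.jpg"))  # hypothetical file name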

How we built it

Development Language: Python

The prototype comprises two Freewave Dev Kits: one connected directly to Ethernet, and one operating as a remote node. The remote node is connected to a Raspberry Pi with two cameras attached: the RPi Camera and the FLIR camera from SparkFun. To be precise, the chain is as follows:

         Raspberry Pi 1                ->   Remote Freewave   ->   Freewave Node 2           ->   Remote Flask server running
         (with RPi Camera and FLIR)         Node 1                 (connected to Ethernet)        the web application

Raspberry Pi 1 sends 120 frames of IR imagery and 1 RPi Camera frame (used for facial recognition) over UDP to the remote Freewave Node 1. Node 1 relays the frames over UDP to the central Freewave Node 2, which forwards them, again over UDP, to a server running on a host machine. The server streams the frames on the web page, where they can later be used for planning rescue operations and gathering data about the people in the disaster. The 120 IR frames are merged to generate a live stream of the premises.
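A minimal sketch of that UDP transport in Python. The addresses, port, and 4-byte sequence header are hypothetical; a raw Lepton frame (80x60 16-bit pixels, 9,600 bytes) fits comfortably in a single datagram.

    import socket
    import struct

    NODE1_ADDR = ("192.168.1.50", 5005)   # hypothetical address of Freewave Node 1

    def send_frames(frames):
        # Sender side (Raspberry Pi 1): tag each raw frame with a sequence
        # number so the receiver can reorder datagrams that arrive late.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for seq, frame in enumerate(frames):
            sock.sendto(struct.pack("!I", seq) + frame, NODE1_ADDR)

    def recv_frames(port=5005):
        # Receiver side (host machine): yield (seq, payload) pairs as
        # datagrams arrive from the radio link.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", port))
        while True:
            data, _ = sock.recvfrom(65535)
            yield struct.unpack("!I", data[:4])[0], data[4:]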

We also connected a joystick to another Raspberry Pi to control the movements of the cameras mounted on the remote node.
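A sketch of the joystick reader, assuming a pygame-compatible gamepad; the pan/tilt command format and the remote address are illustrative only. Setting SDL_VIDEODRIVER=dummy lets pygame run headless over SSH, which relates to challenge 3 below.

    import os
    os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # run pygame headless over SSH

    import socket
    import pygame

    CAMERA_NODE = ("192.168.1.51", 5006)   # hypothetical address of the camera node

    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)
    stick.init()

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        pygame.event.pump()                           # refresh joystick state
        pan, tilt = stick.get_axis(0), stick.get_axis(1)
        sock.sendto(f"{pan:.2f},{tilt:.2f}".encode(), CAMERA_NODE)
        pygame.time.wait(50)                          # ~20 updates per second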

Challenges we ran into

  1. Bandwidth limitations on the radio link
  2. I/O interfaces: the Freewave node exposes few interfaces, so a Raspberry Pi was needed to collect the camera data
  3. Joystick: pygame's requirement of a GUI to read joystick movements made debugging the code over PuTTY difficult
  4. Running the Flask server on the Freewave node was not an option, because the Flask web GUI was being hosted on localhost (see the sketch after this list)
  5. Synchronization issues caused by communication between multiple radios and wireless interfaces
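On the host side, one way to expose the merged frames is a multipart MJPEG route; the sketch below is an assumption about the plumbing (JPEG encoding via OpenCV), not the exact prototype code. Binding Flask to 0.0.0.0 rather than the default 127.0.0.1 is what makes the page reachable from other machines, per item 4 above.

    import socket

    import cv2
    import numpy as np
    from flask import Flask, Response

    app = Flask(__name__)

    def ir_frames(port=5005):
        # Yield raw IR payloads from the UDP receiver sketched earlier,
        # stripping the hypothetical 4-byte sequence header.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", port))
        while True:
            data, _ = sock.recvfrom(65535)
            yield data[4:]

    def mjpeg_stream():
        for raw in ir_frames():
            # Interpret the payload as an 80x60 16-bit Lepton frame and
            # rescale it to 8 bits for JPEG encoding.
            img = np.frombuffer(raw, dtype=np.uint16).reshape(60, 80)
            img8 = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
            ok, jpg = cv2.imencode(".jpg", img8)
            if ok:
                yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"
                       + jpg.tobytes() + b"\r\n")

    @app.route("/video")
    def video():
        return Response(mjpeg_stream(),
                        mimetype="multipart/x-mixed-replace; boundary=frame")

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8000)  # not 127.0.0.1, so other hosts can connect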

Accomplishments that we're proud of

  1. A working prototype that enables better disaster management and rescue planning
  2. Successful IR video streaming

What we learned

  1. Interfacing Freewave boards for a novel application within a span of 24 hours
  2. Processing IR camera data into meaningful temperature readings on a node with no Ethernet connection (see the sketch after this list)
  3. Demonstrating video streaming over limited bandwidth
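As an example of item 2, here is a sketch of turning raw FLIR Lepton pixels into temperatures. It assumes a radiometric Lepton with TLinear output, where each 16-bit count is in centikelvin; non-radiometric modules would need a calibration step instead.

    import numpy as np

    def raw_to_celsius(frame_bytes):
        # 80x60 16-bit counts in centikelvin -> degrees Celsius.
        raw = np.frombuffer(frame_bytes, dtype=np.uint16).reshape(60, 80)
        return raw / 100.0 - 273.15

    def hottest_spot(frame_bytes):
        # Locate the warmest pixel, e.g. to flag a possible person or fire.
        temps = raw_to_celsius(frame_bytes)
        y, x = np.unravel_index(np.argmax(temps), temps.shape)
        return (x, y), temps[y, x]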

What's next for us

  1. Include servo motors in the remote node to control the drone
  2. Improve image processing to extract more data and make effective use of the radio bandwidth

Built With

Flask, FLIR Lepton, Freewave radios, pygame, Python, Raspberry Pi
