Simple Projectile Locating and Aiming Tower (SPLAT) - a Close-In Weapons System (CIWS) for Short-Range Missile Defense by Vikram Bala and Andy Liu. ESE350 Spring 2022 Final Project.

GitHub Repo: https://github.com/upenn-embedded-courses/ese350s22-final-project-group4

Public Version: https://github.com/vbala29/SPLAT-CIWS

Overview: Our project is intended to be a mock example of the close-in missile defense systems used in the real world, such as the U.S.'s Phalanx CIWS. We took a computer-vision-based approach to projectile tracking, and created utilities such as an LCD radar display and a remote command website to monitor information collected about the projectile and to control the turret's defense weapon (a laser in this case). Our goals included minimizing latency in tracking the object, writing comprehensive libraries and peripheral drivers that could be used with various parameters and use cases, and ultimately creating a working demo within the time constraint of just a few weeks.


We are trying to explore the problem of close-in weapons defense against short-range missiles, small boats, and aircraft. Longer-distance measures employed by the U.S. Missile Defense Agency, such as Terminal High Altitude Area Defense and Ground-Based Midcourse Defense, are intended for ballistic missiles. However, short-range weapons like surface torpedoes and anti-ship missiles must be dealt with by different measures. One of the last-resort measures used is the Phalanx close-in weapons system (CIWS), deployed on many U.S. naval ships. It is also used in a ground-based variant, most famously at the U.S. embassy in Baghdad, Iraq, where many YouTube videos recorded it defending against a rocket attack in January of this year. The concept of the Phalanx CIWS is especially relevant now, given the heightened desire of many countries around the world, including Israel and the UAE, to defend against short-range rocket attacks from neighboring countries.

This project is interesting because we are trying to see whether we can create a miniature version of the Phalanx CIWS, using computer vision to act as the radar technology and servo motors for the precise control needed to aim at fast-moving targets—especially given the close distance of the targets to the weapon system. The purpose of our project is to create a system consisting of cameras, a turret with a laser mounted on it, and an LCD radar. The laser depicts where the turret is currently aiming, as we will not be implementing a firing mechanism on our turret. The system should not only track a projectile and display it on the radar LCD as it moves across the cameras' field of view, but also turn the turret/laser to point at the target as it moves—simulating the firing of projectiles to eliminate the target. Finally, the project contains a remote, web-based control system that displays information about the turret position and enables or disables the laser on the turret.

How to Run Code

src/main.c serves as the entry point for the code running on the Atmega328P that controls the servo motors and is connected via UART to the computer vision computer. src/LCD_main.c serves as the entry point for the code running on the Atmega328P that controls the TFT-LCD radar display and is also connected via UART to the computer vision computer. The esp8266.ino file can be uploaded directly to the ESP8266 and has no other dependencies in this repository, though the ESP8266 Web Server libraries must be appropriately downloaded.

StereoVision/3d_ball.py serves as the central code for the computer vision logic. Because of the pre-generated stereoMap.xml calibration parameters, directly running the code will start up both cameras and transmit location and angle data to both Atmega328Ps used in the system. The "COM" ports used to transmit serial information may need to be modified, as may the "left" and "right" camera indices.

To re-calibrate the cameras, run StereoVision/gen_calibration_images.py and hold up a 10-square-by-8-square black-and-white chessboard image. Be sure that it is in the frame of both cameras, and press "s" when it is. Repeat multiple times while rotating the image so that it lies in different planes. Afterwards, run StereoVision/create_cal_params.py to generate stereoMap.xml, the required calibration parameters. Finally, run StereoVision/3d_ball.py as described above, which will automatically pick up the newly generated parameters.


S.P.L.A.T CIWS Full Demo Video https://youtu.be/s5mFUPCWUQY

S.P.L.A.T CIWS Turret In Action (Short Video) https://youtu.be/M7niDRb0H9w

The Turret System with Cameras and Radar Operator LCD

The Remote Command Website used to control the turret laser and monitor the real-time position of the turret


Servo Turret System (Relevant Files: src/Custom_Servo.c, include/Custom_Servo.h, src/main.c)

We wrote a servo motor control library for the SG90 servo motor; the datasheet can be accessed here. Note that this library was limited to the use of one 8-bit and one 16-bit timer on the Atmega328P microcontroller (see the header file for the specific PWM output pins for the servos). This meant that the two servos were not able to operate with the same level of precision in movement. Using this library, in main.c the system clock was prescaled by 2 to bring the frequency to 8 MHz; however, the library can operate at any CPU frequency so long as the CPU_FREQ and PRESCALE macros in its header are changed.
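As a concrete illustration of the timing math behind such a library, the sketch below computes a 16-bit timer compare value for a target angle. This is an assumption-laden sketch, not the library's actual code: it assumes the 8 MHz post-prescale clock, a timer prescaler of 8 (so one tick per microsecond), and the SG90's nominal 1.0–2.0 ms pulse range mapped linearly onto 0–180 degrees.

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative sketch (names and pulse range are assumptions, not the
 * library's API): compute the 16-bit timer compare value for a target
 * servo angle, assuming an 8 MHz CPU clock and a timer prescaler of 8,
 * which gives one timer tick per microsecond. */
#define CPU_FREQ 8000000UL
#define PRESCALE 8UL
#define TICK_US  (PRESCALE * 1000000UL / CPU_FREQ) /* = 1 us per tick */

static uint16_t servo_compare_value(uint8_t angle_deg)
{
    /* 1000 us pulse at 0 degrees, 2000 us at 180, linear in between. */
    uint32_t pulse_us = 1000UL + (uint32_t)angle_deg * 1000UL / 180UL;
    return (uint16_t)(pulse_us / TICK_US);
}
```

At these assumed settings, 90 degrees corresponds to a 1500 µs pulse, i.e., a compare value of 1500 ticks.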

UART Serial Rx/Tx Library (Relevant Files: src/uart.c, include/uart.h)

We wrote a UART serial communication library for the Atmega328P to both receive and transmit data packets between other microcontrollers, like the ESP8266, as well as the computer vision system running on a laptop. The baud rate was set to 9600, although changing the BAUD and CPU_FREQ macros in uart.c allows the baud rate to be changed. In addition, we configured our protocol to use the 8-N-1 frame format (8 data bits, no parity bit, 1 stop bit). The library supplies the ability to enable an interrupt service routine (ISR) on the Atmega328P that is triggered when new serial data is received. We used this ISR capability to ensure low latency between the object's coordinates being transmitted and the servo motors receiving a PWM command from the microcontroller.
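For reference, the baud-rate register math behind those macros can be sketched as below. The function name and the round-to-nearest choice are ours; the formula itself follows the Atmega328P datasheet for normal-speed asynchronous mode (UBRR = f_osc / (16 × baud) − 1).

```c
#include <assert.h>
#include <stdint.h>

/* Sketch of the standard AVR baud-rate register calculation for
 * normal-speed asynchronous UART. The +8*baud term makes the integer
 * division round to the nearest value rather than truncate. */
static uint16_t ubrr_value(uint32_t f_cpu, uint32_t baud)
{
    return (uint16_t)((f_cpu + 8UL * baud) / (16UL * baud) - 1UL);
}
```

At the project's 8 MHz clock and 9600 baud this gives 51, for an actual rate of about 9615 baud (roughly 0.2% error, well within 8-N-1 framing tolerance).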

TFT-LCD Radar Display (Relevant Files: include/LCD_GFX.h, src/LCD_GFX.c, src/LCD_main.c)

We wrote an LCD graphics library for the ST7735R; the datasheet can be accessed here. This library uses the standard Serial Peripheral Interface (SPI) communication protocol to write real-time positional data about the tracked object to our LCD, making it akin to a "radar display" of objects around the turret. The red dot displayed at the bottom of the LCD in the demo video depicts the position of the turret, while the blue dot shows the position of the tracked object relative to the turret, as if one were facing in the direction the cameras point. We used a separate Atmega328P to control the LCD, due to the latency that writing to the LCD would introduce if done on the same Atmega328P being used to control the servo motor turret. The LCD-controlling Atmega328P received the positional data from the computer vision system over a UART-based serial protocol (see the UART Serial Rx/Tx Library component).
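The mapping from a tracked object's position to a dot on the radar screen can be sketched roughly as below. Everything here is an illustrative assumption, not the library's actual API: the function name, the 200 cm maximum range, and the bottom-center turret placement on a 128×160 ST7735R panel.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical illustration: map an object's horizontal offset and depth
 * (in cm, turret at the origin) onto a 128x160 screen whose bottom-center
 * pixel represents the turret. Results are clamped to the panel. */
#define LCD_W 128
#define LCD_H 160
#define MAX_RANGE_CM 200 /* assumed maximum radar range */

static void radar_pixel(int16_t x_cm, int16_t depth_cm,
                        uint8_t *px, uint8_t *py)
{
    int32_t x = LCD_W / 2 + (int32_t)x_cm * (LCD_W / 2) / MAX_RANGE_CM;
    int32_t y = (LCD_H - 1) - (int32_t)depth_cm * LCD_H / MAX_RANGE_CM;
    if (x < 0) x = 0;
    if (x > LCD_W - 1) x = LCD_W - 1;
    if (y < 0) y = 0;
    if (y > LCD_H - 1) y = LCD_H - 1;
    *px = (uint8_t)x;
    *py = (uint8_t)y;
}
```

Under these assumptions an object dead ahead at half range lands mid-screen above the turret dot; integer math keeps the routine cheap on the Atmega328P.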

Stereo Vision System (Relevant Directory: /StereoVision)

We used a two-camera stereo vision system to compute depth, a measurement necessary for accurate aiming of the tower. We calibrated the two cameras using Zhang's method, allowing us to compute depth whenever an object is detected by both cameras. This, together with measurements of horizontal and vertical distance, allowed us to calculate the pitch and yaw angles of the "projectile" relative to the laser, after offsetting for the laser's initial position. Simple trigonometric calculations on a (horizontal distance, vertical distance, depth) coordinate were used to compute these angles.

To detect and identify a ball, an HSV color mask was applied after capturing an image from each camera. Erosion and dilation were used along with a Gaussian blur to reduce background noise and interference with the filter. Finally, a circular contour was identified and treated as the outline of the ball, the center of which was used as the location of the projectile.

ESP8266 Web Server (Relevant Files: esp8266/esp8266/esp8266.ino)

We used an ESP8266 microcontroller to host a web server that supplied information to the user about the current aiming position of the turret (see media above), along with the ability to turn the laser on the turret on and off. The website was written using HTML/CSS/JS with some basic API requests from the website back to the ESP8266, so that as new data was received over UART by the ESP8266, the website would asynchronously request and receive this new data for display. The website code is stored inside the .ino file as a C string. There may have been better ways to host the code, but given the simplicity of our website and the time constraints of the project, we didn't focus on this. The ESP8266 is also wired to an input capture pin on the Atmega328P to send information for controlling the laser. In addition, we used a voltage divider to step the logic level down from 5V to 3.3V when sending serial data from the Atmega328P to the ESP8266.
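The level-shift arithmetic for that divider is shown in the sketch below. The 1 kΩ/2 kΩ resistor pair is an assumed example chosen to illustrate the ratio; the project's actual resistor values may differ, but any pair with the same 1:2 ratio yields roughly 3.3 V from a 5 V input.

```c
#include <assert.h>

/* Plain resistive divider: Vout = Vin * R_bottom / (R_top + R_bottom).
 * The serial TX line feeds R_top; the ESP8266 RX pin taps the midpoint. */
static double divider_vout(double vin, double r_top, double r_bottom)
{
    return vin * r_bottom / (r_top + r_bottom);
}
```

With the assumed 1 kΩ over 2 kΩ, a 5 V high level becomes about 3.33 V, safely within the ESP8266's 3.3 V logic tolerance, while a 0 V low level stays at 0 V.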
