Inspiration

URL shorteners are an everyday tool when surfing the web. Especially when a link encodes a lot of information, for example a search you performed on a site, the URL can get incredibly long. Since members of the Hawaii State Government aren't allowed to use existing commercial shorteners, a secure and easy-to-use solution had to be found. This is exactly what this project achieves.

The design of the compression dialog form was inspired by Discord's design for server invites. Limited uses as well as an expiration date are great additions that set this URL compressor apart.

What it does

The website is divided into three major components: header, footer, and the main panel. The header includes the website's title and the logo (it's a logo, not a button). The footer provides a link to the GitHub repository. The main panel is the focal point of interaction. The first input line is for the URL to be compressed. It only accepts valid URLs (include the http:// or https://). The second field determines the time until the shortened link expires. The default value is 12 hours, counted from the moment you hit compress. It's a dropdown, and you have to select one of its options. The third field limits the maximum number of uses for a link. It is set to "No limit" by default. If it is set to any other value, the link can only be used that many times, which is great for e.g. one-time access.

Afterwards you just hit the "Compress URL!" button and your link is generated. It will appear in the field "Compressed URL". If you click on it, the whole link is automatically selected. Otherwise, you can hit the "Copy!" button and the link will be copied to your clipboard. This is indicated by the button's text changing to "Copied!" for two seconds. The website works on mobile devices as well, in portrait and landscape mode.
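For illustration, the copy-to-clipboard feedback could be implemented roughly like this (element IDs and structure are hypothetical and may not match the actual project):

```javascript
// Minimal sketch of the "Copy!" button behavior described above.
const copyButton = document.querySelector('#copy-button');   // assumed id
const resultField = document.querySelector('#compressed-url'); // assumed id

copyButton.addEventListener('click', async () => {
  // Copy the generated short link to the clipboard.
  await navigator.clipboard.writeText(resultField.value);

  // Indicate success by swapping the label for two seconds.
  const originalLabel = copyButton.textContent;
  copyButton.textContent = 'Copied!';
  setTimeout(() => { copyButton.textContent = originalLabel; }, 2000);
});
```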

To access the demo, simply click on the provided link.

How we built it

A URL compressor works by mapping a long URL onto a shorter one. How to map these URLs is up to the system's designer.
In this project, the core idea is to map each incoming URL onto a six-character base62 string. Furthermore, users have the option to set expiry dates for their links as well as to limit their number of uses.

Project structure

For clarity, the project has been divided into client and server code. (Even though, for deployment purposes, the built client application is bundled into the server.) The project follows the model-view-controller (MVC) design pattern. Using this pattern allows for easy extensibility as well as improved readability: a place for everything, and everything in its place.

Client

The client code only handles the website as well as sending the user's request for a new shortened link to the server. No critical functionality is executed client-side. The website itself broadly follows the State of Hawaii's Web Style Guide and is inspired by the Makai theme. Furthermore, it is optimized for mobile devices.
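A client request might look roughly like the following sketch. The endpoint name, parameter names, and response shape are assumptions for illustration, not the project's actual API:

```javascript
// Hypothetical client-side request for a new shortened link.
async function compressUrl(longUrl, expiresInHours, maxUses) {
  const params = new URLSearchParams({
    url: longUrl,
    expiresIn: expiresInHours,
    maxUses: maxUses ?? 'unlimited',
  });
  const response = await fetch(`/api/compress?${params}`, { method: 'POST' });
  if (!response.ok) throw new Error('Compression request failed');
  return response.json(); // e.g. { shortUrl: 'https://.../abc123' }
}
```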

Server

The server is essentially an Express REST API. On creation requests, the input constraints are checked, and if they are met, a new shortened URL is generated and stored in the database. On shortened-URL requests, the specified value is looked up in the database. If it is found, the user is redirected to the associated URL. If not, the user is redirected back to the main page and receives an error message. On redirect to the associated URL, the use count is decremented by one, and the entry is deleted if no uses are left.
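A rough sketch of that redirect handler is shown below. The route path, database helpers, and column names are assumptions made for illustration:

```javascript
// Sketch of the lookup-and-redirect flow described above.
app.get('/:id', async (req, res) => {
  const entry = await db.findById(req.params.id); // placeholder helper
  if (!entry) {
    // Unknown or expired id: send the user back with an error message.
    return res.redirect('/?error=not-found');
  }
  if (entry.usesLeft !== null) {
    entry.usesLeft -= 1;
    // Delete the entry once its last use is consumed.
    if (entry.usesLeft <= 0) await db.deleteById(entry.id);
    else await db.update(entry);
  }
  res.redirect(entry.url);
});
```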

The following two sections describe two critical elements of the system in greater detail:

Random ID

To compress the URL, we represent each site as a six-character value in base62. This allows for up to 62^6 = 56,800,235,584 distinct addresses. Furthermore, base62 is still easily readable and typeable. Current estimates put the number of websites globally at roughly 2 billion, so this is more than sufficient. Generation of these values is handled server-side to prevent tampering such as intentional collisions. To catch naturally occurring collisions, a request for the associated redirect is made after a value is generated; should this not return a 404, the value is regenerated. This procedure is still efficient, because natural collisions should be fairly rare.
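In code, the generate-then-check loop could look like this minimal sketch, where `existsInDb` stands in for the redirect lookup described above:

```javascript
const ALPHABET =
  '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz';

// Generate a random six-character base62 id.
function randomId(length = 6) {
  let id = '';
  for (let i = 0; i < length; i++) {
    id += ALPHABET[Math.floor(Math.random() * ALPHABET.length)];
  }
  return id;
}

// Retry until the id is free. Because natural collisions are rare,
// this loop almost always finishes on the first pass.
async function uniqueId() {
  let id;
  do {
    id = randomId();
  } while (await existsInDb(id)); // placeholder for the database lookup
  return id;
}
```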

Data storage

The project utilizes SQLite for the database. SQLite is a lightweight SQL database. The database contains only one table. Besides the URL, each entry holds the assigned shorthand ID value, an expiration date, and a use counter. By default, links are set to expire after twelve hours and can be used an unlimited number of times. Every time a shortened URL is accessed, the count is decremented by one (if a finite count was set). Once an entry has expired or run out of uses, the corresponding row in the database is automatically deleted. This limits the database's size, the chance of value collisions, and misuse for the propagation of harmful URLs.
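The single table could be laid out roughly as follows. This sketch assumes the better-sqlite3 driver and invents the table and column names; the actual project may differ:

```javascript
const Database = require('better-sqlite3'); // assumed driver
const db = new Database('links.db');

// One table holds everything: the short id, the target URL, the
// expiry timestamp, and the remaining-use counter (NULL = unlimited).
db.exec(`
  CREATE TABLE IF NOT EXISTS links (
    id         TEXT PRIMARY KEY,   -- six-character base62 value
    url        TEXT NOT NULL,      -- the original long URL
    expires_at INTEGER NOT NULL,   -- unix timestamp, default now + 12h
    uses_left  INTEGER             -- NULL means "No limit"
  )
`);

// Expired rows can be purged periodically to keep the table small.
db.prepare('DELETE FROM links WHERE expires_at < ?').run(Date.now());
```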

Challenges we ran into

During the course of this project I ran into three major challenges.

Connecting the frontend to the backend

The first real challenge to overcome was establishing a connection between the front end and back end. This proved quite difficult at the start because my requests from the client weren't coming through to the server, even though posting to the server with Postman worked just fine. I overcame this challenge by sending the data as query parameters.
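On the server side, the query-parameter approach can look like this sketch (route and parameter names are assumptions). One convenient property is that Express parses the query string automatically, so no body-parsing middleware is involved:

```javascript
// Reading the creation request from the query string instead of the body.
app.post('/api/compress', (req, res) => {
  const { url, expiresIn, maxUses } = req.query;
  // ... validate the inputs and create the shortened link ...
  res.json({ id: 'abc123' }); // illustrative response
});
```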

CORS

The second major obstacle was the port difference between the client and the server. This resulted from implementing the client as a Vue application. Because Vue's development server hosts itself, I had to host the server on a different port. When I tried to invoke a redirect from the server, I'd get a CORS error. CORS (cross-origin resource sharing) is a browser mechanism that blocks programmatic loading of URLs from origins the server hasn't explicitly approved. I eventually overcame it by redirecting only from and to the server. In production, however, both client and server are run by the same process on the same port, so there it isn't an issue anymore.
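For reference, a common dev-time alternative (not the fix used in this project) is to proxy API calls through the Vue dev server, so the browser only ever talks to one origin and CORS never triggers. The server port below is an assumption:

```javascript
// vue.config.js: route /api calls through the dev server to the backend.
module.exports = {
  devServer: {
    proxy: {
      '/api': { target: 'http://localhost:3000' }, // assumed server port
    },
  },
};
```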

Reuniting server and client

Which brings us to the last major challenge: reuniting the server with the client so that everything could run on a single port. Luckily, Vue provides the functionality to build your application and use the generated files on their own, bundling all the node dependencies into one file. I only had to serve the generated files from the server, and everything went smoothly on the first try.
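Serving the built Vue output from Express can be done roughly like this (paths and port are assumptions, not the project's actual layout):

```javascript
const path = require('path');
const express = require('express');
const app = express();

// Serve the files produced by the Vue build step.
app.use(express.static(path.join(__dirname, '../client/dist')));

// Fall back to index.html so the single-page app handles its own routes.
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, '../client/dist/index.html'));
});

app.listen(3000); // assumed port
```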

Accomplishments that we're proud of

In this project, I'm really proud of two things in particular. Firstly, the codebase turned out fairly small. Especially since I used a framework like Vue (which is not necessarily lightweight), I was surprised how compact everything got. The second thing I'm really proud of is that, even though the project is that small, I stuck to my design pattern rigorously. Smaller projects often invite you to get a bit sloppy with the application of design patterns; it is really tempting to just put everything in one (not that) big file. That I stayed the course and carried the design pattern through the whole project is something I'm personally proud of. (Largely because I tended to get a bit loose with my patterns on former projects.)

What we learned

In total, I learned how to develop and deploy a full-stack application using Node.js. (I've developed full-stack apps before, but using PHP and plain JavaScript.) Doing all of this in such a short time felt like a crash course. I especially liked designing the API; it was surprisingly easy and elegant. Furthermore, I learned that next time I won't have two separate Node.js applications. Instead, I'll create only one, containing both the client and the server. That way I'll be able to access data more efficiently and easily. Moreover, it removes redundancies like having to keep two config files, one for the server and one for the client.

What's next for Orangehill - URLCompressor

The next step in the development of "Orangehill - URLCompressor" would be an upgrade of the user interface. Right now it is rather plain and might not conform completely to the state's design guidelines. Especially the header and footer could use some work. (As you might have noticed, I'm no expert in website styling.) Furthermore, authentication and admin tools can easily be added thanks to the modular design approach. Apart from this, testing and bug hunting would be on the agenda. Tests are completely missing from the current implementation, which is bad practice (but was caused by the short development time). Tests would help with finding bugs and provide safety when expanding the project further; they are a great tool for keeping existing functionality intact.
