The idea was to take the freshly released G-NAF (Geocoded National Address File) data and turn it into something useful for people or companies.

What it does

It turns a large volume of addresses into a system that can be searched quickly and simply, returning a matching address, including its location.

How I built it

I loaded up the data into a database, built appropriate data structures and indices, and configured an API to provide access to it.
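To make the shape of this concrete, here is a minimal sketch of the load-index-search pipeline. It uses SQLite as a stand-in for the project's Postgres database, and the table layout, column names, and sample addresses are illustrative assumptions, not the actual G-NAF schema:

```python
# Sketch of the pipeline: load addresses, build an index, expose a search
# function. SQLite stands in for PostgreSQL; schema and rows are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE address (
        id INTEGER PRIMARY KEY,
        full_address TEXT NOT NULL,
        latitude REAL,
        longitude REAL
    )
""")
conn.executemany(
    "INSERT INTO address (full_address, latitude, longitude) VALUES (?, ?, ?)",
    [
        ("1 EXAMPLE STREET EXAMPLETON NSW 2000", -33.86, 151.21),
        ("2 SAMPLE ROAD SAMPLEVILLE VIC 3000", -37.81, 144.96),
    ],
)
# Index the searched column so prefix lookups don't scan the whole table.
conn.execute("CREATE INDEX idx_address_full ON address (full_address)")

def search(query, limit=10):
    """Case-insensitive prefix search returning address plus coordinates."""
    rows = conn.execute(
        "SELECT full_address, latitude, longitude FROM address "
        "WHERE full_address LIKE ? ORDER BY full_address LIMIT ?",
        (query.upper() + "%", limit),
    )
    return rows.fetchall()
```

In Postgres itself, a trigram index (the `pg_trgm` extension) would be a natural fit for fuzzy address matching at this scale, but the prefix version above shows the basic idea.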

Challenges I ran into

Data volume. While search works fine on the 'alias' type of addresses, the 'principal' addresses are still being set up. It's a very large table, and Postgres may run out of space or memory.

Accomplishments that I'm proud of

Well, it works.

What I learned

Some things that shouldn't have been hard turned out to be, like getting CORS working on the server using Docker images. I spent far too long trying to fix that and ended up using native nginx instead...
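For reference, the nginx side of this comes down to a few response headers plus a preflight handler. This is a hypothetical sketch, not the project's actual config; the location path, origin, and upstream port are placeholders:

```nginx
# Placeholder CORS config: allow browser clients on other origins to call
# the API. Path, origin, and upstream are assumptions for illustration.
location /api/ {
    add_header Access-Control-Allow-Origin "*" always;
    add_header Access-Control-Allow-Methods "GET, OPTIONS" always;

    # Answer preflight requests directly instead of proxying them.
    if ($request_method = OPTIONS) {
        return 204;
    }

    proxy_pass http://localhost:8080/;
}
```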

What's next for Gnaf Address Search

I'll need to add the rest of the addresses, and then see how it can be made useful to service providers. Perhaps they want addresses split up into components, such as house number and street name/street type. This service could provide a single text box for users to find their address, and return a data structure that a website can use to fill in a variety of fields. Automating the infrastructure, as well as simplifying the database, would enable it to scale and to handle millions of queries.
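The component-splitting idea could look something like this. The field names and the regex are illustrative assumptions (G-NAF actually ships these components pre-split), but it shows the kind of structure a website could consume:

```python
# Hypothetical address splitter: turn a flat address line into named
# components for form auto-fill. The street-type list is deliberately
# tiny; a real version would cover the full G-NAF street-type set.
import re

PATTERN = re.compile(
    r"^(?P<number>\d+[A-Z]?)\s+"
    r"(?P<street_name>.+?)\s+"
    r"(?P<street_type>STREET|ST|ROAD|RD|AVENUE|AVE|LANE|LN)$",
    re.IGNORECASE,
)

def split_address(line):
    """Return a dict of address components, or None if the line doesn't parse."""
    m = PATTERN.match(line.strip())
    return m.groupdict() if m else None
```

A single search box could then return both the display string and this dict, letting the client fill separate number/street/type fields.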

Built With

postgres, docker, nginx
