Inspiration and Product

There's a certain feeling we all have when we're lost. It's a combination of apprehension and curiosity, and it usually drives us to explore and learn more about what we see. Yet there's a huge disconnect between what we see around us and what we know: the building in front of us might look like a famous historic structure, but we might not understand its significance until we read about it in a book, by which time we've lost the ability to experience it visually.

Insight gives you actionable information about your surroundings in a visual format that lets you immerse yourself in them, whether you're exploring or finding your way through. The app puts the true directions of the places around you where you can see them, and shows you descriptions of them as you turn your phone around. Need directions to one of them? Get them without leaving the app. Insight also supports deeper exploration of what's around you: everything from restaurant ratings to the history of the buildings you're near.


  • View places around you heads-up on your phone - as you rotate, your field of vision changes in real time.
  • Facebook Integration: trying to find a meeting or party? Call your Facebook events into Insight to get your bearings.
  • Directions, wherever, whenever: surveying the area and found where you want to be? Touch it and get directions instantly.
  • Filter events based on your location. Want a tour of Yale? Touch to filter only Yale buildings, and learn about the history and culture. Want to get a bite to eat? Change to a restaurants view. Want both? You get the idea.
  • Slow day? Shrink your radius to filter out far-off locations. Feeling adventurous? Widen your field of vision the other way.
  • Want to get the word out on where you are? Automatically check in with Facebook at any of the locations you see around you, without leaving the app.


High-Level Tech Stack

  • Node.js powers a RESTful API hosted on Microsoft Azure.
  • The API server takes advantage of a wealth of Azure's computational resources:
    • A Windows Server 2012 R2 instance and an Ubuntu 14.04 (Trusty) instance, each of which handles a different batch of geospatial calculations
    • Azure internal load balancers
    • Azure CDN for asset pipelining
    • Azure automation accounts for version control
  • The Bing Maps API suite, which offers powerful geospatial analysis tools:
    • RESTful services such as the Bing Spatial Data Service
    • Bing Maps' Spatial Query API
    • Bing Maps' AJAX control, externally through direction and waypoint services
  • iOS Objective-C clients interact with the server RESTfully and display results as they're parsed
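To make the exchange concrete, here's a rough sketch of the request and response shapes passed between the iOS client and the Node server. The field names and values are illustrative assumptions, not the app's actual wire format:

```javascript
// Hypothetical request/response shapes for the client-server exchange.
// Field names and values are illustrative, not the app's actual API.

// What the iOS client might send after reading the device's GPS:
var request = {
  latitude: 41.3163,       // user's current position (New Haven)
  longitude: -72.9223,
  radiusMeters: 500,       // user-selected search radius
  filters: ['restaurants']
};

// What the Node server might return after its geospatial pass:
var response = {
  places: [
    {
      name: 'Sterling Memorial Library',
      distanceMeters: 210,  // haversine distance from the user
      headingDegrees: 74    // bearing, used to place the overlay on screen
    }
  ]
};
```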

Application Flow

iOS handles the entirety of the user interaction layer and authentication layer for user input. Users open the app, and, if logging in with Facebook or Office 365, proceed through the standard OAuth flow, all on-phone. Users can also opt to skip the authentication process with either provider (in which case they forfeit the option to integrate Facebook events or Office365 calendar events into their views).

After sign-in (assuming the user grants permission to use these resources), and upon startup of the camera, requests are sent with the user's current location to a central server on an Ubuntu box on Azure. The server parses that location data and initiates multithreaded Node processes on the Windows Server 2012 R2 instances. These processes do the following, and more:

  • Geospatial radial search schemes with data from Bing
  • Location detail API calls from Bing Spatial Query APIs
  • Review data about relevant places from a slew of APIs
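In rough outline, the server-side fan-out might look like the sketch below: fire the geospatial queries in parallel and join the results. The three fetcher functions are stand-ins for the real Bing and review-API calls, which aren't reproduced here:

```javascript
// Sketch of the parallel fan-out. The fetchers below are stubs standing in
// for the real Bing radial search, Bing Spatial Query, and review-API calls.
function fetchRadialSearch(loc)  { return Promise.resolve([{ id: 'a', name: 'Place A' }]); }
function fetchPlaceDetails(loc)  { return Promise.resolve({ a: { category: 'library' } }); }
function fetchReviewData(loc)    { return Promise.resolve({ a: { rating: 4.5 } }); }

function gatherPlaceData(loc) {
  // Run the three lookups concurrently, then merge per place.
  return Promise.all([
    fetchRadialSearch(loc),
    fetchPlaceDetails(loc),
    fetchReviewData(loc)
  ]).then(function (results) {
    var places = results[0], details = results[1], reviews = results[2];
    return places.map(function (p) {
      return Object.assign({}, p, details[p.id], reviews[p.id]);
    });
  });
}
```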

After the data is all present on the server, it's combined and analyzed, also on R2 instances, via the following:

  • Haversine calculations for distance measurements, in accordance with radial searches
  • Heading data (to make client side parsing feasible)
  • Condensation and dynamic merging: asynchronously cross-checking the collected data to determine which events are closest
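The heading data comes down to the standard initial-bearing formula between two coordinates; a minimal sketch (the function names are ours, not from the codebase):

```javascript
// Initial great-circle bearing from point 1 to point 2, in degrees
// clockwise from north -- the number the client needs to place an
// overlay in the user's field of view. Inputs are in degrees.
function toRadians(deg) { return deg * Math.PI / 180; }

function initialBearing(lat1, lon1, lat2, lon2) {
  var phi1 = toRadians(lat1), phi2 = toRadians(lat2);
  var dLambda = toRadians(lon2 - lon1);
  var y = Math.sin(dLambda) * Math.cos(phi2);
  var x = Math.cos(phi1) * Math.sin(phi2) -
          Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLambda);
  // atan2 yields (-180, 180]; normalize to [0, 360)
  return (Math.atan2(y, x) * 180 / Math.PI + 360) % 360;
}
```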

Ubuntu brokers and manages the data, sends it back to the client, and prepares for and handles future requests.

Other Notes

  • The most intense calculations involved the application of the haversine formula: for two points on a sphere with latitudes φ₁ and φ₂, latitude difference Δφ, and longitude difference Δλ, the haversine of the central angle between them can be described as:

a = sin²(Δφ/2) + cos(φ₁) · cos(φ₂) · sin²(Δλ/2)

and the distance (with Earth radius R) as:

d = 2R · atan2(√a, √(1 − a))

(the result of which is non-standard/non-Euclidean due to the Earth's curvature). The results of these formulae translate into the placement of locations on the viewing device.
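For reference, a cleaned-up sketch of the haversine distance computation in Node (R is the mean Earth radius in meters; inputs are in degrees):

```javascript
// Haversine great-circle distance between two lat/lon points, in meters.
var R = 6371000; // mean Earth radius in meters

function toRad(deg) { return deg * Math.PI / 180; }

function haversineDistance(lat1, lon1, lat2, lon2) {
  var dPhi = toRad(lat2 - lat1);
  var dLambda = toRad(lon2 - lon1);
  var a = Math.sin(dPhi / 2) * Math.sin(dPhi / 2) +
          Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
          Math.sin(dLambda / 2) * Math.sin(dLambda / 2);
  var c = 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a)); // central angle
  return R * c;
}
```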

These calculations are handled by the Windows R2 instance, essentially running as a computation engine. All communications are RESTful between all internal server instances.

Challenges We Ran Into

  • iOS and rotation: there are a number of limitations in iOS that prevent interaction with the camera in landscape mode (a real problem, given the need for users to see a wide field of view). For one thing, the requisite data registers aren't even accessible via daemons when the phone is in landscape mode. This was the root of the vast majority of our iOS problems: since we couldn't rotate any inherited or pre-made views, we had to build all of our views from scratch.
  • Azure deployment specifics with Windows Server 2012 R2: a pure calculation engine (written primarily in C# with ASP.NET network-interfacing components) was tricky at times to set up and to get logging data from.
  • Simultaneous and asynchronous analysis: Simultaneously parsing asynchronously-arriving data with uniform Node threads presented challenges. Our solution was ultimately a recursive one that involved checking the status of other resources upon reaching the base case, then using that knowledge to better sort data as the bottoming-out step bubbled up.
  • Deprecations in Facebook's Graph API: we needed the Graph API to query specific Facebook events for their locations, a feature only available in a slightly older version of the API. We thus had to use that version concurrently with the newer one (which also had unique location-related features we relied on), which created some confusion and required care.
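The recursive approach to simultaneous analysis described above can be sketched roughly as follows; the data shapes and function names are invented for illustration:

```javascript
// Rough sketch of the recursive merge: a worker recurses through its own
// batch, and at the base case consults what the sibling workers have
// produced so far, using that to sort as the recursion bubbles back up.
// Data shapes and names are illustrative, not from the codebase.
function mergeBatch(batch, siblingResults, acc) {
  if (batch.length === 0) {
    // Base case: fold in the siblings' results and sort the whole
    // collection by distance before returning up the call stack.
    return acc.concat(siblingResults).sort(function (x, y) {
      return x.distance - y.distance;
    });
  }
  // Recursive step: accumulate the head of the batch and recurse on the tail.
  return mergeBatch(batch.slice(1), siblingResults, acc.concat([batch[0]]));
}
```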

A few of Our Favorite Code Snippets

A few gems from our codebase:

var deprecatedFQLQuery = '...

The story: in order to extract geolocation data from events vis-a-vis the Facebook Graph API, we were forced to use a deprecated API version for that specific query, which proved challenging in how we versioned our interactions with the Facebook API.

addYaleBuildings(placeDetails, function(bulldogArray) {
    addGoogleRadarSearch(bulldogArray, function(luxEtVeritas) {
        // ...
    });
});

The story: dealing with quite a lot of Yale API data meant we needed to be creative with our naming...

// R is the earth's radius in meters
var a = R * 2 * Math.atan2(
    Math.sqrt(
        Math.sin((Math.PI / 180) * (latitude2 - latitude1) / 2) * Math.sin((Math.PI / 180) * (latitude2 - latitude1) / 2) +
        Math.cos((Math.PI / 180) * latitude1) * Math.cos((Math.PI / 180) * latitude2) *
        Math.sin((Math.PI / 180) * (longitude2 - longitude1) / 2) * Math.sin((Math.PI / 180) * (longitude2 - longitude1) / 2)
    ),
    Math.sqrt(1 - (
        Math.sin((Math.PI / 180) * (latitude2 - latitude1) / 2) * Math.sin((Math.PI / 180) * (latitude2 - latitude1) / 2) +
        Math.cos((Math.PI / 180) * latitude1) * Math.cos((Math.PI / 180) * latitude2) *
        Math.sin((Math.PI / 180) * (longitude2 - longitude1) / 2) * Math.sin((Math.PI / 180) * (longitude2 - longitude1) / 2)
    ))
);

The story: though it was changed and condensed shortly after we noticed its proliferation, our implementation of the haversine formula became cumbersome very quickly. Degree/radian mismatches between APIs didn't make things any easier.
