Working with OpenGLES requires a solid understanding of how the API behaves under the hood. After many hours of poring over tutorials, examples and documentation, it's possible to see the forest for the trees.
What it does
In order to have a clearly defined scope, I chose the Reddit April Fools event (r/Place) as the dataset. Reddit.com held a 3-day event in which users could place a colored pixel on a 1000x1000 pixel canvas, at a rate of one pixel per user every 10 minutes. After 72 hours, users had made a total of roughly 16 million pixel placements.
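To get a feel for the scale involved, here is a back-of-envelope calculation (my own sketch, using the 504 bytes-per-event figure mentioned under the challenges below) of what naively uploading every event as its own position+color geometry would cost:

```java
public class DatasetSize {
    public static void main(String[] args) {
        long events = 16_000_000L;   // pixel placements over the 72-hour event
        long bytesPerEvent = 504L;   // naive position+color vertex layout per event
        long totalBytes = events * bytesPerEvent;
        System.out.println(totalBytes);              // 8064000000 bytes
        System.out.println(totalBytes / (1L << 30)); // ~7 GiB -- far beyond a single GL buffer
    }
}
```

At roughly 8 GB, the naive layout cannot even be allocated as one buffer, let alone fit comfortably in mobile GPU memory.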
How I built it
I found a repository of OpenGLES tutorials for Android: https://github.com/learnopengles/Learn-OpenGLES-Tutorials. I forked this repository, cleaned up some of the code, and added my own examples.
I have a separate private repository on Bitbucket that uses the code in these examples together with the reddit data.
Challenges I ran into
- Issuing a render call per event was too slow
- Vertex Buffer Objects storing position and color as floats were too large (504 bytes per event)
- Index Buffer Objects in OpenGLES 2.x use a short for indices (only ~64k unique events per buffer)
- Buffer allocation takes an integer size, so a single buffer can hold at most ~2 billion bytes
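One way to attack the buffer-size problem is a more compact interleaved vertex layout. The sketch below is my own assumption about how this could look, not the repository's actual code: two shorts for position plus four packed RGBA bytes gives 8 bytes per vertex, versus 28 bytes for three float positions and four float color components.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Hypothetical compact vertex format: x,y as shorts + packed RGBA = 8 bytes.
public class PackedVertices {
    static final int STRIDE = 8; // bytes per vertex

    static ByteBuffer pack(short[] x, short[] y, int[] rgba) {
        ByteBuffer buf = ByteBuffer.allocateDirect(x.length * STRIDE)
                .order(ByteOrder.nativeOrder()); // GL expects native byte order
        for (int i = 0; i < x.length; i++) {
            buf.putShort(x[i]);   // position, interpreted as GL_SHORT
            buf.putShort(y[i]);
            buf.putInt(rgba[i]);  // color, interpreted as 4 x GL_UNSIGNED_BYTE
        }
        buf.rewind();
        return buf;
    }
}
```

On the GL side this would be bound with glVertexAttribPointer against the same interleaved buffer, using GL_SHORT for the position attribute and normalized GL_UNSIGNED_BYTE for the color attribute, shrinking each vertex by a factor of 3.5.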
What I learned
It's very easy to use OpenGL the wrong way, and more often than not it's unclear what the actual issue is.
What's next for Visualising large datasets in OpenGLES
Techniques I haven't tried yet:
- Uniform Buffer Objects
- Instanced rendering
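Instanced rendering looks particularly promising here: the quad geometry would be uploaded once, and each event would only need per-instance attributes (it does require OpenGLES 3.0, or an extension on 2.x). A back-of-envelope estimate with my own assumed per-instance layout:

```java
// Rough budget for instanced rendering (assumed layout, not measured):
// one shared quad, plus x,y shorts and a packed RGBA color per instance.
public class InstancingBudget {
    public static void main(String[] args) {
        long events = 16_000_000L;
        long perInstance = 2 + 2 + 4; // x,y shorts + 4 color bytes = 8 bytes
        long total = events * perInstance;
        System.out.println(total / (1024 * 1024) + " MB"); // 122 MB
    }
}
```

At ~122 MB, the entire dataset's per-instance data would fit comfortably inside a single buffer, well under the ~2 billion byte integer allocation limit that bit me earlier.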