Inspiration
We researched some of the most persistently ineffective practices in enterprises and tech companies. Traditionally, deployment is automated with scripts kept alongside the code (usually a .drone.yml) that push the application onto a web server. This is inefficient because businesses end up buying many servers to host their applications. We noticed this problem, brainstormed, and picked technologies to address it. For the second part of the project, the logging system, we talked to people at UC about server logs. They pay large amounts to collect logs for their systems, so we wanted to find a better solution, one that is free and effective at collecting relevant data.
What it does
It improves the automation process by packaging each application as a Docker image and running it in a container. The image is stored on Google Cloud Platform, and Kubernetes is told which application to deploy on one main Kubernetes cluster. This way, multiple web applications can be hosted on a single Kubernetes server, addressed roughly as 'Kubernetes:service1', 'Kubernetes:service2', and so on. The second part is our logging system: an APM server connects the logging system to the web applications, so multiple web applications can send their host and service logs to one place.
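Concretely, the pipeline looks roughly like this. The project ID, image name, service name, and ports below are placeholders, not the values we used, and the exact commands depend on how the cluster is set up:

```shell
# Build the application image and push it to Google Container Registry.
docker build -t gcr.io/my-project/service1:v1 .
docker push gcr.io/my-project/service1:v1

# Tell Kubernetes to run that image, then expose it as a named service.
kubectl create deployment service1 --image=gcr.io/my-project/service1:v1
kubectl expose deployment service1 --type=LoadBalancer --port=80 --target-port=3000
```

Repeating this with a second image gives you the second service on the same cluster.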
How we built it
We started by creating a sample JavaScript application from which we could track multiple logs and traces. We then added a script to app.js to connect the application to our Kibana logging system, testing it locally throughout the process. Once we saw that the logging system was receiving data from the app, we created dashboards and visualizations to surface the relevant data. Next we worked on the deployment process. We wrote a Dockerfile, built an image from it, and stored that image on Google Cloud Platform. As we made changes to the application, we checked whether the image was being updated; this is where version control comes into play. We then created a Kubernetes cluster from the command line to host our applications, working through a lot of setup steps. Finally, we created a new Kubernetes app for our application with 3 pods and deployed onto it; each time we change the application, the deployment updates accordingly.
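The app.js hookup can be sketched as follows. This is a minimal, dependency-free illustration of the kind of structured JSON log line our sample app emits for the logging system to pick up; the real project connects through an APM server, and the field names here (`service.name`, `trace.id`) are assumptions for illustration, not the exact schema we shipped.

```javascript
// Minimal sketch of structured logging for a Node.js app (assumed field
// names; the real project connects to Kibana via an APM server instead).
function makeLogRecord(serviceName, level, message, extra = {}) {
  return JSON.stringify({
    "@timestamp": new Date().toISOString(),
    "service.name": serviceName, // which deployed service emitted this line
    "log.level": level,
    message,
    ...extra,                    // e.g. a trace id to correlate requests
  });
}

// One log line per event; a log shipper forwards stdout to the dashboard.
const line = makeLogRecord("sample-app", "info", "user logged in", {
  "trace.id": "abc123",
});
console.log(line);
```

Emitting one JSON object per line keeps the logs easy to parse when building dashboards and visualizations on top of them.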
Challenges we ran into
- Figuring out how to connect the MongoDB database to the application was difficult, because the application uses a login page.
- Figuring out how to get VirtualBox working on a Mac so Kubernetes could run.
- Minikube kept going down, so we also had to figure out how to keep it running.
- Setting everything up across multiple Docker containers so that if one goes down, the others stay up.
- Configuring the logging system to collect data that is actually meaningful to developers, system administrators, and clients.
- Using Bootstrap in our application gave us a hard time.
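For the multiple-container challenge above, the direction we took is essentially what a Kubernetes Deployment expresses: run several replicas of a container so the service stays up when one pod dies. A hedged sketch, with placeholder names and image path:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: service1
spec:
  replicas: 3            # three pods; if one goes down, the others keep serving
  selector:
    matchLabels:
      app: service1
  template:
    metadata:
      labels:
        app: service1
    spec:
      containers:
        - name: service1
          image: gcr.io/my-project/service1:v1
          ports:
            - containerPort: 3000
```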
Accomplishments that we're proud of
- We found a way to host multiple services on one server using an automated Docker deployment process.
- We got the logging system to capture traces.
What we learned
- How to create an automated logging system from scratch.
- Working with Docker and Google Cloud Platform.
- How to store Docker images on Google Cloud Platform and use them globally for deployment.
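Storing an image on GCP and pulling it anywhere boils down to tagging it with the registry path. A sketch of the flow (the project ID and image name are placeholders):

```shell
# Let docker authenticate against Google Container Registry.
gcloud auth configure-docker

# Tag the local image with the registry path, then push it.
docker tag sample-app gcr.io/my-project/sample-app:v1
docker push gcr.io/my-project/sample-app:v1

# Any machine or cluster with access to the project can now pull the image.
docker pull gcr.io/my-project/sample-app:v1
```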
What's next for Automation Logging
- Enterprise companies can adopt this automated deployment process and logging system in their development and staging environments.
- Helping businesses monitor their systems and microservices.