Inspiration

Companies across the globe, such as Google, Amazon, and Microsoft, develop new solutions and drive their businesses forward through an array of meetings and collaborative discussions that let workers share groundbreaking ideas, receive feedback, and find problems with the current system infrastructure.

Meetings work because they bring together people with diverse capabilities and perspectives. Different people think in different ways and look at an idea from different angles, which lets them notice problems that may previously have been overlooked.

We wanted to make sure that:

a) Anyone in a discussion can fully contribute to and understand the meeting, regardless of any disabilities.

b) No time gets wasted. Instead of someone taking notes, a system should automatically create an accurate transcript of the meeting discussion.

c) Key information is key: companies go through several meetings to reach a business conclusion, so each meeting should be automatically summarized for future use, and the important keywords discussed should be remembered as well.

d) Collaboration is important: all meeting participants should contribute equally to the discussion. There should be an active system that monitors how often each participant spoke, and whether their contributions were constructive and positive or negative and unhelpful.

_We decided to take the problem into our own hands and create DontCare2020._

What it does

DontCare2020 is an active meeting monitoring system. After being turned on, it asks each meeting participant to introduce themselves, which allows it to differentiate between all the attendees.

Then, throughout the meeting, it monitors several aspects of the ongoing discussion. These functionalities include:

  1. Using an NLP-based Google API to create an accurate transcript of the discussion, recording which sentences were spoken at what time and by which participant.

  2. For each sentence a participant contributes, it evaluates how positive and productive the contribution was. At the end of the meeting it displays the most productive participants, basing its decision on how much the person spoke, how often their contributions were actually constructive, and what their role was in the discussed project or topic (leader, engineer, hardware). It sends this data to the leader of the team holding the meeting, which can help them identify who on the team needs to improve their social skills, as well as find the people who are talking too much and preventing less experienced people from participating.

  3. Throughout the meeting, it gradually builds up a summary of what was discussed, based on what it deemed most important, along with some keywords that it thinks best sum up the discussion. This information helps track the progress of meetings toward a final idea or decision. A sketch of the per-sentence record behind these features is shown below.
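
To make the three features above concrete, here is a minimal sketch of the per-sentence record the system could maintain; the `TranscriptEntry` class and its field names are illustrative assumptions, not the exact structures we used.

```python
# Hypothetical shape of one transcript entry feeding the transcript, scoring,
# and summarization features described above (field names are illustrative).
from dataclasses import dataclass

@dataclass
class TranscriptEntry:
    timestamp: float   # seconds since the start of the meeting
    speaker: str       # name captured during the introductions
    text: str          # the sentence as transcribed
    sentiment: float   # constructive/positive score in [-1.0, 1.0]
```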

How we built it

This project has two parts: the principal one being the actual back end, and the secondary one being the front-end interface.

First of all, all recording is done from any local microphone. To produce the transcript, we used the Google Speech-to-Text API. We then overlay that transcript with the recording of the actual voices to separate and tell apart the different people talking.
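
As a rough illustration of this step, the sketch below uses the Google Cloud Speech-to-Text Python client with speaker diarization enabled; the file name, sample rate, and speaker counts are assumptions, and the actual pipeline may differ.

```python
# Sketch: transcribe a recorded meeting with speaker diarization so each word
# carries a speaker tag (google-cloud-speech client library).
from google.cloud import speech

client = speech.SpeechClient()

diarization = speech.SpeakerDiarizationConfig(
    enable_speaker_diarization=True,
    min_speaker_count=2,   # assumption: at least two attendees
    max_speaker_count=6,   # assumption: a small team meeting
)
config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",
    enable_word_time_offsets=True,
    diarization_config=diarization,
)

with open("meeting.wav", "rb") as f:   # hypothetical local recording
    audio = speech.RecognitionAudio(content=f.read())

response = client.recognize(config=config, audio=audio)

# The final result aggregates every recognized word together with its speaker tag.
for word in response.results[-1].alternatives[0].words:
    print(f"speaker {word.speaker_tag}: {word.word}")
```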

All sentences from a participant are then grouped together, and we compute the percentage of constructive feedback and pertinent talk versus irrelevant statements. We also measure each participant's average productivity.
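
A minimal sketch of that aggregation, assuming per-sentence sentiment scores from the Google Cloud Natural Language API; the exact scoring service and the 0.1 "constructive" threshold are illustrative assumptions rather than our precise implementation.

```python
# Sketch: score each sentence, then compute per-speaker constructive percentages
# and an average "productivity" score.
from google.cloud import language_v1

nl_client = language_v1.LanguageServiceClient()

def sentence_score(text: str) -> float:
    """Sentiment score in [-1.0, 1.0] for one sentence."""
    doc = language_v1.Document(content=text, type_=language_v1.Document.Type.PLAIN_TEXT)
    return nl_client.analyze_sentiment(request={"document": doc}).document_sentiment.score

def participant_stats(sentences_by_speaker: dict[str, list[str]]) -> dict[str, dict[str, float]]:
    stats = {}
    for speaker, sentences in sentences_by_speaker.items():
        scores = [sentence_score(s) for s in sentences]
        constructive = sum(1 for s in scores if s > 0.1)   # assumed threshold
        stats[speaker] = {
            "constructive_pct": 100.0 * constructive / len(scores),
            "avg_productivity": sum(scores) / len(scores),
        }
    return stats
```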

Then, to create the summary we used Python libraries including Gensim. Keyword extraction is done by ranking the popularity of individual words in the full transcript, removing all pronouns, and selecting the top five or six keywords.
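
A hedged sketch of both steps, using gensim 3.x's summarization module for the summary and a simple frequency count for the keywords; the pronoun list, the 0.2 ratio, and the choice of six keywords are assumptions based on the description above.

```python
# Sketch: extractive summary via gensim 3.x, plus frequency-ranked keywords with
# pronouns (and very short words) filtered out.
import re
from collections import Counter
from gensim.summarization import summarize  # removed in gensim 4.0, present in 3.x

PRONOUNS = {"i", "you", "he", "she", "it", "we", "they", "me", "him", "her",
            "us", "them", "my", "your", "his", "its", "our", "their", "this", "that"}

def meeting_summary(transcript_text: str) -> str:
    # Keep roughly the most important 20% of sentences (ratio is an assumption).
    return summarize(transcript_text, ratio=0.2)

def meeting_keywords(transcript_text: str, top_n: int = 6) -> list[str]:
    words = re.findall(r"[a-z']+", transcript_text.lower())
    counts = Counter(w for w in words if w not in PRONOUNS and len(w) > 2)
    return [word for word, _ in counts.most_common(top_n)]
```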

Challenges we ran into

Some of the major challenges we ran into were the following:

  1. When testing the prototype, its quality at differentiating exactly who was speaking at what time took a big plunge once more than two people were involved. To solve the problem, we layered the pre-trained NLP model from Google with two other elements: a mathematics-based algorithm to extract each sentence from the conversation, instead of feeding the model the whole recording in one go, and a rudimentary SVM to cross-validate the final prediction (see the sketch after this list).

  2. It was a complex project with many parts, and 25% of our teammates left us, meaning we had to work all night to get our prototype running properly. It also meant taking over elements we had little to no experience with, such as programming in Python and JavaScript.

  3. Several of the APIs we used did not have accurate documentation to guide us, so we had to rely on inexperienced mentors and constant trial and error.
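
For the SVM cross-check mentioned in challenge 1, a minimal sketch is below; librosa for MFCC features and scikit-learn's SVC are assumptions (our write-up does not name the exact feature set), and the intro-clip file names are hypothetical.

```python
# Sketch: fit an SVM on the introduction clips recorded at the start of the meeting,
# then use it to cross-validate the speaker label assigned to each later segment.
import numpy as np
import librosa
from sklearn.svm import SVC

def segment_features(path: str) -> np.ndarray:
    """Mean MFCC vector for one audio segment."""
    y, sr = librosa.load(path, sr=16000)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20).mean(axis=1)

# Hypothetical labelled introduction clips, one per participant.
intro_clips = {"alice": "alice_intro.wav", "bob": "bob_intro.wav", "carol": "carol_intro.wav"}
X = np.array([segment_features(path) for path in intro_clips.values()])
y = list(intro_clips.keys())

svm = SVC(kernel="rbf").fit(X, y)

def cross_check(segment_path: str, diarization_label: str) -> bool:
    """True if the SVM agrees with the diarization model's speaker label."""
    return svm.predict([segment_features(segment_path)])[0] == diarization_label
```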

Accomplishments that we're proud of

We are very proud to have fully completed all the features we wanted to put into the project, with everything working. Our application is not close to finished, but we have a good proof of concept. We also like how we managed to run all our processing 100% live, without a GPU.

What we learned

At the hackathon we had the pleasure of learning from several experienced mentors about different programming techniques we could use, as well as which libraries may work best for a given task. We also learned about optimizing code to make full use of a graphics card using CUDA drivers.

We also learned valuable life lessons, such as how to make use of the multiple people on our team to accomplish tasks in parallel, just like a GPU :)

What's next for Meeting Assistant

Next, we hope to improve the overall accuracy of our program and expand the functionality it offers.

For example, we could use more advanced NLP to enable DontCare2020 to actually participate in meetings. Some of its contributions could include reminding people about relevant past events that could help make a decision in the present. _Ex: "Remember, in September we were faced with a similar issue, but it really helped to go over these key points... let me brief you on what we talked about that day, in case you do not remember."_ We also want to package our solution into an app and/or website so that everyone can make use of our hard work.

We believe that meetings which shape major company decisions can be greatly optimized and made more efficient by combining human intuition with artificial memory and calculation speed.
