Data scraping and ethnography are among the most successful techniques for gathering data with high distribution and uncertainty. The extracted data is unstructured, which makes it difficult to process. Using AWS cognitive services like Rekognition and Comprehend, the data can easily be clustered by entities, key phrases, and content.

What it does

A large dump of data is structured around points of interest like locations, names, years, sentiment, and content using AWS Comprehend, with Rekognition handling image tagging. This is helpful for social causes like fighting trafficking, where data can be scraped from suspicious websites, structured, and mined for information.
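As a rough sketch of how a Comprehend response can be turned into structured records: the sample below is a hypothetical payload shaped like the output of `comprehend.detect_entities`, and `structure_entities` (a name introduced here for illustration) groups the detected entities by type.

```python
from collections import defaultdict

# Hypothetical sample shaped like Comprehend's detect_entities response.
sample_response = {
    "Entities": [
        {"Text": "Amsterdam", "Type": "LOCATION", "Score": 0.99},
        {"Text": "2019", "Type": "DATE", "Score": 0.97},
        {"Text": "John Doe", "Type": "PERSON", "Score": 0.95},
        {"Text": "Berlin", "Type": "LOCATION", "Score": 0.92},
    ]
}

def structure_entities(response, min_score=0.9):
    """Group detected entities by type, keeping only confident hits."""
    grouped = defaultdict(list)
    for entity in response["Entities"]:
        if entity["Score"] >= min_score:
            grouped[entity["Type"]].append(entity["Text"])
    return dict(grouped)

# In the real pipeline, the response would come from the boto3 client:
#   comprehend = boto3.client("comprehend")
#   response = comprehend.detect_entities(Text=document, LanguageCode="en")
structured = structure_entities(sample_response)
print(structured)
# {'LOCATION': ['Amsterdam', 'Berlin'], 'DATE': ['2019'], 'PERSON': ['John Doe']}
```

Raising `min_score` trades recall for precision, which matters when the scraped source text is noisy.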

How I built it

AWS SAM, Lambda, Comprehend, S3, Rekognition
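For the image-tagging side, a minimal sketch of how a Rekognition `detect_labels` response could be reduced to tags: the sample payload and the `extract_tags` helper below are hypothetical, introduced only for illustration.

```python
# Hypothetical sample shaped like Rekognition's detect_labels response.
sample_labels = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.1},
        {"Name": "Vehicle", "Confidence": 85.4},
        {"Name": "Tree", "Confidence": 60.2},
    ]
}

def extract_tags(response, min_confidence=80.0):
    """Keep only the label names Rekognition is confident about."""
    return [label["Name"] for label in response["Labels"]
            if label["Confidence"] >= min_confidence]

# In the deployed Lambda, the response would come from a call such as:
#   rekognition = boto3.client("rekognition")
#   response = rekognition.detect_labels(
#       Image={"S3Object": {"Bucket": bucket, "Name": key}},
#       MinConfidence=80,
#   )
print(extract_tags(sample_labels))
# ['Person', 'Vehicle']
```

In the SAM-deployed pipeline, a Lambda of this shape would be triggered by an S3 upload event, with the resulting tags stored alongside the Comprehend output.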

Challenges I ran into

Accomplishments that I'm proud of

What I learned

What's next for Glim

Create perfectly structured data from dumps of raw data for analysis and insights.

Built With
