Inspiration
Social media platforms use algorithmic recommendations to tailor content for users, but these algorithms often prioritize engagement over well-being. As a result, users searching for neutral or positive topics—such as healthy recipes or fitness tips—are frequently led down a path toward more extreme or potentially harmful content, such as weight loss obsession, unrealistic body standards, or misinformation. This creates a cycle where users are exposed to content they did not explicitly seek, influencing their perceptions and behaviors in ways they may not fully recognize. The lack of transparency in recommendation systems leaves users with little control over what they see, increasing the risk of negative mental health impacts, misinformation spread, and reduced digital autonomy. A solution is needed to provide users with clearer insights into the content they are consuming, enabling them to make informed decisions about their digital experience.
What it does
Our solution integrates with social media platforms to provide users with:
- Neutral summaries of the posts they see, free from algorithmic bias.
- Content transparency, revealing patterns in the types of recommendations they receive.
- User control over their feed, letting them decide whether to keep seeing similar content or adjust their experience.
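As a rough illustration of the content-transparency idea, the sketch below tallies how a feed's recommendations are distributed across content categories and flags any category that dominates. The function name `recommendation_pattern`, the category labels, and the 50% threshold are all hypothetical assumptions for this sketch, not part of the actual prototype (which exists only as a Figma design).

```python
from collections import Counter

def recommendation_pattern(feed_categories, threshold=0.5):
    """Summarize how recommended posts are distributed across content
    categories, and flag any category that dominates the feed.

    Hypothetical sketch: real feeds would need a classifier to assign
    categories; here they are given as plain labels.
    """
    counts = Counter(feed_categories)
    total = sum(counts.values())
    # Share of the feed occupied by each category
    shares = {cat: n / total for cat, n in counts.items()}
    # Categories that exceed the dominance threshold
    dominant = [cat for cat, s in shares.items() if s >= threshold]
    return shares, dominant

# Example: a feed that began with healthy-eating searches but drifted
feed = ["recipes", "fitness", "weight loss", "weight loss",
        "weight loss", "weight loss"]
shares, dominant = recommendation_pattern(feed)
# "weight loss" makes up 4 of 6 posts, so it is flagged as dominant
```

Surfacing a summary like this to the user, rather than acting on it silently, is what would let them decide whether to keep seeing similar content.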
How we built it
We used Figma to design and prototype the user interface.
Built With
- figma