Inspiration
Many dating apps rely on people constantly swiping through thousands of options, which makes it hard to match with someone or form a genuine connection. BlindSpot reduces this by having people form a connection before they get a chance to say no, without knowing what they could be missing out on.
What it does
BlindSpot is a mobile dating app built around 5-minute "blind" video dates. Users are paired instantly based on the common interests they selected when signing up. The video feed is heavily blurred with a silhouette filter, so you can't snap-judge someone before getting to know them. You can hear each other clearly and see reactions, but you cannot see specific facial features.
To prevent awkward silences, the app provides a question dock of conversation starters. It also has a "Shared Vibes" feature that highlights common interests.
After the timer hits zero, both users must vote. If both click yes to reveal, the blur dissolves and the profiles are revealed, letting the two see each other and chat.
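The mutual-reveal rule above can be sketched in plain TypeScript. The type and function names here are illustrative, not taken from the app's actual code:

```typescript
// Sketch of the post-timer decision: the blur dissolves and chat
// unlocks only when *both* users vote to reveal.

type Vote = "reveal" | "pass";

interface RevealOutcome {
  revealed: boolean;     // does the blur dissolve?
  chatUnlocked: boolean; // do the users proceed to the chat screen?
}

function resolveDate(voteA: Vote, voteB: Vote): RevealOutcome {
  const mutual = voteA === "reveal" && voteB === "reveal";
  // A one-sided "reveal" behaves the same as a mutual pass:
  // neither user learns how the other voted.
  return { revealed: mutual, chatUnlocked: mutual };
}
```

Treating a one-sided vote identically to a mutual pass keeps the rejection private, which matches the app's goal of low-pressure first impressions.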
Once matched, users enter a standard chat interface to plan a real-world meetup. This lets them continue the relationship they built in the first 5 minutes, and restores a sense of suspense that modern dating apps have lost.
How we built it
I built BlindSpot using React Native with the Expo framework so it runs on both iOS and Android. For the frontend, I used Expo Router for file-based navigation and TypeScript to ensure type safety across the component architecture. The blind filter layers expo-blur over dynamic Image components. I used the Animated API for the heartbeat pulsing effect on the home screen and the floating emoji reactions during calls, which made the app more visually appealing and encouraged people to try a blind video date.

For state, I implemented Zustand with AsyncStorage to handle user authentication, user interests, and profile settings, so the app remembers who you are and what you like even after a restart. The UI uses a dark theme with translucent gradients via expo-linear-gradient for a clean design.
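The persistence idea behind the Zustand + AsyncStorage setup can be modeled in a few lines. This is a simplified sketch, not the app's actual store: a plain `Map` stands in for AsyncStorage so the example runs anywhere, and the `"auth-storage"` key is illustrative (in the real app, Zustand's `persist` middleware would handle serialization and rehydration):

```typescript
// Minimal model of a persisted auth store: state is serialized to
// device storage on every change and rehydrated on app start.

interface AuthState {
  userId: string | null;
  interests: string[];
}

// Stand-in for AsyncStorage, which is an async key-value string store.
const deviceStorage = new Map<string, string>();

function saveState(state: AuthState): void {
  deviceStorage.set("auth-storage", JSON.stringify(state));
}

function loadState(): AuthState {
  const raw = deviceStorage.get("auth-storage");
  // Fall back to a logged-out default when nothing has been persisted yet.
  return raw ? (JSON.parse(raw) as AuthState) : { userId: null, interests: [] };
}
```

Because everything round-trips through JSON, the persisted state survives a restart the same way it does in memory, which is what lets the app "remember who you are and what you like."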
Challenges we ran into
It was difficult to design a video call screen that felt engaging without actually showing a face. I worried people would get bored staring at a mostly blank screen with nothing to interact with, so I added the floating reaction system and the shared-interests banner to give users something visual to focus on. The blur itself also works in the app's favor: it keeps users curious about what the other person might look like and stops them from making a quick judgement.
I also initially struggled with the authentication flow, since users were getting sent back to the home screen immediately after logging out. I fixed this by implementing an AuthStore that clears session tokens before redirecting the router.
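The ordering fix can be sketched in plain TypeScript. The `SessionStore` and `Router` shapes below are hypothetical stand-ins (the real app uses its AuthStore and Expo Router), but they show why clearing the token before redirecting matters:

```typescript
// Sketch of the logout fix: drop the session token *before* navigating,
// so any route guard that runs during the redirect sees a logged-out
// state instead of a stale session.

interface SessionStore {
  token: string | null;
  clear(): void;
}

interface Router {
  replace(path: string): void;
}

function logout(store: SessionStore, router: Router): void {
  store.clear();        // 1) invalidate the session first
  router.replace("/");  // 2) only then redirect to the landing screen
}
```

Doing these two steps in the opposite order is what caused the original bug: the redirect fired while the token still looked valid, so the guard bounced the user straight back to the home screen.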
Accomplishments that we're proud of
I'm proud that the app looks professional and has an easy-to-navigate design. The dark theme makes it easier on the eyes and feels more like a real product. I'm also proud of implementing the logic that checks your tags against a match's tags and displays "You both like: Dogs" during the call; it improves the experience and confirms that the people you're matched with actually share your interests. The flow between screens also feels smooth and natural.
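The tag-matching banner described above can be sketched as a simple intersection of the two users' interest lists. Function names and the case-insensitive comparison are illustrative assumptions, not the app's actual implementation:

```typescript
// Sketch of the "Shared Vibes" banner: intersect the two users'
// interest tags (case-insensitively) and format the result.

function sharedVibes(mine: string[], theirs: string[]): string[] {
  const theirSet = new Set(theirs.map((t) => t.toLowerCase()));
  // Keep the current user's original casing in the output.
  return mine.filter((t) => theirSet.has(t.toLowerCase()));
}

function vibesBanner(mine: string[], theirs: string[]): string | null {
  const shared = sharedVibes(mine, theirs);
  // Return null when there is nothing to highlight, so the UI
  // can simply hide the banner.
  return shared.length > 0 ? `You both like: ${shared.join(", ")}` : null;
}
```

Using a `Set` keeps the lookup O(1) per tag, so the check stays cheap even if interest lists grow.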
What we learned
I learned that intentionally adding friction, the blur and the timer, actually increased the value of a match. I worried these constraints might discourage people from using the app, but they create suspense, counter the habit of endless swiping, and make it easier to connect with a wider range of people. I also gained more experience with Expo Router and with handling complex state synchronization between screens in React Native.
What's next for BlindSpot
I think the next step for BlindSpot could be letting users record a 10-second audio hook that plays while they are in the waiting queue. This could give potential matches a better sense of their personality and keep them engaged. Another feature I would like to implement is an activities section: after users match, a tab suggests activities based on the interests they entered in the app, giving them something concrete to do together and a way to take the relationship further.
Built With
- expo-blur
- expo-image-picker
- expo-linear-gradient
- expo-router
- expo.io
- javascript
- react-native
- react-native-animated
- typescript
- zustand