Golem Face blurrer
Why?
According to ChatGPT:
- Privacy Protection in Public Recordings: For videos captured in public spaces where individuals have not consented to be filmed, blurring faces can protect their privacy when such videos are shared publicly or used in news broadcasts.
- Anonymity in Sensitive Content: In interviews or documentaries featuring sensitive topics, subjects may wish to remain anonymous. Blurring faces can help protect their identities.
- Security Footage Redaction: For security or surveillance footage that needs to be shared with law enforcement or the public, blurring faces can protect the privacy of innocent bystanders or victims.
- Data Protection Compliance: In regions with strict data protection laws (like GDPR in Europe), blurring faces in videos can help organizations comply with regulations concerning personal data and privacy.
- Educational Content: When educational videos include students or minors, blurring faces can protect minors' identities, especially when such content is shared online.
- Research and Development: Researchers studying public behavior or crowd dynamics might use videos with blurred faces to ensure participants' anonymity.
- Journalism and Reporting: Reporters might blur faces in videos to protect sources or individuals in sensitive or dangerous situations.
- Social Media Content Moderation: Platforms may use face-blurring tools to automatically redact faces in videos uploaded by users, particularly in contexts where consent to film and share is unclear.
- Protecting Witnesses or Whistleblowers: Videos featuring witnesses or whistleblowers can have faces blurred to maintain their safety and confidentiality.
- Blurring Faces in User-Generated Content: For apps and services that allow users to upload video content, providing an option to blur faces can help users maintain privacy and control over their digital footprint.
- Video Redaction for Legal Proceedings: In legal cases where video evidence is presented, faces may need to be blurred to protect the identities of certain individuals.
- Content Filtering for Children: Blurring faces in videos that might not be appropriate for children can be an additional layer of protection in content filtering systems.
- Ethical AI Training: For AI and machine learning projects that use video data, blurring faces can address ethical concerns related to privacy and consent in training datasets.
- Corporate Training Videos: In internal training videos where employees are featured, blurring faces can maintain privacy, especially if such content is at risk of being shared beyond the intended audience.
How it works
The Golem provider node downloads the VM image matching the user-specified hash, and the user's video is uploaded to the provider. A Python script that uses OpenCV and face_recognition is executed on the provider node, producing a copy of the video with blurred faces. Once the processed video is downloaded back from the provider node, it is displayed to the user in the browser.
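The provider-side script can be sketched roughly as follows. This is a minimal reconstruction, not the repository's actual script: the function names (`clamp_box`, `blur_faces`) and the blur kernel size are illustrative choices, and the real script may read and write frames differently.

```python
def clamp_box(top, right, bottom, left, height, width, margin=20):
    """Pad a detected face box by `margin` pixels and clamp it to the frame.

    face_recognition reports boxes as (top, right, bottom, left); a small
    margin helps cover hair and chin, not just the detected rectangle.
    """
    return (max(top - margin, 0),
            min(right + margin, width),
            min(bottom + margin, height),
            max(left - margin, 0))


def blur_faces(in_path, out_path):
    # cv2 and face_recognition are imported lazily so the pure helper
    # above can be used without the heavy dependencies installed.
    import cv2
    import face_recognition

    cap = cv2.VideoCapture(in_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                          fps, (w, h))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Detect faces in this frame and blur each padded box in place.
        for box in face_recognition.face_locations(frame):
            top, right, bottom, left = clamp_box(*box, h, w)
            roi = frame[top:bottom, left:right]
            frame[top:bottom, left:right] = cv2.GaussianBlur(roi, (51, 51), 0)
        out.write(frame)
    cap.release()
    out.release()
```

Running face detection on every frame is the expensive part; that cost is exactly what offloading the work to a Golem provider is meant to absorb.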
Known issues
- Sometimes it simply doesn't work, and restarting yagna fixes it
- This tool only uses the testnet, so uploading videos larger than the ones included in this repository may not work out of the box
Story
It originally did not use OpenCV. Instead, it used ffmpeg to blur the faces: a Python script took the output of the face_detection CLI and generated a bash script containing many ffmpeg commands to blur every detected face. However, this did not work for unexpected reasons; most of the time the task never seemed to finish executing.
There was also another variation of the ffmpeg version that attempted to parallelise the blurring, but it required uploading the video to every provider, and that overhead, combined with how hard it was to debug, outweighed the benefits.
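The abandoned ffmpeg variant can be sketched along these lines. This is a hypothetical reconstruction: the helper name `ffmpeg_blur_command`, the output filename, and the boxblur strength are invented for illustration. It assumes the `face_detection` CLI's output format of one `path,top,right,bottom,left` line per detected face.

```python
def ffmpeg_blur_command(line):
    """Turn one face_detection output line into an ffmpeg command that
    crops the face box, box-blurs it, and overlays it back in place."""
    path, top, right, bottom, left = line.strip().split(",")
    top, right, bottom, left = map(int, (top, right, bottom, left))
    w, h = right - left, bottom - top
    filt = (f"[0:v]crop={w}:{h}:{left}:{top},boxblur=10[b];"
            f"[0:v][b]overlay={left}:{top}")
    return f'ffmpeg -y -i "{path}" -filter_complex "{filt}" "blurred_{path}"'
```

One such command per face per frame quickly adds up to a very long bash script, which may help explain why the tasks often appeared to never finish.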