
Facebook tests revenge-porn defense via nude-image hashing

Created Nov. '17
Wed., November 8, 2017 at 8:18
#1

Anyone who wants to prevent a particular nude image from appearing as a revenge-porn posting on Facebook will soon only need to send that very image to themselves via Messenger.


You can read this C&M News item here:
https://www.ress.at/-news08112017081831.html


Fri., November 10, 2017 at 9:32
#2

In this post, Facebook has revealed the details of the new technique:
https://newsroom.fb.com/news/h/non-consensual-intimate-image-pilot-the-facts/

Quote:
The Facts: Non-Consensual Intimate Image Pilot
By Antigone Davis, Global Head of Safety

We don’t want Facebook to be a place where people fear their intimate images will be shared without their consent. We’re constantly working to prevent this kind of abuse and keep this content out of our community. We recently announced a test that’s a little different from things we’ve tried in the past. Even though this is a small pilot, we want to be clear about how it works.

This past week, in partnership with the Australian eSafety Commissioner’s Office and an international working group of survivors, victim advocates and other experts, Facebook launched a limited pilot in Australia that will help prevent non-consensual intimate images from being posted and shared anywhere on Facebook, Messenger and Instagram. Specifically, Australians who fear their intimate image may be shared without their consent can work with the eSafety Commissioner to provide that image in a safe and secure way to Facebook so that we can help prevent it from being shared on our platforms.

To be clear, people can already report if their intimate images have been shared on our platform without their consent, and we will remove and hash them to help prevent further sharing on our platform. With this new small pilot, we want to test an emergency option for people to provide a photo proactively to Facebook, so it never gets shared in the first place. This program is completely voluntary. It’s a protective measure that can help prevent a much worse scenario where an image is shared more widely. We look forward to getting feedback and learning.

Here’s how it works:

  • Australians can complete an online form on the eSafety Commissioner’s official website.

  • To establish which image is of concern, people will be asked to send the image to themselves on Messenger.

  • The eSafety Commissioner’s office notifies us of the submission (via their form). However, they do not have access to the actual image.

  • Once we receive this notification, a specially trained representative from our Community Operations team reviews and hashes the image, which creates a human-unreadable, numerical fingerprint of it.

  • We store the photo hash—not the photo—to prevent someone from uploading the photo in the future. If someone tries to upload the image to our platform, like all photos on Facebook, it is run through a database of these hashes and if it matches we do not allow it to be posted or shared.

  • Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner’s office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers.
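Facebook has not published its actual matching implementation, but the store-hash-and-check-uploads flow described in the steps above can be sketched as follows. Note the hedge in the code: a real system would use a perceptual photo-matching hash (robust to resizing and re-encoding, in the spirit of PhotoDNA), whereas the SHA-256 stand-in used here only catches byte-identical re-uploads. All function and variable names are hypothetical.

```python
import hashlib

# Hypothetical in-memory database of blocked image fingerprints.
# Only hashes are stored here, never the photos themselves.
blocked_hashes = set()


def hash_image(image_bytes: bytes) -> str:
    """Create a human-unreadable, numerical fingerprint of an image.

    SHA-256 is a simplifying stand-in: it only matches byte-identical
    files. A production system would use a perceptual hash that also
    matches resized or re-encoded copies of the same photo.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def register_reported_image(image_bytes: bytes) -> str:
    """Step performed after review: store the hash, not the photo."""
    fingerprint = hash_image(image_bytes)
    blocked_hashes.add(fingerprint)
    return fingerprint


def allow_upload(image_bytes: bytes) -> bool:
    """Every new upload is hashed and checked against the database;
    a match means the image is not allowed to be posted or shared."""
    return hash_image(image_bytes) not in blocked_hashes


# Usage sketch: once an image is registered, re-uploads of it are blocked,
# while unrelated images pass through.
register_reported_image(b"...reported image bytes...")
print(allow_upload(b"...reported image bytes..."))  # False: blocked
print(allow_upload(b"...some other photo..."))      # True: allowed
```

The key design point, per the post, is that the platform can delete the original image after hashing and still recognize future upload attempts, because matching is done fingerprint-to-fingerprint rather than against the stored photo.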


Here’s what some of the experts from our working group have to say:

  • “Facebook is working with victim advocates to provide a secure voluntary mechanism for a person to provide the photo to a specially trained professional who can then use technology to prevent it from being shared broadly. This is a far better option than having these images shared with their friends, family members, colleagues or the general public with the intent to shame and embarrass them, and the horrible consequences that ensue. If you’ve never tried to end a relationship with an abusive, controlling, and violent partner, there is no way you’d understand the very real terror victims feel of how much damage an abuser can and will do by sharing intimate images. This voluntary option provides another tool to victims to prevent harm.” – Cindy Southworth, Executive Vice President and founder of the Safety Net Technology Project

  • “Facebook has been working with victims, victim support providers and other issue experts on this effort. This is a complex challenge and they have taken a very thoughtful, secure, privacy sensitive approach at a small scale with victim advocates on the frontline. They are working to identify the best way to help people in a desperate situation regain control and prevent abuse that has severe consequences, including the loss of employment, loss of friends, not to mention the intended embarrassment and humiliation.” — Danielle Keats Citron, Morton & Sophia Macht Professor of Law, University of Maryland Carey School of Law, author of Hate Crimes in Cyberspace

  • “Facebook has been at the forefront of the tech industry’s efforts to develop innovative and efficient responses to the problem of nonconsensual pornography. It was one of the first major social media platforms to prohibit the unauthorized sharing of intimate images and has sought, in addition to implementing platform-wide policies and tools to fight this abuse, to provide victims with a range of voluntary options to protect themselves. Many victims live in fear of having their intimate images exposed, and this pilot program allows those victims to act preemptively against this threat.” — Mary Anne Franks, Professor of Law, University of Miami School of Law; Legislative & Tech Policy Director, Cyber Civil Rights Initiative


We look forward to getting feedback from our community to learn the best ways to keep tackling these difficult issues.




