HN Gopher Feed (2017-11-10) - page 1 of 10 ___________________________________________________________________
Non-Consensual Intimate Image Pilot
39 points by JumpCrisscross
https://newsroom.fb.com/news/h/non-consensual-intimate-image-pil...
___________________________________________________________________
LukaAl - 1 hours ago
As others have pointed out, this is extremely creepy. The easiest
solution would have been this one:
- The user uses a JS solution to hash the images on the client,
without the image being uploaded
- She fills in a form with additional information (e.g. her account,
reasons for uploading, the person suspected of sharing the picture)
- The picture is saved in the DB as un-verified revenge porn
- The first time someone uploads a picture that matches the hash, the
pic is quarantined and specially trained individuals manually check it
- A scoring system could be used to check the reliability of the
submission. If multiple photos marked as revenge porn get rejected,
the control becomes ex-post. For further violations, the user gets
banned from using the tool and has to contact Facebook directly.
Submitting the same hash that has already been rejected counts as a
"red mark"
Now, I understand this system is very complex; what Facebook has done
is an MVP, and as a product manager, that is what I would normally
prefer. But consider the issue (revenge porn is not something I
necessarily want to test the impact on retention of :-) ). Also, yes,
it requires resources, but Facebook has had a problem with trust
lately; better to do the best... [edited for formatting]
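A minimal sketch of that first step, assuming the browser's built-in
SubtleCrypto and plain SHA-256 exact matching (illustrative only; a
deployable system would need a perceptual hash, as discussed
downthread):

  // Client-side hashing sketch: the photo never leaves the device,
  // only the hex digest does. Exact-match SHA-256 is assumed here
  // purely for illustration.
  async function hashImageLocally(file: File): Promise<string> {
    const bytes = await file.arrayBuffer();
    const digest = await crypto.subtle.digest("SHA-256", bytes);
    return Array.from(new Uint8Array(digest))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");
  }

  // Hypothetical report submission: POST only the digest plus the
  // form fields; "/report" is an assumed endpoint, not a real API.
  async function submitReport(file: File, reason: string): Promise<void> {
    const hash = await hashImageLocally(file);
    await fetch("/report", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ hash, reason }),
    });
  }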
the8472 - 25 minutes ago
> - The user uses a JS solution to hash the images on the client,
without the image being uploaded

You have to trust facebook in either case, each time you do it. Either
to handle your nude pics properly, or to serve you javascript that
does what they claim it does, every single time. On the other hand, an
open source desktop application only needs to be audited once and then
can be validated based on a hash. In-browser crypto is not a solution
if you want to minimize the needed trust.
oh_sigh - 24 minutes ago
Nobody wants to run a desktop application given to them by
facebook
the8472 - 17 minutes ago
> only needs to be audited once and then can be validated
based on a hash

I thought I covered that concern, but I neglected to mention
that it should be open source so everyone can audit it.
evgen - 1 hours ago
Now figure out a way to do step #1 (user does a hash client-side)
without making it trivial for someone else to create a filter
that adds enough noise to invalidate step #4 (uploaded pics that
match a hash are quarantined).
wjh_ - 1 hours ago
PhotoDNA would be an option, I imagine. Or at least something
similar!

https://en.wikipedia.org/wiki/PhotoDNA
moyix - 18 minutes ago
Honestly, I doubt that most of these sorts of algorithms would
survive concerted attacks; that's why they tend to be closely
guarded.

Alex Stamos (Facebook's CISO) implies this is why they can't do
it client-side:
https://twitter.com/alexstamos/status/928646228472078336
LukaAl - 42 minutes ago
Agree, that's a problem, but there are options to solve it. Look
at PhotoDNA by Microsoft [0]. But it is a second step. First,
you need the reporting properly done.

[0] https://www.microsoft.com/en-us/photodna
jsjohnst - 42 minutes ago
This is already a solved problem at FB (PhotoDNA does this for
them for CP images).
tree_of_item - 40 minutes ago
How does giving Facebook the image solve that problem, in a way
that can't be done client side?
alexggordon - 2 hours ago
> To establish which image is of concern, people will be asked to
send the image to themselves on Messenger.

> Once we receive this notification, a specially trained
representative from our Community Operations team reviews and hashes
the image, which creates a human-unreadable, numerical fingerprint of
it.

This clearly implies that Facebook has, almost without reserve, the
ability to read messages from user to user (even though it might be
limited to messages to oneself), and exposes that ability to its
employees.

While I stopped using Facebook a long time ago, Zuckerberg's quote,
"they 'trust me'; dumb fucks", seems relevant here. If I were
Facebook, I would do this in a different, much more privacy-conscious
way:

1. Obviously, Facebook is already hashing all images on its platforms
and storing the hashes. Given that, let the user, using a
frontend-only platform, upload an image and generate the hash. Make
sure this image is a real photo (not a meme or something heavily
photoshopped) in addition to making sure:

1a) The image contains a human [0]. Fairly easily doable for most
situations.

1b) If not, let the user know that "we can't automatically detect a
human in this photo, do you want to still submit this photo for
removal? This could get a strike against your account if (not in some
list of reasons for removal)". Maybe the photo has a picture of a
credit card or something.

2. Submit the hash to the backend. Provided that hash is not used on
the platform past a significant number of views, automatically ban
the photo, and allow other users to petition for reinstatement of the
photo, knowing that a non-valid petition for reinstatement (the user
doesn't have rights to the photo) COULD also result in a strike.

3. Have humans review the petitions, and in cases where the users
explicitly allow, the images.

Doing this would limit another human seeing the intimate photo in
well over 90% of the cases, I'd bet. In addition to that, even if the
photo is shared with another human, it would let the end user decide
if they'd like a Facebook employee to view the image too. I.e., basic
user privacy. Come on Facebook. Dumb fucks.

[0] very doable, https://trackingjs.com/
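A minimal sketch of the server-side half of step 2, under the same
assumptions (exact-match hashes; reportedHashes is a hypothetical
store, not anything Facebook has described):

  // Hypothetical upload-time check: hashes arrive from clients, the
  // reported image itself never does. Only a match pulls in a human.
  const reportedHashes = new Set<string>(); // filled by victim reports

  function onUploadAttempt(uploadHash: string): "publish" | "quarantine" {
    // Anything unreported publishes without review; a match is held
    // for the petition/review flow described above.
    return reportedHashes.has(uploadHash) ? "quarantine" : "publish";
  }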
leggomylibro - 2 hours ago
I really don't understand why anyone uses Facebook.

You have a phone, which can communicate with your friends. It's only
remotely useful for its near-monopoly on event planning, and imo
that's a job for the trustbusters.
praneshp - 1 hours ago
> I really don't understand why anyone uses Facebook.

You really cannot? Like, can you not imagine one person who doesn't
realize (or worse, and more likely IMO, doesn't care about) the
privacy implications? It's not hard to understand at all, I think.
spinco - 36 minutes ago
Compared to a phone (SMS) and other messaging apps (whatsapp,
email, signal), fb messenger has two big benefits for me:

* Cross-platform: I don't have to type on a tiny screen when I'm
in front of a computer, and won't lose my account or my messages if I
lose my phone or phone number.

* Discoverability: most people who I meet are on it, so I can reach
out to people I meet at events as well as friends of friends.

I don't love messenger. I use Messenger Lite on my phone to avoid
most of the snapchat/gif/events/birthdays features. But I'm not aware
of any other cross-platform messaging app with that discoverability,
let alone one free of privacy issues.
nimblegorilla - 45 minutes ago
> This clearly implies that Facebook has, almost without reserve, the
ability to read messages from user to user (even though it might be
limited to messages to oneself), and exposes that ability to its
employees.

Why do you seem surprised by this? I can't fathom anything about
Facebook's design that implies my posted content is hidden from
employees.
danbruc - 1 hours ago
> This clearly implies that Facebook has, almost without reserve, the
ability to read messages from user to user (even though it might be
limited to messages to oneself), and exposes that ability to its
employees.

You can view and search your message history on Facebook, so there is
not really any question that they have and can read all messages. I
never explicitly checked it, but at least I never noticed any gaps,
i.e. missing messages because I sent them using the Messenger app.

I seem to remember that Messenger is supposed to use the Axolotl
Ratchet for end-to-end encryption, but that is hard to reconcile with
the availability of your message history on facebook.com. So maybe
it's - quote - end-to-end - unquote - between the phone and the
server? I never really thought about that. Or is it just not
available or disabled in my Windows Phone version of the app?
xanderstrike - 16 minutes ago
You can download a complete copy of the data Facebook has on
you, including all messages you've sent since your account was
created [1]. I did this some years ago to graph changes in
sentiment with friends.

[1] https://www.facebook.com/help/131112897028467
porfirium - 1 hours ago
Using Messenger you can start a secret conversation with
someone, which is encrypted end-to-end. Those conversations only
exist on the current device. But regular conversations are not
encrypted in any way.
alexggordon - 1 hours ago
It's not that infeasible to have a web platform that encrypts
users' messages; each client just gets an encryption key. Look
at Telegram, or even FB Messenger's end-to-end encryption. I'm
not saying they ever said they didn't have the ability to do
it, I'm saying it's weird that they would use "message
snooping" as a form of user photo submission.
danbruc - 1 hours ago
When I send messages using the Messenger app - which is
supposed to use end-to-end encryption, isn't it? - I can see
and search those messages on facebook.com. Something does not
add up here. And if they have the keys on their servers, then
it's rather pointless to use end-to-end encryption in the
first place.
rrix2 - 1 hours ago
facebook only does end-to-end for "secret" conversations
started in their mobile app. [1]

[1] https://www.facebook.com/help/messenger-app/1084673321594605...
danbruc - 1 hours ago
That explains it and it seems unavailable in the Windows
Phone version. And because I never owned or used an
iPhone or Android device I was unaware of this
difference.
BadassFractal - 2 hours ago
Tricky. Sending them all of your potentially leaked embarrassing
photos so that they can store them and prevent them from leaking
out. If they get hacked, someone now has a lifetime supply of
blackmail material. Not clear if the cure is better than the
ailment.
moyix - 2 hours ago
From the fine article:

"We store the photo hash - not the photo - to prevent someone from
uploading the photo in the future. If someone tries to upload the
image to our platform, like all photos on Facebook, it is run
through a database of these hashes and if it matches we do not
allow it to be posted or shared."
BadassFractal - 2 hours ago
Ah, missed that, thanks. Yeah, I guess you have to trust that
part. I guess I'm willing to believe that. Hopefully it's not
like with Snapchat, where everything gets stored behind the
scenes forever even though it "disappears".

Also, I wonder if basing things on a hash will make detecting
slightly modified versions of that content almost impossible.
E.g. IG / YT can detect if you upload content with a piece of
music that's copyrighted regardless of how much you alter it,
but this would be fairly primitive.
tonyarkles - 2 hours ago
Is there a way for me to audit that the image is never stored?
(No) It seems like that's a huge honeypot; someone compromises
the "upload your naughty photos here" endpoint and has an
endless supply of "non-consensual intimate images".

Edit: Thinking about it a bit, I'd be way more comfortable with the
idea of the image hashes being computed on-device and only the
hashes being sent to the server. This opens a different door of
abuse (by essentially permitting individuals to ban someone
from posting an image by uploading the hash of the image), but
it's definitely preferable to encouraging people to send all
their selfies to the privacy commissioner.
evgen - 1 hours ago
While this (client-side hashing) would be better in terms of
privacy protection for this specific case, it is not going to
happen: Facebook is not allowed to perform the specific photo
hashing client-side, as this would expose the hashing mechanism
to analysis.
the8472 - 18 minutes ago
maybe they shouldn't ask for nudes until they have designed
an open source algorithm?
dmitrygr - 35 minutes ago
> this would expose the hashing mechanism to analysis
Security through obscurity. Works every time (tm)
BinaryIdiot - 2 hours ago
But you are sending an image to yourself in Messenger, which
DOES normally keep an image. It doesn't say here that they
explicitly delete the message you sent via Messenger, does it?
Perhaps you have to delete that yourself? And do they take
back-ups of Messenger data in which this photo may now live?

Seems like a very awkward flow to forcefully repurpose
Messenger to do something it shouldn't be doing.
giobox - 1 hours ago
All you can do is take FB at their word that the image is
destroyed irrevocably.

Given what we know about data retention policies at big tech firms,
I'm not so sure I would feel confident taking the press release at
face value. I'd really like to see a white paper or similar
outlining the specifics of how this is being handled. I'd also have
questions around what steps have been taken to prevent rogue
Facebook employees from trying to obtain the images.

I really wish they had found a way to generate the hashes they
require client-side and not receive the image at all, especially as
this is something that presumably would be really great for revenge
porn victims.
pault - 2 hours ago
This is insane. If they can fingerprint your images, why can't they
provide a tool for processing the image and sending the fingerprint
only? Can you imagine how big of a target a tailor-made database of
blackmail material linked to facebook accounts would be?
DiThi - 2 hours ago
If you can send only the fingerprint, what stops you from
uploading photos you are not in?

A better idea may be an uploading tool where you can obscure some
parts for verification.
sova - 2 hours ago
Seems like anybody can upload any body.
d0ugie - 1 hours ago
That sounds like a solution to me; it satisfies whatever the
need is for human involvement while enabling the user to redact
the weaponizable elements of the picture.
Dylan16807 - 2 hours ago
They could just send fingerprints, sure. The problem is that you
presumably want the image to be blocked immediately upon upload.
But facebook is worried that people will abuse the system, and
they don't want images to be wrongfully blocked before someone
can review them.

Abuse-resistance. Instant blocking. Perfect privacy. Pick two.
There's no perfect solution.
ImSkeptical - 1 hours ago
Why not suspend reported images for review? If the review
indicates the reporter is abusing the system, blacklist the
reporter from the automatic function. Also block new and recent
accounts, and stop automatic blocking for images on frequently
targeted accounts and major accounts (e.g. if someone is
reporting Donald Trump, manually check, don't automatically
remove).
Dylan16807 - 1 hours ago
Doesn't this system already exist? But reporting an image
means it wasn't preemptively blocked, which is the benefit of
the new system.
evgen - 1 hours ago
In addition to the reasons others have suggested, one reason that
this is not done client-side is that the PhotoDNA tech that is
used for the fingerprinting is not something that Facebook can
share or make available. Its primary use is in matching posted
images with known databases of child porn images, so providing a
mechanism for someone to reverse engineer the system and develop
a means of bypassing it is not going to happen.
oh_sigh - 24 minutes ago
> Once we receive this notification, a specially trained
representative from our Community Operations team reviews and
hashes the image, which creates a human-unreadable, numerical
fingerprint of it.

I'm wondering how long it will be before we start seeing cell phone
pics of screens with people's intimate images on them.
intopieces - 58 minutes ago
It's interesting that the general consensus is that strangers seeing
your nudes is creepy (referring to the manual processing aspect). I
understand it on a fundamentally emotional level: I want absolute
control of my private life / photos / etc. I am embarrassed at the
idea of being seen naked without my active participation. On a
logical level, though, what difference does it make if every person
you will never meet has seen your dick? It's a philosophical
question, to be sure.

Also, what happens with these manual reviews that turn up underage
nudes? Teen sexting is a thing. For that matter, isn't Snapchat a
massive repository of child pornography?
cowpig - 22 minutes ago
What's really bizarre to me is that people here don't think using
facebook in general is creepy...
beaconstudios - 2 hours ago
Isn't nudity generally banned on facebook anyway? I didn't realise
this would/could be an issue. Unless they're talking about private
communications in e.g. messenger.
sp332 - 2 hours ago
Revenge porn isn't all about nudity. And this does cover
Messenger (and Instagram). The point isn't to get an image taken
down after the fact, but to proactively block it from being
posted or sent in the first place.
beaconstudios - 2 hours ago
I thought facebook would run posted images through their AI
nudity filter before allowing them to be posted? It's good that
this covers messenger as I've read stories in the past of
people being blackmailed over it, but surely there must be a
better way than asking people to upload their private images to
facebook. Plus, what's the case with younger facebook members?
Do they upload the images and are therefore sharing child
pornography with facebook, or not upload and don't get to be
protected from revenge porn postings, when they're one of the
most at-risk demographics?
sp332 - 2 hours ago
I didn't know Facebook had an AI nudity filter, I thought it
was all based on reported images. They softened their stance
on some things, especially after censoring a Pulitzer
Prize-winning photo from Vietnam, and eventually changed their
policy on breastfeeding too:
https://www.facebook.com/help/340974655932193/

I'm sure they don't have a legal exception for underage photos,
but those laws do vary from state to state.
karthikshan - 3 minutes ago
It's probably infeasible to run that type of filter in-path
during uploads without adding too much latency to posting,
whereas the hashing approach is much less expensive. AI
filters likely take down images after they've already been
posted.
dmitrygr - 56 minutes ago
So they basically openly admit that a human will review nudes? How
is this in any way a sane idea? If you were already scarred (for
life, likely) by said revenge porn existing and leaking out
somewhere non-facebook, likely the last thing you want is to be
forced, YOURSELF, to send it to someone (facebook).
skybrian - 33 minutes ago
From the article: "people can already report if their intimate
images have been shared on our platform without their consent".If
it already happened, they don't need to upload it again.
Presumably this new process is for prevention when it didn't
happen yet but they have good reason to suspect that it will.
dmitrygr - 21 minutes ago
"if it already happened on facebook" != "if it already
happened"There is, in fact, a whole world outside of
www.thefacebook.com
marrone12 - 2 hours ago
Why do they need a human representative to hash the image? I don't
understand why this can't all be done automatically. Creeps me out
that a real human would see intimate pictures before they block
them.
leggomylibro - 2 hours ago
Because otherwise the platform will not be able to store or
display images at all. Report, report, report, report, report -
you know someone will do it, or make a bot to.

Or they would need to put actual effort into developing a novel
solution to a new-ish problem. But they're a large incumbent,
so... no.
[deleted]
gadjo95 - 2 hours ago
Because otherwise people will just submit random pictures for the
lulz
BinaryIdiot - 2 hours ago
I'm _assuming_ the only reason a human representative is doing
this is so the system doesn't get abused. Like, say I upload a
photo claiming it is of myself, but it's actually the
advertisement a competitor of mine is using. If they didn't
verify it was something that SHOULD be removed, they'd
automatically remove my competitor's posting of their
advertisement every time they posted it.

Granted, I really wish this was all done on the client so Facebook
didn't gain access to the images themselves, but I'm not sure of a
good way around it to verify the image is something they should
remove and not an abuse of the system.
pjc50 - 2 hours ago
I'd assume that would be dealt with by an appeals process on
the other end; if you post an image and it's flagged, you
should be able to say "this is obviously not porn" and get it
posted and the original false claimer should lose trust points.
BinaryIdiot - 2 hours ago
Problem is, it's significantly easier and faster to create
noise than it is to clean up the noise. Dropping the up-front
review in favor of an appeals-only system sounds impossible
to handle at Facebook scale, IMO. Every time you shut
someone's account down for losing too much trust, 10 others
would have already replaced it.
tantalor - 2 hours ago
> they'd automatically remove my competitor's posting of their
advertisement every time they posted it

Since that will be a very rare case, why not make that the
human-required step?
BinaryIdiot - 2 hours ago
Why would that be a "very rare" case? Facebook and Twitter
have (or have had) hundreds of thousands of bots and constant
abuses, why would this feature be rarely abused?
lasfter - 1 hours ago
Facebook: "Send Nudes"
Apreche - 2 hours ago
Ok, so now all the harassers out there are just going to edit a few
pixels on the images before uploading, so the hash isn't caught.
All the same tricks used on YouTube to avoid the copyright bot will
work here as well.
baddox - 2 hours ago
I'm not saying that people won't be able to fool it, but I'm sure
it would require way more than just changing a few pixels. I've
seen YouTube videos that were clearly edited to circumvent the
copyright bot, and the video transformations were so significant
that the videos were essentially unwatchable.
oh_sigh - 22 minutes ago
md5/sha isn't the only hashing method. There are perceptual
hashes which are fairly resilient to simply editing a few pixels,
or rotating/resizing/cropping an image.
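A minimal sketch of one such perceptual hash, dHash, in browser
TypeScript (an illustration of the technique, not Facebook's or
PhotoDNA's actual algorithm):

  // dHash: shrink to 9x8 grayscale, then record whether each pixel
  // is brighter than its right-hand neighbour, giving 64 bits.
  // Small edits change few or none of the bits.
  async function dHash(file: File): Promise<string> {
    const bitmap = await createImageBitmap(file);
    const canvas = document.createElement("canvas");
    canvas.width = 9;
    canvas.height = 8;
    const ctx = canvas.getContext("2d")!;
    ctx.drawImage(bitmap, 0, 0, 9, 8); // downscaling discards detail
    const { data } = ctx.getImageData(0, 0, 9, 8);
    const gray: number[] = [];
    for (let i = 0; i < data.length; i += 4) {
      gray.push(0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2]);
    }
    let bits = "";
    for (let row = 0; row < 8; row++) {
      for (let col = 0; col < 8; col++) {
        bits += gray[row * 9 + col] < gray[row * 9 + col + 1] ? "1" : "0";
      }
    }
    return BigInt("0b" + bits).toString(16).padStart(16, "0"); // 64-bit hex
  }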
giobox - 1 hours ago
I have no knowledge of the implementation, but I think it's more
than likely a safe bet that someone at Facebook probably thought
of this and chose a slightly more sophisticated approach. The
entire system would be completely pointless otherwise!
BinaryIdiot - 2 hours ago
Yeah I'm curious how their hashing mechanism works. Like, is it a
straight up sha hash of the image or something a little more
sophisticated that wouldn't be fooled by minor edits? Though in
the latter case can that even technically be considered hashing?
I guess it all depends on how they're doing it which brings us
back to my first question...
evgen - 1 hours ago
It is called PhotoDNA and was developed by Microsoft. This is
the same tech that is used to find child porn images even when
people go to various lengths to prevent simple hashing from
being able to determine photo similarity. It works
surprisingly well at the task from what I have seen/heard.
FRex - 1 hours ago
A hash, by the strictest definition, is just a function that maps
any input to a fixed-size output. There is a special subcategory
of hashes that are locality-sensitive and change a little when the
input changes a little, just like there is a special subcategory
of crypto hashes. See:
https://en.wikipedia.org/wiki/Locality-sensitive_hashing

Technology for searching for similar images (I have no idea how it
works internally) already exists and is very widely deployed. You
can easily find an image similar (uncropped, cropped, with a
different filter, greyed, ungreyed, with a logo in the corner,
etc.) to your input image on https://tineye.com/ and
https://images.google.com .
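With a locality-sensitive hash like the dHash sketched above,
matching becomes a Hamming-distance comparison rather than an
equality check; the threshold here is an arbitrary illustrative
choice:

  // Hamming distance between two 64-bit perceptual hashes (hex
  // strings): the number of differing bits. Near-duplicates land
  // close together.
  function hammingDistance(hexA: string, hexB: string): number {
    let diff = BigInt("0x" + hexA) ^ BigInt("0x" + hexB);
    let count = 0;
    while (diff) {
      count += Number(diff & 1n);
      diff >>= 1n;
    }
    return count;
  }

  // Illustrative threshold; a real system tunes this against its
  // false-positive/false-negative rates.
  const isProbablySameImage = (a: string, b: string): boolean =>
    hammingDistance(a, b) <= 10;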
CGamesPlay - 1 hours ago
This is likely the same image hashing they use to detect child
pornography, and there's been a lot of work invested into making
sure that those kinds of edits don't prevent detection. You can
very effectively require that the evading image be visually
entirely dissimilar from the "real" one.

Now, obviously you can't prevent the actual data from being
distributed. For example, you could base64-encode the image and
send it as a series of messages instead.

The motivation for revenge porn is to hurt the other person. One
vector for this is to post it to all of your mutual friends so
that the victim is shamed. This adds a barrier so that the mutual
friends have to invest work into getting and distributing the
porn, and won't accidentally see it flowing through their feed.
orastor - 2 hours ago
To everyone asking why they need a human to review the image before
hashing it: it's currently the only way to prevent abuse of this
system. Without it, everyone would upload random images that would
get taken down, effectively mass-trolling.
Pulcinella - 1 hours ago
At most, they would only need a human to review after the hash
matches, not before.

Facebook's plan is awful.
deusofnull - 1 hours ago
The discussion surrounding this made me think of a feature I'd love
to have along these lines: I'd like to be able to blacklist people
/ certain people from uploading pictures that facebook detects my
face in. Fb as a platform must already have this functionality,
what with the auto face identification and privacy / review post
settings.

I can imagine images of people that aren't "non-consensual intimate
images" that they'd reasonably still like to be able to block being
posted. One use case I can imagine is doxxing prevention. Say there
is a national news story involving some random person in some small
town that exposes the person's face and name and rough location.
Some group of enraged internet denizens start sharing the person's
face and name all over the place as a means to spread their
pitchforks & torches mob against them.

Seems like a reasonable feature to me. Recently a professor in my
state made some remarks about white privilege, and the doxxing of
him was so violent and pervasive that he and his family had to
leave the state for months, and he still gets threats and needs
protection on campus.
rhizome - 1 hours ago
Or they can block display of the photo until all faces approve.
macawfish - 2 hours ago
Something just doesn't add up here.
Xeoncross - 40 minutes ago
The better solution is to have the image hashed and NOT sent to
facebook. If the hash matches an existing image, [A]. If the hash
isn't found, [B].

[A]: a human can review the image they already have

[B]: Facebook waits until someone uploads an image that matches
and then reviews the image (as normal), but with a marker alerting
them to the problem.

The benefit is that people NOT affected won't have to upload lots
of images of themselves to facebook personnel for "review" just to
be sure.

The problem is that should facebook update its hashing or find
better ways to match images, having the original image would allow
them to transition. This point is moot though, since facebook
claims not to save the image anyway.
pishpash - 24 minutes ago
You'll have to guarantee the authenticity of the client-side
code, but yes.

Here's what you do: on the client side, embed the image into a
semantic space (using a NN or whatever), quantize, _then_ hash the
representation. Afterwards you send the hash. If you had to change
the client-side code, you can just ask the user to redo the
process.
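A minimal sketch of that embed-quantize-hash pipeline; embedImage is
a toy stand-in for a real client-side neural network, included only
to keep the sketch self-contained:

  // Toy placeholder embedding derived from the raw bytes. A real
  // system would run an actual model here; this has none of a
  // model's robustness to crops, re-encodes, etc.
  async function embedImage(file: File): Promise<Float32Array> {
    const bytes = new Uint8Array(await file.arrayBuffer());
    const emb = new Float32Array(64);
    bytes.forEach((b, i) => (emb[i % 64] += b - 127.5));
    return emb;
  }

  async function semanticHash(file: File): Promise<string> {
    const embedding = await embedImage(file);
    // Quantize: keep only the sign of each dimension, so
    // perturbations too small to flip a sign leave the
    // representation unchanged.
    const bits = Array.from(embedding, (x) => (x >= 0 ? "1" : "0")).join("");
    // Hash the quantized representation; only this digest is sent.
    const digest = await crypto.subtle.digest(
      "SHA-256",
      new TextEncoder().encode(bits),
    );
    return Array.from(new Uint8Array(digest))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");
  }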
Xeoncross - 20 minutes ago
Yes, I was hoping they would not simply `sha1()` the image
data. I wouldn't worry much about trying to fake hashes since
we are already allowing them to provide whatever image as input
they want (hence the review).
turc1656 - 36 minutes ago
This is dumb. A hash? That's only going to deter the most
incompetent internet users when it comes to a file type that is
subject to alteration without losing its information value
(pictures, video, music, etc.).

All someone has to do is change a single bit/pixel in the image and
the hash will be different. No one will notice that, and it defeats
the hash. Hell, you don't even have to do that. You can just rotate
the image, save it, then rotate it back, and re-save it. The odds
are that the program you used will most certainly not write the
image data in the same exact way.

Hashes would be better for files that cannot be altered without
breaking the contents. Although, these could be subject to
compression, which would create a new hash.

Either way, I think they need to use something similar to Google's
image search that actually examines the photo contents for
similarity.
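The single-bit fragility described here is real for cryptographic
hashes (the replies below explain why perceptual hashes don't share
it); a quick illustration:

  // Avalanche effect: one changed byte yields a completely
  // unrelated SHA-256 digest. Perceptual hashes are designed to
  // avoid exactly this behaviour.
  async function sha256Hex(data: Uint8Array): Promise<string> {
    const digest = await crypto.subtle.digest("SHA-256", data);
    return Array.from(new Uint8Array(digest))
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");
  }

  const original = new Uint8Array([10, 20, 30, 40]);
  const tweaked = new Uint8Array([10, 20, 30, 41]); // one byte changed
  // sha256Hex(original) and sha256Hex(tweaked) share no structure.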
sveiss - 30 minutes ago
In this case, 'hash' doesn't necessarily mean 'a cryptographic
hash of the raw file contents'. There are perceptual hash
techniques which still match after changes to the file, and are
tuned in a similar manner to lossy compression algorithms to
prioritize comparing information relevant to human perceptions.

https://www.phash.org/ is one open-source example.
paulhodge - 30 minutes ago
It's not a technical article. I think it's safe to assume that 1)
the people implementing this aren't idiots, and 2) the "hash" is
probably referring to a "perceptual hash", where very similar
images would show up as the same.
ecopoesis - 28 minutes ago
There are hashes that represent the content of photos, like
Microsoft's PhotoDNA and Cloudinary's pHash. I'm pretty sure
Facebook's engineers are smart enough to know that they can't just
run photos through MD5 and have everything work.
oh_sigh - 23 minutes ago
It could be a perceptual hash. Who knows? The article doesn't go
into detail.