Mobile camera rolls don't seem to take the existence of nudes into account, as anyone who has ever stumbled across an odd penis while scrolling through a friend's device can tell you. And as we saw during the 2014 Celebgate hack, photos stored online using services like iCloud can be vulnerable to breaches.

She teamed up with Chen, whom she had met at an entrepreneurship program, and an Armenian developer named Edgar Khanzadian. Together they built Nude, which uses machine learning to scan your camera roll for nudes automatically. The app also has the makings of a sustainable business model: it will charge users a dollar a month for the service.

Of course, the big platforms could go after this market themselves, if they wanted to.
For example, individuals who are dating might send each other nude pictures.
Unfortunately, this means that, unless the Legislature amends the statute, any minor involved in sexting continues to be at risk of felony charges.
However, as the ACLU has explained in a letter to prosecutors in Washington, prosecutors can and should exercise their discretion to avoid treating teen sexting incidents the same as adult exploitation of children.
(This only works for photos in the first release, so you’ll need to manually import any sensitive amateur films that may be on your camera roll.) When Nude finds what it believes to be nude photos, it moves them to a private, PIN-protected vault inside the app.
(Chiu said Nude would monitor your camera roll in the background; in my experience, it's more reliable to simply open Nude, which triggers a scan.) After showing you a confirmation dialog, the app deletes any sensitive files that it finds — both from the camera roll and from iCloud, if the photos are stored there as well.
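The scan-and-vault flow described above can be sketched in a few lines. This is a minimal illustration, not the app's actual code: `looks_nude` is a hypothetical stand-in for Nude's on-device machine-learning classifier, which has not been published, and the "vault" here is just a separate directory.

```python
import shutil
from pathlib import Path


def looks_nude(photo: Path) -> bool:
    """Hypothetical stand-in for the app's ML classifier.

    For illustration only: flags any file with 'private' in its name.
    The real app would run an image-classification model here.
    """
    return "private" in photo.stem.lower()


def scan_camera_roll(camera_roll: Path, vault: Path) -> list[Path]:
    """Scan every photo; move flagged ones into the PIN-protected vault.

    Moving the file both stores it in the vault and removes it from the
    camera roll, mirroring the app's confirm-then-delete step.
    """
    vault.mkdir(parents=True, exist_ok=True)
    moved = []
    for photo in sorted(camera_roll.glob("*.jpg")):
        if looks_nude(photo):
            dest = vault / photo.name
            shutil.move(str(photo), dest)
            moved.append(dest)
    return moved
```

In the real app, the vault is PIN-protected and the deletion also propagates to iCloud after the user confirms; neither step is modeled in this sketch.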
But then they might have to acknowledge the rampant trading of nudes — something that, so far, they have been loath to do. “Under the surface,” Chen says, “we’re all human beings.” And human beings in 2017 are sending lots of naked photos.