Gaze on this erotic image.
The current state of computer detection of pornography is a bit primitive: it keeps mistaking desert photos for images of naked people. If I stare hard at it for a while, I guess I can sort of see it — it’s all those reclining curves, I think.
From this we learn that AI is not only unable to distinguish people from bags of sand, but also that it’s more than a little racist.
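For the curious: first-generation porn detectors often just counted skin-toned pixels, and desert sand sits squarely inside the usual skin-tone color ranges, which is exactly how a dune gets flagged. Here is a minimal sketch of that kind of heuristic (the thresholds and function names are illustrative guesses, not any real product's):

```python
# Minimal sketch of a first-generation "porn detector": count the pixels
# that fall inside a crude skin-tone color range. Desert sand falls in
# the same range, which is why dunes get flagged. The thresholds are
# illustrative guesses, not any real product's values.
from PIL import Image

def skin_pixel_ratio(path):
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())

    def looks_like_skin(rgb):
        r, g, b = rgb
        # Naive rule of thumb: warm, reddish-tan colors. Sand matches too.
        return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

    skin = sum(1 for p in pixels if looks_like_skin(p))
    return skin / len(pixels)

def classify_image(path, threshold=0.4):
    # If enough of the frame is "skin", call it porn. Hence: dunes = porn.
    return "porn" if skin_pixel_ratio(path) > threshold else "ok"
```

Run that on a dune photo and a beach portrait and you will likely get the same verdict for both.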
militantagnostic says
I have emails from my (Pressure Transient Analysis) business flagged as porn at the receiving end, because you can’t spell analysis without anal.
bcwebb says
Wow, that scene looks really hot!
Oggie. My Favourite Colour is MediOchre says
Hey! It’s my favourite colour!
militantagnostic says
bcwebb @2
and gritty
hunter says
I was at a new (for me) coffee shop the other day doing some surfing, and discovered that their filter was banning Hullabaloo, a respected liberal political blog, as “porn”.
I guess to some people, it is.
nomadiq says
Not to say that AI technology is never racist, but this is not necessarily an example of AI technology being racist.
Detecting sand color and concluding it’s porn is not an example of the AI thinking that only sand-colored (white?) people do pornography. That doesn’t logically follow at all. If the AI failed only on desert-sand colors, you might conclude it had been trained on porn of one particular race, and that would be racist. But failing to tell the difference between sand and porn is not racist if the pornographic images the AI was trained on included sand-colored people.
To conclude otherwise is letting your biases get the better of you, or really not understanding how AI technology works. Neither will improve AI technology or accurately establish where it fails and where we need to be cautious with it.
Snarki, child of Loki says
See, if you had a GIF of the Grand Teton Mountains, it would have been totally okay, right?
methuseus says
I remember when the censoring software on the Internet at my high school wouldn’t let us get to Webcrawler. Never did figure out what it was classified as, since I don’t remember if it told us.
KG says
militantagnostic@1,
The inhabitants and businesses of Scunthorpe, Lincolnshire, have similar problems.
KG says
Hah! I tried to post a response to #1, pointing out that the inhabitants and businesses of S****thorpe in Lincolnshire had similar problems, and FTB’s filter rejected it! Why beholdest thou the mote that is in thy brother’s eye, but considerest not the beam that is in thine own eye, PZ?
KG says
Sorry, one too many asterisks in the forbidden town name @9.
davidc1 says
@10 Who put the bop in the bop de bop de bop?
Who put the… well, you know the rest.
Kagehi says
Yeah. If it was just mistaking deserts… But it flags everything from potted plants to artistic works, whether the latter “actually” violate the rules or not. And the latest BS censorship AI was only put in place to sanitize Tumblr, which is an artist haven (one of a tiny few), so that it could be sold off to a big corporation, all of whom are, apparently, beholden to the moral minority and scared to death that someone might write letters.
But the unbelievably stupid thing about it, other than the fact that they previously showed no interest in actively pursuing detection of, say, racists, or worse, is that half the freaking people using the site are artists, who don’t really have any other similar place to post at the moment and who make up a significant part of the user base.
The minimum being suggested at this point is that, on the day of stupidity, everyone log out and stay logged out for at least 24 hours. The likely more effective option, though also damaging to all those artists still holding on by the skin of their teeth because the bloody AI hasn’t yet accidentally flagged them as posting adult pictures, is to leave them without the user base they are trying to sell off and claim represents the value of the service; i.e., delete your account before they find some AI-driven excuse to delete it themselves.
I mean, seriously, one of the posts “recently” flagged by this absurd system was, if I remember, a photo of starving kids from one of those “feed the hungry” campaigns, where it “decided” there was too much skin showing (not white skin, thankfully, just too much skin) and thus it “had to be” porn. How many years of this nonsense have we gone through, with places like search engines accidentally flagging everything from women’s health sites, to foreign aid initiatives, to, heck, in another recent case, a bloody “box” with a superhero on it (must have been flagged as body painting or something???), and they still haven’t grasped the concept: “This shit doesn’t work!”
Rich Woods says
I wonder what the software makes of the view from Mexico City?
Ragutis says
larpar says
Because I’m not sure, and correct me if I am wrong, but I think I see a… Nipple.
whywhywhy says
Do you think the AI’s kink is related to silicon being a key component of both computer chips and sand? Kind of getting back to the womb type of thing or not wanting to see your parents naked?
richardelguru says
@9 & 10
https://en.wikipedia.org/wiki/Scunthorpe_problem
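For anyone who doesn’t click through: the whole failure is naive substring matching. A quick sketch of the idea (the block list and function names are made up for illustration):

```python
import re

# Sketch of the naive substring filter behind the Scunthorpe problem: it
# scans for banned strings anywhere in the text, so "Analysis" and
# "Scunthorpe" both trip it. The block list is illustrative only.
BANNED = ["anal", "cunt"]

def is_blocked(text):
    lowered = text.lower()
    return any(bad in lowered for bad in BANNED)

print(is_blocked("Pressure Transient Analysis"))     # True (false positive)
print(is_blocked("Scunthorpe Chamber of Commerce"))  # True (false positive)

# Matching whole words instead avoids these particular false positives:
def is_blocked_wordwise(text):
    return any(re.search(rf"\b{re.escape(bad)}\b", text, re.IGNORECASE)
               for bad in BANNED)

print(is_blocked_wordwise("Pressure Transient Analysis"))  # False
```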
ardipithecus says
@ 16
And some breast implants. The connection is obvious.
kudayta says
It took me about 5 seconds to see that image as a naked person. There’s a dead pixel on my monitor, and it was right in the top-middle part of the darker sand dune. Made it appear as a bellybutton.
I should probably see a therapist.
Lofty says
Ooooh, early morning sand dunes make me …. moist.
laurian says
If AI is this bad at nekkid pictures I shudder at the nascent MGTOW sexbot industry.
weylguy says
That image is disgusting and should be taken down immediately! Oh, I know that some SCOTUS justice once said that pornography is in the eyes of the beholder, but this blatant sickness has just got to stop!
Shame on you, Myers! I thought your photos of buck-naked spiders were over the top, but this is really too much!
brucej says
Our new antivirus/web blocking software has, for some reason, blocked my dog’s vet practice as an ‘Adult/Sexual’ site. I have no idea why…
brucej says
@22 Here yah go http://slurmed.com/fgrabs/01acv09/01acv09_027.jpg
anthrosciguy says
I guess The 40-Year-Old Virgin was on to something.
chrislawson says
How about a little NSFW warning before posting that filth?
NelC says
As one wag on my tumblr feed said, “Send dunes”.
madtom1999 says
@1 Wouldn’t that make your handwriting a bit wobbly, to say the least?
madtom1999 says
@13 This shit does work, in the sense that it can pick out stuff with close-to-human accuracy, at much greater speed and massively lower cost. Humans make mistakes too, and they can also get seriously fucked up from looking at some of these images, so I quite like the idea of machines doing the first pass of the job. Over time, if there is a reasonable error-handling process and people report back problems with images like this, the model can be reassessed and retrained to further reduce the rejections we find silly.
These systems are still shit, but putting humans in charge of censorship means exposing those humans to potential harm. And humans are already bigoted and mean and nasty, and can cause as much damage as an AI with ease.
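To make that report-back-and-retrain loop concrete, here is a rough sketch of the kind of pipeline being described; every name and detail in it is hypothetical:

```python
from dataclasses import dataclass, field

# Rough sketch of the feedback loop described above: the model does the
# first pass, humans review only the appeals, and confirmed false
# positives are queued as labelled data for the next retraining run.
# All names and structure here are hypothetical.
@dataclass
class ModerationQueue:
    retraining_set: list = field(default_factory=list)

    def first_pass(self, image_id, model_says_porn):
        # The machine does the bulk filtering; humans never see clear cases.
        return "hidden" if model_says_porn else "published"

    def handle_appeal(self, image_id, human_says_porn):
        # A human reviews only the contested items.
        if not human_says_porn:
            # Confirmed false positive (e.g. a sand dune): keep it with the
            # corrected label so the next training run learns from it.
            self.retraining_set.append((image_id, "not_porn"))
            return "restored"
        return "stays hidden"

queue = ModerationQueue()
queue.first_pass("dune_photo_001", model_says_porn=True)      # wrongly hidden
queue.handle_appeal("dune_photo_001", human_says_porn=False)  # human fixes it
print(queue.retraining_set)  # [('dune_photo_001', 'not_porn')]
```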
davem says
Someone’s been feeding the A.I. program with The English Patient. In the film, there are lots of fades from sleeping women to desert landscapes with the same contours.