Google’s image crawling illustrates how tech companies can penalize the innocent

Photo: Jeff Chiu / AP

Here is a hypothetical scenario. You are the parent of a small boy. His penis has become swollen because of an infection and it is hurting him. You phone the GP’s surgery and eventually get through to the practice nurse. She suggests you take a photograph of the affected area and email it in so that one of the doctors can have a look.

So you pick up your Samsung phone, take a couple of photographs and send them off. A short time later, the nurse phones to say that the doctor has prescribed antibiotics for your son, which you can pick up from the surgery’s pharmacy. You drive over, collect them, and within hours the swelling starts to subside and your little boy is perking up. Panic over.

Two days later, you find a message from Google on your phone. Your account has been disabled because of “harmful content” that was “a serious violation of Google’s policies and may be illegal”. You click on the “learn more” link and find a list of possible reasons, including “child sexual abuse and exploitation”. Suddenly the penny drops: Google thinks the photographs you sent constituted child abuse!

Not to worry, though – there’s a form you can fill in explaining the circumstances and asking Google to reverse its decision. At that point you discover that you no longer have Gmail, but luckily you have an old email account that still works, so you use that. Now, however, you no longer have access to your diary, your address book and all those work documents you kept on Google Docs. Nor can you access any of the photographs or videos you have ever taken with your phone, because they all reside on Google’s cloud servers, to which your device had thoughtfully (and automatically) uploaded them.

Shortly afterwards, you get Google’s response: the company will not reinstate your account. No explanation is given. Two days later, there is a knock on the door. Outside are two police officers, one male, one female. They are here because you are suspected of holding and transmitting illegal images.

The images were examined by a human reviewer, who decided they were innocent, as did the police. Yet Google stuck by its decision

Nightmare, eh? But at least it’s hypothetical. Except it isn’t: it’s an adaptation to a British context of what happened to “Mark”, a father in San Francisco, as vividly recounted recently in the New York Times by the formidable tech journalist Kashmir Hill. And, at the time of writing, Mark still hasn’t got his Google account back. This being the United States, of course, he has the option of suing Google, just as he has the option of digging his garden with a teaspoon.

The background to this is that tech platforms have, thankfully, become much more assiduous in scanning their servers for child abuse images. But because of the unimaginable numbers of images held on these platforms, scanning and detection has to be done by machine-learning systems, aided by other tools (such as the cryptographic labelling of illegal images, which makes them instantly detectable worldwide).
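That “cryptographic labelling” amounts, in its simplest form, to hash matching: compute a digest of every uploaded file and compare it against a shared list of digests of known illegal images. The sketch below is illustrative only – the blocklist and sample bytes are hypothetical, and it uses a plain SHA-256 digest, whereas production systems such as Microsoft’s PhotoDNA use perceptual hashes designed to survive resizing and re-encoding, which an exact cryptographic hash does not.

```python
import hashlib

# Hypothetical blocklist: the bytes of one "known" image stand in
# for entries that would, in reality, be distributed as digests.
sample = b"...bytes of a known image..."
KNOWN_HASHES = {hashlib.sha256(sample).hexdigest()}

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest is on the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_flagged(sample))            # True: an exact copy matches
print(is_flagged(sample + b"\x00"))  # False: any change to the bytes breaks the match
```

Note the second call: altering a single byte defeats exact hash matching entirely. That is why hash lists only catch previously catalogued images, and why novel photographs – like Mark’s – fall to machine-learning classifiers instead, with the false positives that follow.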

This is all good. The problem with automated detection systems, though, is that they invariably generate a proportion of “false positives” – images that trigger an alert but are in fact innocuous and legal. Often this is because machines are hopeless at understanding context, something that, at the moment, only humans can do. In the course of researching her report, Hill saw the photographs Mark had taken of his son. “The decision to flag them was understandable,” she writes. “They are explicit photos of a child’s genitalia. But the context matters: they were taken by a parent worried about a sick child.”

As a result, most platforms employ people to review problematic images in their contexts and decide whether they warrant further action. The interesting thing about the San Francisco case is that the images were reviewed by a human, who deemed them innocent, as did the police, to whom the images had also been referred. Yet, despite that, Google stuck by its decision to suspend his account and rejected his appeal. It can do this because it owns the platform and anyone who uses it has clicked their agreement to its terms and conditions. In that respect it is no different from Facebook/Meta, Apple, Amazon, Microsoft, Twitter, LinkedIn, Pinterest and the rest.

This arrangement works well so long as users are happy with the services and the way they are provided. But the moment a user decides they have been mistreated or abused by the platform, they fall into a legal black hole. If you are an app developer who feels gouged by Apple’s 30% levy as the price of selling in its store, you have two options: pay up or shut up. Likewise, if you have been selling profitably on Amazon’s Marketplace and suddenly find that the platform is now offering a cheaper comparable product under its own label, well… tough. Sure, you can complain or appeal, but in the end the platform is judge, jury and executioner. Democracies would not tolerate this in any other area of life. So why are tech platforms an exception? Isn’t it time they weren’t?

What I have read

An image too big?
There is an interesting critique by Ian Hesketh in the digital magazine Aeon, titled What Big History Misses, of how Yuval Noah Harari and co compress human history into a single narrative for everyone.

1-2-3, gone …
The Passing of the Password is a nice obituary for the password by digital identity guru David GW Birch on his Substack.

A warning
Gary Marcus has written an elegant critique on his Substack of what’s wrong with Google’s new robot project.
