
“Google closed my account for ‘sexual content’. But they still won’t tell me what it was, and I lost everything.”

Five years ago, after the death of a friend and bandmate, David Barberá decided to pay for a Google Drive cloud account. He wanted to keep music files so that his friend’s children could hear how their father played. “So I signed up for the Google Drive service,” he says. “It seemed the safest way to make sure Javi’s music would not be lost; the children were very young then,” he adds.

Barberá, 46, a high school teacher, had not foreseen a key detail, however: Google’s terms of service hide a guillotine that deactivates accounts when the company detects prohibited content, including sexual material involving children or terrorism. “The only thing I can think of is that I downloaded something I shouldn’t have, like movies downloaded back in the days of eMule. Could it be child pornography or terrorism? It’s possible,” Barberá, who lives in Valencia, told EL PAÍS in a long phone conversation.

Barberá did not know what had happened; he has only been able to piece things together by reading forums and press articles. The teacher describes a desperate, helpless struggle to understand what went on. In July, he needed some music files he kept on old hard drives. To organize them, he started uploading everything to his Drive account, for which he still pays monthly for 2 terabytes of space. Minutes into the process, Google deactivated his account with a message saying it had found “harmful content”.

He then started several complaint processes, answered emails from apparent Google employees asking for further details (Nahuel, Rocío, Laura), called every company phone number he could find without ever reaching a human, asked a reporter he knows for help, and finally even managed to chat with an apparent Google employee, who asked for “patience.”

Sexual content

From this whole process he got only one concrete answer: a message sent to his wife’s email address (which he had previously added as a secondary account), with this confusing text: “We believe your account contains sexual content that may violate Google’s Terms of Service and may also be prohibited by law,” it begins, then continues: “We have removed this content” and “If you continue to violate our policies, we may terminate your Google Account.” This happened on August 26, and although it sounds like a warning, the account is still disabled.

“I’ve had everything there for the past 14 years, and for five years I’ve only had it there,” he says, meaning he keeps no copies on external drives. The loss of a Google account does not only mean the disappearance of photos and videos. Barberá also lost his class materials, a blog he kept and his YouTube account, plus the services he had contracted with that email address, from Amazon to Netflix to a German music application: “Now I have to renew it, but how do I explain that yes, it’s me, but it’s not me, because of pedophilia or terrorism… They will love it,” he says wryly.

The New York Times published two similar cases in the United States in August. Google told the reporter that the “banned” images were photos of children’s genitals that two parents had taken to send to the pediatrician about a skin problem. When EL PAÍS asked the same question, Google replied that it could not provide this information because he is a European user, and that it would share it only with him. Barberá has yet to receive any details.

Google offered this newspaper “background” conversations with employees, which in journalistic jargon means the reporter cannot identify the interlocutors or quote their words verbatim. According to the company, which insisted it was not referring to this case, emails citing “sexual content” are only sent in cases of child abuse, not adult pornography. Why, then, a sentence that implies “do not do it again”? Google did not specify, beyond saying that it all depends on what was in the account. A Google employee asked whether this newspaper would name the affected user, but did not say why he wanted to know.

EL PAÍS found three other cases similar to Barberá’s: two more involving Google accounts and one involving Microsoft. All are from 2022, and in only one was the account returned to its owner, after a while. That one, moreover, was not about alleged child sexual images, but about a problem with a password that was never clarified either.

The other three users consulted by EL PAÍS remain in the limbo of large companies, which are in fact too small to manage more than a billion accounts.

A friend at Google

Another victim, who asked that his name not appear because his company may have Google as a client, turned to “a close friend” who works for the company in Spain. The friend does not work in a department related to content moderation, but he did some internal digging into what happens in these cases. His answer was not encouraging: these cases are handled overseas, and there was no telling whether anyone would actually read the complaint. He gave him little hope.

As in Barberá’s case, this user’s account was deactivated after he uploaded 40 gigabytes of WhatsApp photos, videos and conversations that he kept on his hard drive. The transfer was so conspicuous that his company’s cybersecurity staff called to ask what was going on. Google does not specify when or how it analyzes its users’ accounts, but both in the New York Times cases in the US and in these two Spanish ones, it happened when file movement was detected: in the Spanish cases, massive transfers of data.

The third victim has handed her case against Microsoft to lawyer Marta Pascual, who is preparing a lawsuit. Her client is desperate because she has lost data from her private life and also from her work: “Her master’s degree work from IESE, taxes, photos of her children’s births and work databases. She is in pain,” says Pascual, who sees no way out other than filing a complaint. “A judge could find that her right to privacy was violated, although I have not found any case law,” she adds.

Pascual’s client believes the suspicious files came from WhatsApp groups, whose content was saved and downloaded automatically. The three people involved have children and, although they do not recall taking pictures for a pediatrician, they had the typical images of children in the bathtub, in bed or at the swimming pool.

Microsoft also does not give details

Microsoft consistently gives even less information than Google. It only sends a few statements about how it combats child sexual abuse in its systems: “First, we fund research to better understand how criminals abuse technology. Second, we develop technology like PhotoDNA to detect cases of child sexual exploitation. Third, our team of agents promptly investigates reports of non-compliant content and removes it. And fourth, we work with other technology companies and law enforcement to report crimes.”

As with Microsoft, in the conversation this newspaper had with Google, the confidence in its detection systems was remarkable. Either its software has been refined, or it finds more and more false positives: between July and December 2021 it suspended 140,868 accounts, almost double the figure for the first half of 2020.

Google scans accounts for child sexual abuse material with two technologies. Images that are already known carry a numerical code that identifies them; if Google’s systems find images matching those codes, the account is deactivated. This is the PhotoDNA system cited by Microsoft.
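The known-image step amounts to comparing a file’s fingerprint against a blocklist of codes. The sketch below is illustrative only: real systems like PhotoDNA use proprietary perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash used here matches only byte-identical files, and the blocklist entry is a made-up placeholder.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known prohibited images.
# (Placeholder value: the SHA-256 of the bytes b"known-image".)
KNOWN_HASHES = {hashlib.sha256(b"known-image").hexdigest()}

def fingerprint(data: bytes) -> str:
    """Return a hex digest identifying the file's exact contents."""
    return hashlib.sha256(data).hexdigest()

def matches_blocklist(data: bytes) -> bool:
    """True if the file's fingerprint appears in the known list."""
    return fingerprint(data) in KNOWN_HASHES
```

A match against the list is what triggers automatic deactivation; files not on the list fall through to the second system described below.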

The problem is new images. For those, Google has built a second, computer-vision system that interprets images and assigns each a probability of depicting child sexual abuse. Flagged images then go, in theory, to human reviewers who decide whether a photo crosses the sexual threshold. The company is now also concerned about material that teenagers create in ordinary sexual exploration, which can be taken out of that context.
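The two-stage flow described above — a classifier score followed by human review — can be sketched like this. The threshold value and function names are assumptions for illustration; the article only says that, in theory, flagged images go to human reviewers who make the final call.

```python
def route_image(score: float, threshold: float = 0.8) -> str:
    """Route an image based on a classifier's abuse-probability score.

    A score at or above the (hypothetical) threshold sends the image
    to a human reviewer; the classifier alone does not decide.
    """
    return "human_review" if score >= threshold else "no_action"
```

For example, `route_image(0.95)` would be queued for a reviewer, while `route_image(0.1)` would pass untouched. The false positives described in this article arise when the classifier scores an innocent photo highly and the review stage fails to correct it.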

Google has also consulted pediatricians, for example, so that its software can tell when a teenager’s body is already adult. Even the pursuit of objectivity for a worthy purpose can claim many innocent victims.

The thin red line

The same thin red line applies to typical photos of children in swimming pools or other innocent settings. Google fears that, taken out of context or retouched, these photos could end up in collections shared among pedophiles. Google says it focuses on so-called egregious cases, but this newspaper has published an article about teenage athletes and girls whose bodies were used on YouTube for dubious purposes, and those videos remain online.

Users affected by these suspensions could one day receive another kind of call: from the police. “I have a friend in the national police, and I called him to tell him about the case. He said he would look into it with the computer crimes unit,” explains Barberá, the Valencian teacher. “They told him they didn’t know of any cases like mine.” It is likely, however, that such cases have reached Spain through the big companies’ reporting obligations: Google and Microsoft must report suspicious findings to the National Center for Missing and Exploited Children (NCMEC) in the United States, and it is the center that informs the Spanish police.

NCMEC sent 33,136 reports to Spain in 2021. Law enforcement sources confirm that this is the usual process and that they may receive reports containing one or just a few images; these cases are generally not investigated. In any case, the police do not tell Google or Microsoft that a given person is not a suspect. The companies make their own decisions, which depend on the affected user’s ability to justify the presence of the detected material. For that, however, the companies must say what the material is, which does not always happen. And it is likely that if, in their judgment, the files found are extremely serious or numerous, there is no longer any possibility of recourse.

Rubén Losada is a journalist who found himself locked out of his account over a problem entering his password. For some reason, he explains, Google decided it wasn’t him. Losada was on a trip in Tenerife and urgently needed a bus; for some reason Google asked for his password. He got it wrong several times and then tried to set a new one, and was locked out. Losada believes the change of location and his failed attempts triggered the block.

Although his case began differently from the rest, the helplessness and the wall he ran into were similar. Like the others, Losada had paid his bill and still could not find anyone to talk to. He considered going to court or anywhere else; there was no way he was giving up his account, he says. Every two or three weeks he tried again. After six months he could suddenly access it again, without knowing why: “An acquaintance who is a security analyst told me that sometimes these systems are programmed like that, and that after six months they reset,” he says.
