
SAN FRANCISCO, CA – A San Francisco man claims that his life was “ruined” after Google’s AI flagged photographs he had taken to document his toddler son’s illness for his doctor during the COVID-19 pandemic – when remote contact with medical professionals was commonplace – as child sexual abuse material, and the internet giant reported him to authorities, triggering a police investigation.
The New York Times reported on the incident involving a man in his 40s, identified only as “Mark,” who was investigated by police after he took a series of pictures of his son’s genitals in February 2021 in order to track the progression of a serious-looking infection the child had developed.
In the midst of the pandemic, virtual doctor’s consultations often supplemented in-person office visits for safety reasons. At the recommendation of his doctor’s nurse, Mark uploaded a series of photos of his son’s infection to share with his doctor via Google’s cloud service. The doctor remotely diagnosed the infection based on the photos and prescribed antibiotics that cleared up the condition; at that point, Mark thought the matter was over and done with.
However, due to the nature of the pictures – and the fact that Mark’s hand could be seen in one of them – the artificial intelligence governing Google’s cloud flagged the images and disabled Mark’s account for “harmful content” that could be related to “a severe violation of Google’s policies and might be illegal.”
Google’s AI is programmed to scan for content that could be considered child sexual abuse material (CSAM); when such content is detected, a human content moderator reviews the flagged material, determines whether it violates the law, and – as required – refers the matter to the authorities.
“I knew that these companies were watching and that privacy is not what we would hope it to be,” Mark said. “But I haven’t done anything wrong.”
Mark, who has worked in the Information Technology industry, suddenly found himself locked out of all of his Google-related services – including his emails, contacts, documents, and phone – bringing his professional life to a screeching halt. He attempted in vain to appeal the decision, but Google wouldn’t budge. Shortly afterwards, the San Francisco Police Department began an investigation into him.
After several months, Mark was cleared of all wrongdoing, but by that point law enforcement had combed through his internet searches, location history, messages, and every photo and video he had digitally stored on Google’s cloud. Despite being exonerated, Google not only refused to reinstate Mark’s account but informed him that it would be permanently deleted; Mark considered legal action, but decided the price tag was not worth it.
Mark’s photos were not considered illegal or exploitative by the police; nonetheless, Google’s AI – which Jon Callas of the Electronic Frontier Foundation called “intrusive” and said lacks the refinement to distinguish CSAM from harmless content – not only put a massive dent in Mark’s professional life, but nearly cost him custody of his son.
“A family photo album on someone’s personal device should be a private sphere,” Callas said. “This is precisely the nightmare that we are all concerned about. They’re going to scan my family album, and then I’m going to get into trouble.”