Google suspended all the Google accounts of a father who took a nude photo of his son for medical reasons, and he also faced a police investigation.

Using artificial intelligence to detect illegal content on the Internet is common practice among large technology companies such as Google. However, this can create problems, as was recently demonstrated when the company's systems classified photos taken with an Android phone as child sexual abuse material and reported them to the authorities.

However, this was a mistake: as the person who took the photo explained, it was a picture of his son's perineum that was going to be sent to the doctor so he could diagnose and prescribe treatment for an infection the child had in that area.

On top of the complications this situation created, Google also suspended all of the person's accounts without giving him any chance to explain the context of the image.

This situation has reignited the debate over how much access large companies like Google have to users' personal information, despite their claims that their systems are minimally invasive.

According to The New York Times, the incident happened in February 2021, at the height of the pandemic, when many medical offices were not seeing patients due to restrictions.

Mark, identified only by his first name to protect his identity, said that when he noticed swelling in his son's perineum, he contacted his doctor and, at the nurse's request, took a photo before the video consultation.

Two days later, Mark received a notice from Google stating that all of his Google accounts had been suspended for alleged "harmful content" that was "a serious violation of Google policy and may be illegal."

This meant the man lost access to his contacts, email, photos, and even his phone number. In December, it was also reported that the San Francisco Police Department had opened an investigation against him, which later established his innocence.

Faced with this situation, Google responded that the company only scans users' images if they choose to back up their phones' content.

In these cases, they can access the material and run it through their AI filters.

"Child Sexual Abuse Material (CSAM) is disgusting and we are committed to preventing it from being shared on our platforms," Google spokeswoman Christa Muldoon said in a statement.

Author: Julian Castillo
Source: La Opinion
