Artificial Intelligence May ‘Normalize’ Child Sexual Abuse as Graphic Images Emerge Online: Experts

The National Crime Agency (NCA), Britain’s lead agency for fighting organised crime, warned this week that the proliferation of AI-generated explicit images of children has a “radicalising” effect that “normalises” paedophilia and abusive behaviour towards children.

“We assess that the viewing of these images, whether real or AI-generated, materially increases the risk of offenders moving on to sexually abusing children themselves,” NCA Director General Graeme Biggar said in a recent report.

AI ‘DEEPFAKES’ OF INNOCENT IMAGES FUEL SPIKE IN SEXTORTION SCAMS, FBI WARNS

National Crime Agency Director General Graeme Biggar at a meeting of the Northern Ireland Policing Board at James House, Belfast, on June 1, 2023. (Photo by Liam McBurney/PA Images via Getty Images)

The agency estimates that up to 830,000 adults, or 1.6% of the UK’s adult population, pose some form of sexual danger to children. That figure is 10 times higher than the UK’s prison population, according to Biggar.

Most child sexual abuse cases involve the viewing of images, according to Biggar, and with the help of AI, creating and viewing sexual images could “normalise” the abuse of children in the real world.

ARTIFICIAL INTELLIGENCE CAN DETECT ‘SEXTORTION’ BEFORE IT HAPPENS AND HELP FBI: EXPERT

“[The estimated figures] partly reflect a better understanding of a risk that has historically been underestimated, and partly a real increase caused by the radicalising effect of the internet, where the widespread availability of videos and images of children being abused and raped, and groups sharing and discussing the images, has normalised such behaviour,” Biggar said.

Artificial intelligence illustrations are seen on a laptop with books in the background in this illustration photo from July 18, 2023. (Photo by Jaap Arriens/NurPhoto via Getty Images)

A similar explosion in the use of AI to create sexual images of children is taking place in the United States.

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” Rebecca Portnoff, director of data science at Thorn, a nonprofit that works to protect children, told The Washington Post last month.

CANADIAN SENTENCED TO PRISON FOR AI-GENERATED CHILD PORNOGRAPHY: REPORT

“Victim identification is already a needle-in-a-haystack problem, where law enforcement is trying to find a child in harm’s way,” she said. “The ease of using these tools is a significant shift, as is the realism. It just makes everything harder.”

Popular AI platforms that can create images from simple text prompts have community guidelines meant to prevent the creation of disturbing photos.

Teenager in a room. (Getty Images)

These platforms are trained on millions of images from across the internet that serve as building blocks for the AI to create convincing depictions of people or places that do not exist.

LAWYERS BRACE FOR AI’S POTENTIAL TO UPEND COURT CASES WITH FALSE EVIDENCE

Midjourney, for example, requires PG-13 content and bans “nudity, sexual organs, fixation on bare breasts, people in showers or on toilets, sexual imagery, fetishes.” DALL-E, OpenAI’s image-generation platform, allows only G-rated content, prohibiting images that show “nudity, sexual acts, sexual services, or content otherwise meant to arouse sexual excitement.” Even so, bad actors discuss workarounds to create disturbing images, according to various reports on AI and sex crimes.

Police with 911 sign. (Getty Images)

Biggar noted that AI-generated images of children also throw police and law enforcement into a maze of having to distinguish fake images from those of real victims who need help.

“The use of AI for child sexual abuse will make it harder for us to identify real children who need protecting, and will further normalise abuse,” the NCA director general said.

AI-generated images can also be used in sextortion scams, about which the FBI issued a warning last month.

Deepfakes typically involve editing videos or images of people to make them look like someone else using deep learning AI, and have been used to harass victims, including children, or extort money from them.

FBI WARNS AI DEEPFAKES ARE BEING USED TO CREATE ‘SEXTORTION’ SCHEMES

“Malicious actors use content manipulation technologies and services to exploit photos and videos, typically captured from an individual’s social media account, the open internet, or requested from the victim, and turn them into sexually themed images that appear true to life in likeness to a victim, then circulate them on social media, public forums or pornographic websites,” the FBI said in June.


“Many victims, including minors, are unaware their images were copied, manipulated and circulated until it was brought to their attention by someone else.”
