How sharing photos of kids unintentionally helps paedophile platforms

Publish date: Sat, 24 Jul 2021, 01:55 PM

When parents share photos of their children on Instagram, they usually don't think about all the people who could potentially see the pictures.

Cybercrime experts say these include, above all, paedophiles who secretly steal photos of children and re-upload them on dubious portals - where harmless everyday pictures are placed in a sexual context.

Especially during the summer holiday season, these dark networks are likely to welcome a wealth of new picture material from beaches and swimming pools.

"It is understandable that parents and grandparents want to post photos of their children and grandchildren - they want to share their joy. Unfortunately, however, the internet is the least suitable space for this," says Joachim Tuerk, who lobbies for protection of children in Germany.

There is ultimately no privacy and no control over what happens to the photos and videos once they've been spotted by someone in the pedo-crime scene, a vast web where people are always looking for new images, Tuerk explains.

The stolen photos are made available for purposes "we don't want to imagine," says Tuerk, who would prefer to remain silent about what the children's photos in these forums are used for.

Nevertheless, he explains: "Imagine that the pictures end up on paedophile websites, and strangers comment on them in great detail about how exactly they would like to inflict sexualised violence on your children. Then you would hope that no clues about your home have been stolen as well."

But it is not only the dubious sites in the dark corners of the internet that worry child rights activists.

On the video platform YouTube, for example, users can place everyday pictures of children in a sexual context by compiling videos into sexualised playlists, according to internet watchdogs.

"By means of a combination of sexualising adjectives (sexy, cute, hot, horny) and inconspicuous terms relating to age, size or activities (young, small, gymnastics), such playlists were found via the YouTube search function," according to a report from German internet watchdog Jugendschutz.net.

Scenes with minors in swimwear or gymnastic bodysuits were combined with erotic adult videos. This makes it easier for paedophiles to access such depictions and turns minors into victims of sexualisation.

As a safe countermeasure, the experts recommend configuring the default settings so that videos are not made public or indiscriminately distributed.

"It's a good idea, for example, to turn off the function allowing your own videos to be added to playlists of others."

Cybercrime specialist Thomas-Gabriel Ruediger finds not only publicly shared pictures problematic, but also parents spreading "vulnerable information" about their children: Where do they regularly go out to eat? What does their flat look like? What pets do they have?

"Using this contextual information, in the worst case, children can also be identified by perpetrators and possibly also directly contacted," says the expert.

For Joachim Tuerk, the danger behind unsuspectingly disseminated images of children does not begin outside one's own social sphere, but often already in a much narrower circle.

"All studies say that sexualised violence against children is mostly carried out in the so-called close social circle. By family, relatives, friends. And I'm not talking about 'so-called' friends, as they are commonplace on Facebook, Instagram and co," says Tuerk, who rather recommends a digital picture gallery on the tablet at home or a self-made photo book instead of using chat groups.

"Photos are often considered an entry ticket or takeaway to access paedophile hangouts on the darknet, and they're available online with just a click of the mouse."

Ruediger is already looking to the future and sees an additional source of danger in the constant improvement of smartphone technology.

The ever-improving resolution of images, for example, already means that biometric data such as fingerprints can be read from photos. "In addition, facial recognition software is also constantly improving and there is also artificial ageing software, even for private users," says the expert.

A child's picture that is shared publicly today could therefore lead to the child being found "fully automatically" even when he or she is older.

"This can mean that the child is deprived of the possibility of developing its own digital identity, or even no digital identity at all, even at a very young age."

All of this is only the current state of the art, says Ruediger, whose prognosis for the future does not sound optimistic: "We don't even know yet all the things that may in future be extracted from existing images."

 - dpa
