Despite platform’s limits on adult content, study finds it not only accessible but often suggested
TikTok has directed children’s accounts to pornographic content within a small number of clicks, according to a report by a campaign group.
Global Witness set up fake accounts using a 13-year-old’s birth date and turned on the video app’s “restricted mode”, which limits exposure to “sexually suggestive” content.
Researchers found TikTok suggested sexualised and explicit search terms to seven test accounts that were created on clean phones with no search history.
With this kind of thing there needs to be more help than that, because there isn’t social consensus (yet) around keeping kids away from personalised-algorithm-controlled services. That means there is intense pressure on parents to let their kids use TikTok, Instagram and so on, and if they resist that pressure they’re forcing their child into isolation from the socialising that happens on these platforms.
There are two acceptable options I can see, but both require societal change, which probably can’t happen without top-down intervention:
(It may be that option 1 is the only acceptable one, but I haven’t seen anything robust on the harms so far.)
“Forcing age responsibility” (if I understand that…) onto parents is no use if they do the right thing and make sure their kid’s account is marked as under 18, but it then receives mature content anyway.
How about we don’t let people add to society until there is a “consensus”: you can have kids when you get your consensus card.
What?