Ministers to look at ban on TikTok and Facebook for children under 16
- A consultation is planned for next month on children’s access to social media
Children under 16 could be banned from social media platforms such as TikTok and Instagram under a fresh crackdown being considered by ministers.
Officials plan a new consultation as soon as next month on children’s access to social media and keeping them safe from harm.
Government sources stressed that while potential bans were part of a wider discussion, they were unlikely to be in final proposals. Measures like requiring social media giants to further beef up parental controls are more likely.
Downing Street yesterday confirmed that ministers were ‘looking more broadly at the issue of continuing to make sure that we keep our children safe online’.
The Online Safety Act was recently passed to increase social media companies’ responsibility for protecting children, including age verification obligations.
But last week social media giant Meta, which owns Facebook and Instagram, announced it will enable end-to-end encryption on one of its messaging platforms.
The National Centre for Missing and Exploited Children in America has warned this will allow communications to go dark in a ‘devastating blow’ to child protection.
Yesterday the National Crime Agency (NCA), known as ‘Britain’s FBI’, warned parents to ‘think very carefully’ about allowing children to use Meta’s platforms because it will become harder to keep children safe online. Also yesterday, Schools Minister Damian Hinds urged Meta to ‘rethink its decision’ amid fears other social media giants will follow suit.
He said: ‘It’s not about protecting people’s privacy… this is really a question about ability to intercept and to ultimately investigate, bring to justice people who are engaging in child abuse.’
NCA chief Graeme Biggar said: ‘We are in ongoing discussions with the Home Office about this. I think the balance needs to shift.’
The NCA estimates up to 830,000 adults in the UK pose a sex risk to children. Watchdog Ofcom suggests a fifth of children aged eight to 17 have an adult online profile.
Ofcom said it was investigating TikTok over accusations that the platform gave the regulator inaccurate information. TikTok blamed a technical problem and said it spotted the issue and told Ofcom, triggering the probe.
Ofcom asked TikTok, Snapchat owner Snap and live video streamer Twitch for information about how they complied with legal requirements to protect children from harmful videos.
It found that while all three platforms have measures to prevent children seeing the videos, children can still sometimes face harm while using these platforms. Ofcom said: ‘We asked TikTok about its parental control system, but have reason to believe the information it provided was inaccurate.’
TikTok, Twitch and Snap all require users to be 13 and over, but it has allegedly been easy for children to enter a false age. OnlyFans, a platform with explicit content, uses facial age estimation, ID checks and other systems to verify users are adults.
A Government spokesman said: ‘We’re looking at empowering parents rather than cracking down on anything in particular and we have identified a gap in research, so we’ll look at what more research into this needs to be done.’