Tech Giants Face Downing Street Grilling Over Child Safety Online

April 13, 2026 · Dayn Calham

Social media executives from Meta, Snap, YouTube, TikTok and X are being summoned to Downing Street on Thursday for a high-stakes meeting with Prime Minister Sir Keir Starmer and Technology Secretary Liz Kendall over online safety for children. The tech bosses will be questioned about the measures they are taking to protect young users and respond to parents’ concerns, as the government continues its review of whether to impose an outright ban on social media for under-16s, following Australia’s lead. Sir Keir has emphasised that the meeting will centre on ensuring “social media companies accept and demonstrate responsibility”, warning that “the consequences of failing to act are stark” and that the government has a duty to parents and the next generation to put children’s safety first.

The Number 10 Showdown

Thursday’s meeting constitutes a pivotal moment in the government’s push to hold tech giants to account for their role in safeguarding vulnerable young users. It comes at a delicate juncture: just hours earlier, Parliament dismissed calls for a complete ban on social media for those under 16, despite backing from the House of Lords. Instead of introducing a broad prohibition, MPs voted to give ministers authority to establish their own restrictions, signalling the government’s preference for a more tailored regulatory approach rather than a sweeping legislative ban.

The scheduling of the Downing Street summit reflects the government’s determination to appear decisive on online safety whilst navigating complex political and commercial pressures. Professor Gina Neff of the University of Cambridge’s Minderoo Centre for Technology and Democracy said the meeting allows the government to show it is taking the initiative on online harms. Downing Street has already acknowledged that some platforms have made progress, deploying measures such as turning off autoplay for children by default and giving parents greater control over device usage, though commentators contend considerably more must be done.

  • Tech executives questioned on child safety protections and responses to parental concerns
  • Ministers considering a ban on social media for under-16s, modelled on the Australian approach
  • MPs rejected full ban but gave ministers powers to implement controls
  • Some companies already introduced protections like turning off autoplay for younger users

Parliamentary Rejection and the Broader Debate

Wednesday evening’s Commons vote dealt a significant blow to campaigners advocating a comprehensive social media ban for under-16s, marking the second time MPs have dismissed such proposals despite considerable backing from the House of Lords. The government’s decision to favour ministerial discretion over legislative action reflects a more cautious strategy, with ministers arguing that an outright ban would be premature given continuing policy discussions. This approach gives the administration flexibility to design tailored controls rather than implementing a blanket prohibition that some worry could prove difficult to enforce and oversee effectively across various platforms.

The rejection has heightened debate about whether the UK is doing enough to safeguard its young people from online harms. Whilst the government argues that giving ministers authority to set customised regulations is a more sensible solution, critics assert this approach lacks the decisive intervention the situation demands. Recent research from Australia, where a ban on social media for under-16s was implemented in December 2025, reveals that over 60 per cent of minors continue to use the platforms regardless, raising serious questions about the efficacy of legal prohibitions and suggesting the challenge extends well beyond simple restrictions.

Multi-Party Criticism

The parliamentary vote has provoked sharp criticism from opposition benches. Conservative shadow education secretary Laura Trott accused Labour MPs of failing parents and children by rejecting the ban, contending that other nations are recognising social media’s dangers whilst the UK falls behind under the current government. Liberal Democrat education spokeswoman Munira Wilson echoed these reservations, stating that “the time for incremental steps is over” and insisting on immediate intervention to restrict the most damaging platforms for young users rather than gradual policy tweaks.

Australia’s Cautionary Tale

Australia’s experience with social media restrictions offers a cautionary case study for policymakers considering similar measures in the UK. When the country introduced a ban on online platforms for those under 16 in December 2025, it was celebrated as a landmark step in safeguarding young people from digital risks. However, emerging research from the Molly Rose Foundation paints a concerning picture: more than 60 per cent of underage Australians keep using social media platforms in spite of the legal ban. This substantial rate of non-compliance indicates that legislative bans alone may prove inadequate in stopping determined young users from accessing the services they want.

The Australian findings carry considerable implications for the UK’s ongoing policy discussions. If a similar ban were implemented in Britain, the evidence indicates enforcement would pose formidable challenges, with young people likely finding ways to bypass age-verification systems and restrictions through various technical means. The data undermines arguments that a simple legislative prohibition is a silver-bullet solution to online safety concerns, instead highlighting the need for a more comprehensive approach combining regulatory measures, platform accountability, parental oversight tools, and digital literacy education to meaningfully address the risks young people face online.

Key Finding | Implication
Over 60% of underage Australians still access social media despite the ban | Legislative prohibitions alone cannot effectively prevent determined young users from accessing platforms
Ban introduced in December 2025 has failed to achieve widespread compliance | Enforcement mechanisms remain weak and young people find workarounds to restrictions
Blanket bans do not address the underlying appeal of social media to young people | A multi-faceted approach combining regulation, platform accountability, and education is necessary

Subject Matter Experts Push for Substantive Measures

Child safety advocates and online protection specialists have stepped up demands for tech companies to take meaningful action beyond voluntary measures. The Molly Rose Foundation, established in memory of 14-year-old Molly Russell, who died by suicide after viewing harmful content online, has been especially outspoken in calling for structural reform. Rather than sweeping prohibitions that prove hard to police, campaigners argue the priority should shift towards holding platforms accountable for the systems that push dangerous material to vulnerable users.

Andy Burrows, head of the Molly Rose Foundation, has emphasised that Thursday’s Downing Street meeting represents a critical moment for government action. The charity has repeatedly maintained that social media companies possess the technical capability to introduce robust safeguards, yet often prioritise engagement figures over user welfare. Experts emphasise that genuine protection requires platforms to redesign their algorithmic recommendations, improve moderation practices, and give parents practical tools to monitor their children’s online activity effectively.

The Algorithm Problem

At the heart of these concerns are the algorithmic systems that determine what content young users see. These algorithms are engineered to maximise engagement, often promoting sensational, harmful, or addictive content to at-risk groups. Reforming these systems is one of the most pressing challenges in online safety, requiring transparency from platforms about how their recommendation engines operate and what safeguards exist.

  • Algorithms emphasise engagement over the safety and wellbeing of users
  • Platforms should enhance openness regarding how content is recommended
  • Third-party audits of algorithmic harms are essential for accountability

The Next Steps

Thursday’s summit at Downing Street will set the tone for the government’s approach to online child safety in the months ahead. Following the meeting, Sir Keir Starmer and Liz Kendall are expected to set out their conclusions and decide whether existing voluntary commitments from tech companies are adequate or whether stronger legislative action is needed. The government remains partway through its consultation on whether to introduce an Australia-style ban on social media for under-16s, with the outcome of these discussions likely to shape the final policy direction.

Ministers have signalled their preference for taking powers to impose restrictions rather than introducing an outright ban, citing concerns over practical enforcement and effectiveness. However, mounting pressure from opposition parties, child safety advocates, and parents suggests the government may face continued demands for firmer measures. The coming weeks will be crucial in determining whether digital platforms can demonstrate genuine commitment to safeguarding young people or whether Parliament will enact legislation to force compliance with stricter safety standards.