Make verified ID a requirement for opening a social media account.

Submitted by Katie Price on Friday 19th February 2021

Published on Friday 5th March 2021

Current status: Closed

Closed: Sunday 5th September 2021

Signatures: 696,980

Petition Details

Make it a legal requirement, when opening a new social media account, to provide a verified form of ID. Where the account belongs to a person under the age of 18, verify the account with the ID of a parent/guardian, to prevent anonymised harmful activity and provide traceability if an offence occurs.

Additional Information

My son Harvey is disabled. He is also the kind and gentle son of a person regularly in the public eye. The Online Harms Bill doesn’t go far enough in making online abuse a specific criminal offence and doing what ‘Harvey’s Law’ intended. Making the law work requires the removal of anonymity, to ensure that users cannot cause harm by using online platforms to abuse others. Where an offence has taken place, offenders ought to be easily identified, reported to the police and punished. We have experienced the worst kind of abuse towards my disabled son and want to make sure that no one can hide behind their crime.


Government Response

The Government responded to this petition on Wednesday 5th May 2021

The Online Safety legislation will address anonymous harmful activity. User ID verification for social media could disproportionately impact vulnerable users and interfere with freedom of expression.

The government recognises concerns linked to anonymity online, which can sometimes be exploited by bad actors seeking to engage in harmful activity. However, restricting all users’ right to anonymity, by introducing compulsory user verification for social media, could disproportionately impact users who rely on anonymity to protect their identity. These users include young people exploring their gender or sexual identity, whistleblowers, journalists’ sources and victims of abuse. Introducing a new legal requirement, whereby only verified users can access social media, would force these users to disclose their identity and increase the risk of harm to their personal safety.

Furthermore, users without ID, or users who are reliant on ID from family members, would face a serious restriction of their online experience, freedom of expression and rights. Research from the Electoral Commission suggests that there are 3.5 million people in the UK who do not currently have access to a valid photo ID.

The online safety regulatory framework will have significant measures in place to tackle both illegal anonymous abuse and legal but harmful anonymous abuse. Services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content, including criminal anonymous abuse. Major platforms will also need to set out clearly what legal anonymous content is acceptable on their platform and stick to it. The government will set out priority categories of legal but harmful material in secondary legislation.

Users will also be better able to report harmful content, and can expect to receive an appropriate response from the company. This may include, for example, the removal of harmful content, or sanctions against offending users. Compliance with the online safety framework will be enforced by Ofcom, which will have a suite of powers to use against companies that fail to fulfil the duty of care. These include fines on companies of up to £18m or 10% of annual global turnover, as well as business disruption measures. The Online Safety Bill, which will give effect to the regulatory framework outlined in the full government response, will be ready this year.

Protecting children is at the heart of our plans to transform the online experience for people in the UK, and the strongest protections in this framework will be for children. All companies in scope will be required to assess whether children are likely to access their services, and if so, provide additional protections for them. They will be required to assess the nature and level of risk of their service specifically for children, identify and implement proportionate mitigations to protect children, and monitor these for effectiveness. We expect companies to use age assurance or age verification technologies to prevent children from accessing services which pose the highest risk of harm, and to provide children with an age-appropriate experience when using their service.

The police already have a range of legal powers to identify individuals who attempt to use anonymity to escape sanctions for online harms, where the activity is illegal. The government is also working with law enforcement to review whether the current powers are sufficient to tackle illegal anonymous abuse online. The outcome of that work will inform the government’s future position in relation to illegal anonymous online abuse.

The Government has also asked the Law Commission to review existing legislation on abusive and harmful communications. The Law Commission has consulted on proposed reforms and a final report is expected in the summer. We will carefully consider using the online harms legislation to bring the Law Commission’s final recommendations into law, where it is necessary and appropriate to do so.

Anonymity underpins people’s fundamental right to express themselves and access information online in a liberal democracy. Introducing a new legal requirement for user verification on social media would unfairly restrict this right and force vulnerable users to disclose their identity. The Online Safety legislation will address harmful anonymised activities online and introduce robust measures to improve the safety of all users online.

Department for Digital, Culture, Media and Sport

This is a revised response. The Petitions Committee requested a response which more directly addressed the request of the petition. You can find the original response towards the bottom of the petition page: https://petition.parliament.uk/petitions/575833

Parliamentary Debate

This petition has reached the threshold for a Parliamentary debate, but the debate has not yet been scheduled.
