Consideration of Bill, as amended, on re-committal, in the Public Bill Committee
[Relevant documents: Second Report of the Petitions Committee of Session 2021-22, Tackling Online Abuse, HC 766, and the Government response, HC 1224; Letter from the Chair of the Women and Equalities Committee to the Minister for Tech and the Digital Economy regarding Pornography and its impact on VAWG, dated 13 June 2022; Letter from the Minister for Tech and the Digital Economy to the Chair of the Women and Equalities Committee regarding Pornography and its impact on VAWG, dated 30 August 2022; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account; e-petition 582423, Repeal Section 127 of the Communications Act 2003 and expunge all convictions; e-petition 601932, Do not restrict our right to freedom of expression online.]
We have discussed this issue already in this Chamber. I thank charities and campaigners such as the National Society for the Prevention of Cruelty to Children for raising awareness and for being constructive and assiduous. I also thank the families who, through voicing their own pain and suffering, have given impetus to this issue. I thank those on the Front Bench; it is fair to say that I have had constructive dialogue with the Minister and the Secretary of State. They listened to our concerns and accepted that this issue had to be addressed.
As we debate this new clause and other aspects of the Bill, we should begin as we did last time by thinking of those who face tragedy and distress as a result of accessing inappropriate content online. Children and vulnerable people have been failed by tech companies and regulation. We have the duty and responsibility to step up and tighten the law, and protect children from online harms, exploitation and inappropriate content. That must be at the heart and centre of a lot of the legislation—not just this Bill but going forward. Throughout the various debates, and at Committee stage, we have touched on the fact that technology is evolving and changing constantly. With that, we must keep on building upon insights.
New clause 2 does simple and straightforward things. It makes senior managers liable and open to being prosecuted for failing to proactively promote and support the safety duties in clause 11. As it stands, the Bill’s criminal liability provisions fall short of what is expected or required. Criminal liability for failing to comply with an information notice from Ofcom is welcome. Ofcom has a very important role to play—I do not need to emphasise that any more. But the Bill does not go far enough, and Ministers have recognised that. We must ensure that all the flaws and failings are sanctionable and that the laws are changed in the right way. It is not just about the laws for the Government Department leading the Bill; it cuts across other Government Departments. We have touched on that many times before.
More than 80% of the public agree that senior tech managers should be held legally responsible for preventing harm to children on social media. That is a statement of the obvious, as we have seen such abhorrent and appalling harms take place. Around two thirds want managers to be prosecuted when failures result in serious harm. But harm can happen prior to an information notice being issued by Ofcom—again, we have discussed that.
The public need assurances that these companies will have the frameworks and safeguards to act responsibly and be held to account so that children and vulnerable individuals are protected. That means meaningful actions, not warm words. We should have proactivity when developing the software, algorithms and technology to be responsive. We must ensure that measures are put in place to hold people to account, and that sanctions cover company law, accountability, health and safety and other areas. Ireland has been mentioned throughout the passage of this Bill. That is important. My colleagues who will speak shortly have also touched on similar provisions.
It is right that we put these measures in the Bill for the serious failures to protect children. This is a topical issue. In fact, a number of colleagues met tech companies and techUK yesterday, as did I. We have an opportunity to raise the bar in the United Kingdom so that technology investment still comes forward and the sector continues to grow and flourish in the right way and for the right reasons. We want to see that.
We know that the internet is magnificent and life-changing in so many ways, but the dark corners remain a serious concern with regard particularly to children, but also to scores of other vulnerable people. Of course, the priorities of this Bill must be to protect children, to root out illegal content, and to hold the online platforms
to account and ensure they are actually doing what they say they are doing when it comes to the dangerous content on their sites. I warmly welcome the Minister and the Secretary of State’s engagement on these particular aspects of the Bill and how they have worked really hard to strengthen it.

This legislation is so vital for our children. The National Society for the Prevention of Cruelty to Children has estimated that more than 21,000 online child sex crimes have been recorded by the police just in the time this legislation has been delayed since last summer.
I want to focus on new clause 2. I have said before, and am happy to repeat it, that the individual criminal liability provided for in the Bill as it stands is too limited. Attaching it to information offences only means that, in effect, very bad behaviour cannot be penalised under the criminal law as long as the perpetrator is prepared to provide Ofcom with information about it. That cannot be sensible, so there is a strong case for extending criminal liability, but new clause 2 goes too far. There are, fundamentally, two problems with new clause 2.
First, new clause 2 is drafted too broadly. It would potentially criminalise any breach of a safety duty under clause 11, the clause relating to children. We all,
of course, think that keeping children safer online is a core mission of the Bill. I hope Ministers will consider favourably various other amendments that might achieve that, including the amendments in the name of the noble Baroness Kidron, which the hon. Member for Pontypridd (Alex Davies-Jones) mentioned earlier, in relation to coroners and all services likely to be accessed by children. Clause 11 covers a variety of different duties, including duties to incorporate certain provisions in terms of service and to ensure that terms of service are clear and accessible. Those are important duties no doubt, but I am not convinced that any and all failures to fulfil them should result in criminal prosecution.

This issue is also relevant and linked to the wider debate around legal but harmful that we have had today and in the recommittal Committee, because if we are going to have criminal sanctions for non-compliance, we need to be really clear what companies are supposed to do. It needs to be really clear to them what they have to do as well. That is why, when the Joint Committee produced its report, we recommended that the legal but harmful provisions in the Bill should be changed. They do not do what many people in the House have asserted they do, which is to set standards and requirements for companies to remove legal content. They were never there to do that. They provided risk assessment for a wider range of content, and that may have been helpful, but they did not require the removal of content that was neither a breach of the community standards of the platform nor a breach of the legal threshold.
The changes to the Bill help in some ways with the idea of having criminal liability because written on to the face of the Bill are the offences that are within scope, what the companies have to do and also the requirement to enforce their own terms of service where the safety standards are defined not by law but by the platform. Safety standards are important, and there is sometimes a danger in this debate that we pretend they do not really exist. That is understandable, because companies are not very good at enforcing them. They are not very good at doing the things they say they will do. As a former board member of the Centre for Countering Digital Hate, I am pleased to hear that organisation being cited so often in the debate. Its chief executive gave evidence to the Joint Committee, in which he said that if there was one thing it could do, it would be to ensure that companies enforced their own terms of service. He said that if there were a legal power to make them do that, many of the problems we are discussing would go away. That is a very important sanction.
On the point around smaller platforms, in reality Ofcom has the power to enforce safety standards at the level set in the Bill on any platform of any size. If smaller platforms were out of scope, enforcement could be taken against them only on the basis of their terms of service, and platforms like that are likely to have very weak or practically non-existent terms of service. That is why having the minimum safety standards based in law is so important.
With regard to advertising, my hon. Friend and constituency neighbour the Member for Dover (Mrs Elphicke) has an amendment relating to immigration offences that are promoted through advertising. The additional amendment that the Government are accepting, banning the promotion of conversion therapy through advertising, is also important.
The reason I am keen to highlight these points today stems from a tragic case in my constituency, which I have raised in the House on more than one occasion. Joe Nihill, a popular former Army cadet, was aged 23 when he took his own life after accessing dangerous, suicide-related content online. As I have mentioned previously, his mother, Catherine, and his sister-in-law, Melanie, have run a courageous campaign to ensure that, when this legislation becomes law, what happened to Joe does not happen to others.
For much of the passage of the Bill, I have been heartened. In particular, speaking to the previous Front-Bench Government team, it felt like we were going in the right direction, but perhaps not as quickly as we
would like. However, the Government amendments mean that we are now heading in the wrong direction. Joe’s mother and sister-in-law are heartbroken at the Government’s current direction of travel on the Bill in relation to protecting adults from harmful but legal content. I urge the Minister to think again, because Government amendments have gutted harmful but legal protections for adults. Reckless amendments mean that sites will not even have to consider the risk that harmful but legal content poses to adult users on their platform. As I have said, Bills are meant to get better as they go through Parliament. With the Government’s amendments, we have seen the opposite happen.

Research from the Samaritans shows that just 16% of people think that access to potentially harmful content on the internet should be restricted only for children. As I have said, my constituent, Joe, was 23. We all know that it is false to presume that people stop being vulnerable at the age of 18. There are so many vulnerable adults in our society, and there are also people who become vulnerable when they see these things online—when they are dragged down this online rabbit hole of dangerous, harmful content.
The importance of including harmful but legal content is clear. Content that is legal but undoubtedly harmful includes information, instructions and advice on methods of self-harm and suicide, and material that portrays self-harm and suicide as desirable. Crudely removing protections from harmful content at 18 years of age leaves vulnerable people exposed to potentially fatal content.
As we have heard today, individual filters are simply not enough to protect vulnerable people. The Government have set out that it is up to individuals to filter legal but harmful content, but, often, people experiencing suicidal thoughts will look for ways to take their own life and actively seek out harmful content.
In conclusion, the truth is that the Government have ignored the real-world expertise of groups such as the Samaritans in order to put first the interests of tech giants, as well as those on the Tory Back Benches who put so-called freedom of speech ahead of the safety of people like Joe from my constituency, who took his own life at the age of 23.
I hope to see further work on this Bill in the other place to ensure that vulnerable adults are given the protection that they deserve. That was Joe’s parting wish in the letter that he left to his family—that what happened to him would not happen to others. Let us not lose this opportunity. Let us improve the Bill. The other place has a vital role to play in ensuring that the Bill improves and protects everybody.
There is an obligation on us to protect children, especially lone children who find themselves not in the protection of social services, either here or abroad, but in the hands of evil people smugglers and people traffickers. I hope that whatever our differences may be across this House on how open or otherwise our borders and migration system should be, we should be united in
compassion, concern and action for children and young people in the snare of this wicked criminal activity. That is what my amendment 82 seeks to ensure.

Turning briefly to other amendments, new clause 2 seeks to hold senior managers to account. I am grateful to my hon. Friends the Members for Stone (Sir William Cash) and for Penistone and Stocksbridge (Miriam Cates) for their excellent work on this. I was somewhat disappointed to read the comments, repeated today by the hon. Member for Pontypridd (Alex Davies-Jones), that it is some kind of weakness for Government to agree to amendments. I particularly wanted to comment on that in relation to new clause 2.
In deciding to support new clause 2, I was persuaded by the remarks of the right hon. Member for Barking (Dame Margaret Hodge) in the previous Report stage. I am grateful to her for the strength of her comments and their persuasive nature. It is our job here in this House to make sure that we consider and make responsible amendments. That is what those of us on the Government Benches have sought to do. I am very pleased that the Government have moved in relation to new clause 2, and it is important to recognise that it shows the confidence and strength of leadership of the Prime Minister, his Ministers, the Culture Secretary and Ministers in her Department and the Home Office, as well as the Solicitor General, that they will work with us to ensure that the Bill is stronger yet.
Finally, I turn to amendment 83 in the same spirit. I was moved by the personal account and the comments made by my right hon. Friend the Member for Chelmsford (Vicky Ford) on Report, and that is why I lent my support to her amendment. She has made a powerful case that it is important to protect children, but also to recognise, as has been said, that as children turn 18 they may still be extremely vulnerable and in need of support. I thank her for that, and I know that a number of Members feel likewise.
In conclusion, I thank the Culture Secretary and the Minister, my hon. Friend the Member for Sutton and Cheam (Paul Scully), for their engagement to date and for the commitment made in the written ministerial statement to strengthen the Bill in relation to the prevention of modern slavery and illegal immigration, including for the protection of children. On that basis, I confirm that I will not be moving amendment 82 later today.
I really welcome the plans to introduce measures to strengthen individual criminal liability for directors of tech companies. The concerns raised by some that that will deter investment in the UK or result in a tech exodus are absolute nonsense. That has not taken place in Ireland, which still hosts many leading industry headquarters. I am a former tech entrepreneur, part of the founding team of one of the UK’s largest software publishers, and I assure the House that this forward-leaning legislation and regulation that requires innovation to solve compliance and user problems is exactly what drives the engineers to do what they do best: to solve
problems and develop solutions, in the interests of their customers, that are valuable across a host of industries and sectors.

The new measures will cement the UK’s role as a world leader in this space and underpin our ability to continue to play a leading role in the software industry. Where we lead, others will follow. They will also give our thought leaders the opportunity to develop bespoke solutions as well as ensure that children’s ages are verified robustly and that disgusting child sex abuse material is removed and does not proliferate.
On violence against women and girls, sensible and workable plans have been set out to make coercive and controlling behaviour a priority offence and to make platforms take that stuff down without women and girls having to contact them every single time. I welcome the work on creating codes of practice.
Considering the challenges that surround this area, the Bill does a really good job of protecting and upholding the freedom of speech that we hold dear in our democracy. As a feminist, I need to be able to express my view, protected under the Equality Act, that biological sex is immutable. I should not be hounded off the internet or threatened with violence for stating that view. At the same time, we should all seek to support and improve the experiences of transgender people. We can do both at the same time. We must have a nuanced, balanced and compassionate debate.
We are in an era where our discussion forums have become polarised. We are crossing new frontiers but we cannot accept the status quo. Our democracy depends on this.
Proceedings interrupted (Programme Order, 5 December, and Standing Order No. 24(7)),
The Deputy Speaker put forthwith the Questions necessary for the disposal of the business to be concluded at that time (Standing Order No. 83E).
New Clause 4
Safety duties protecting adults and society: minimum standards for terms of service
“(1) OFCOM may set minimum standards for the provisions included in a provider’s terms of service as far as they relate to the duties under sections 11, [Harm to adults and society risk assessment duties], [Safety duties protecting adults and society], 12, 16 to 19 and 28 of this Act (“relevant duties”).
(2) Where a provider does not meet the minimum standards, OFCOM may direct the provider to amend its terms of service in order to ensure that the standards are met.
(3) OFCOM must, at least once a year, conduct a review of—
(a) the extent to which providers are meeting the minimum standards, and
(b) how the providers’ terms of service are enabling them to fulfil the relevant duties.
(4) The report must assess whether any provider has made changes to its terms of service that might affect the way it fulfils a relevant duty.
(5) OFCOM must lay a report on the first review before both Houses of Parliament within one year of this Act being passed.
(6) OFCOM must lay a report on each subsequent review at least once a year thereafter.”—(Alex Davies-Jones.)
Brought up.
Question put, That the clause be added to the Bill.
Clause 5
Overview of Part 3
Amendment made: 1, in clause 5, page 4, leave out lines 41 and 42.—(Paul Scully.)
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Clause 6
Providers of user-to-user services: duties of care
Amendments made: 2, in clause 6, page 5, line 15, leave out “, (3) and (4)” and insert “and (3)”.
This amendment removes a reference to what was subsection (4) of clause 18, as that provision has been moved to clause 65.
Amendment 3, in clause 6, page 5, line 26, leave out paragraphs (a) and (b).—(Paul Scully.)
This amendment is consequential on the removal of clauses 12 and 13 of the Bill as amended on Report.
Clause 10
Children’s risk assessment duties
Amendment made: 4, in clause 10, page 8, line 38, leave out from “8” to “)” in line 40.—(Paul Scully.)
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Clause 12
User empowerment duties
Amendment proposed: 43, in clause 12, page 12, line 24, leave out “made available to” and insert
“in operation by default for”.—(Kirsty Blackman.)
Question put, That the amendment be made.
Clause 36
Codes of practice about duties
Amendment made: 5, page 38, line 6, leave out paragraph (c).—(Paul Scully.)
This amendment is consequential on the removal of clause 13 of the Bill as amended on Report.
Clause 46
Duties and the first codes of practice
Amendment made: 6, page 45, line 23, leave out paragraph (c).—(Paul Scully.)
This amendment is consequential on the removal of clause 13 of the Bill as amended on Report.
Clause 56
Regulations under sections 54 and 55: OFCOM’s review and report
Amendments made: 7, page 54, line 11, leave out ‘or 55’.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 8, page 54, line 15, leave out sub-paragraph (ii).
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 9, page 54, line 18, leave out ‘individuals’ and insert ‘children’.—(Paul Scully.)
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Clause 89
OFCOM’s register of risks, and risk profiles, of Part 3 services
Amendments made: 10, page 79, line 14, leave out paragraph (d).
This amendment is consequential on the removal of the adult safety duties.
Amendment 11, page 79, line 31, leave out subsection (6).
This amendment is consequential on the removal of the adult safety duties.
Amendment 12, page 80, line 5, leave out ‘or (d)’.
This amendment is consequential on the removal of the adult safety duties.
Amendment 13, page 80, leave out lines 15 and 16.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 14, page 80, leave out lines 20 and 21.—(Paul Scully.)
This amendment is consequential on the removal of the adult safety duties.
Clause 90
OFCOM’s guidance about risk assessments
Amendments made: 15, page 80, line 36, leave out subsection (4).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 16, page 81, leave out lines 13 and 14.—(Paul Scully.)
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Clause 197
Parliamentary procedure for regulations
Amendment made: 17, page 162, line 26, leave out paragraph (b).—(Paul Scully.)
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Clause 205
“Harm” etc
Amendments made: 18, page 169, line 35, leave out ‘or adults’.
This amendment is consequential on the removal of the adult safety duties.
Amendment 19, page 169, line 35, leave out ‘or adults (as the case may be)’.—(Paul Scully.)
This amendment is consequential on the removal of the adult safety duties.
Clause 208
Index of defined terms
Amendments made: 20, page 173, leave out line 16.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 21, page 174, leave out lines 6 and 7.
This amendment removes the reference in the index to the “maximum summary term for either-way offences”, as that term no longer appears in the Bill.
Amendment 22, page 174, leave out lines 24 and 25.
This amendment is consequential on the removal of clause 55 of the Bill as amended on Report.
Amendment 23, page 175, line 13, at end insert—
“restricting users’ access to content (in Part 3) | section 52”.—(Paul Scully.) |
This amendment adds a definition of “restricting users’ access to content” to the index of defined terms.
Schedule 3
Timing of providers’ assessments
Amendments made: 24, page 189, line 37, leave out paragraph 6.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 25, page 191, line 10, leave out ‘to 14’ and insert ‘and 13’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 26, page 191, line 18, leave out sub-paragraph (3).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 27, page 191, line 22, leave out ‘to 14’ and insert ‘and 13’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 28, page 191, line 39, leave out paragraph 14.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 29, page 192, line 14, leave out ‘or paragraph 6’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 30, page 192, line 15, leave out
‘, CAA or adults’ risk assessment’
and insert ‘or CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 31, page 192, line 19, leave out ‘, 17 or 18’ and insert ‘or 17’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 32, page 192, line 21, leave out ‘and paragraph 6 apply’ and insert ‘applies’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 33, page 192, line 41, leave out paragraph 18.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 34, page 193, line 10, leave out paragraph (b).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 35, page 193, line 13, leave out
‘, a CAA or an adults’ risk assessment’
and insert ‘or a CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 36, page 193, line 25, leave out
‘, a CAA or an adults’ risk assessment’
and insert ‘or a CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 37, page 193, line 27, leave out ‘or paragraph 6.’
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 38, page 193, line 39, leave out paragraph (c).
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 39, page 193, line 41, leave out
‘, CAA or adults’ risk assessment’
and insert ‘or CAA’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 40, page 193, line 43, leave out ‘or paragraph 6’.
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Amendment 41, page 194, leave out lines 17 and 18.—(Paul Scully.)
This amendment is consequential on the removal of clause 12 of the Bill as amended on Report.
Schedule 4
Codes of practice under section 36: principles, objectives,
content
Amendment made: 42, page 198, line 19, leave out paragraph (c).—(Paul Scully.)
This amendment is consequential on the removal of clause 13 of the Bill as amended on Report.
Third Reading