Report (2nd Day)
Relevant documents: 3rd Report from the Constitution Committee, 9th and 12th Reports from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
Clause 90: Duties of the Commissioner in carrying out functions
Amendment 38
Moved by
38: Clause 90, page 113, line 15, at end insert “in accordance only with the Commissioner’s duties under section 108 of the Deregulation Act 2015 (exercise of regulatory functions: economic growth).”
Member’s explanatory statement
This amendment ensures that the Commissioner’s duty to have regard to the desirability of promoting innovation is referable only to the duty imposed under section 108 of the Deregulation Act 2015. This amendment seeks to ensure that the Commissioner’s status as an independent supervisory authority for data protection is preserved given that such status is an essential component of any EU adequacy decision.
Amendment 38 withdrawn.
Amendment 39 not moved.
Amendment 40
Moved by
40: Clause 90, page 113, line 20, after “children” insert “merit specific protection with regard to their personal data because they”
Member's explanatory statement
This amendment adds an express reference to children meriting specific protection with regard to their personal data in new section 120B(e) of the Data Protection Act 2018 (Information
Commissioner’s duties in relation to functions under the data protection legislation). See also the amendment in my name to Clause 70, page 78, line 23.
Amendment 40 agreed.
Amendments 41 to 43 not moved.
Amendment 44
Moved by
44: After Clause 92, insert the following new Clause—
“Code of practice on Children’s Data and Education
(1) The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on the processing of data in connection with the provision of education.
(2) Guidance under subsection (1) must include consideration of—
(a) all aspects of the provision of education, including learning, school management and safeguarding;
(b) all types of schools and learning settings;
(c) the need for transparency and evidence of efficacy in the use of AI systems in the provision of education;
(d) the impact of profiling and automated decision-making on children’s access to education opportunities;
(e) the principle that children have a right to know what data about them is being generated, collected, processed, stored and shared;
(f) the principle that those with parental responsibility have a right to know how their children’s data is being generated, collected, processed, stored and shared;
(g) the safety and security of children’s data;
(h) the need to ensure children’s access to and use of counselling services and the exchange of information for safeguarding purposes are not restricted.
(3) In preparing a code or amendments under this section, the Commissioner must have regard to—
(a) the fact that children are entitled to a higher standard of protection than adults with regard to their personal data, as set out in the UK GDPR and the ICO’s Age Appropriate Design Code;
(b) the need to prioritise children’s best interests and to uphold their rights under the UN Convention on the Rights of the Child and General Comment 25;
(c) the fact that children may require different protections at different ages and stages of development;
(d) the need to support innovation to enhance UK children’s education and learning opportunities, including facilitating testing of novel products and supporting certification and the development of standards;
(e) ensuring that the benefits from products and services developed using UK children’s data accrue to the UK.
(4) In preparing a code or amendments under this section, the Commissioner must consult with—
(a) children,
(b) educators,
(c) parents,
(d) persons who appear to the Commissioner to represent the interests of children,
(e) the AI Safety Institute, and
(f) the relevant education department for each nation of the United Kingdom.
(5) The Code applies to data processors and controllers that—
(a) are providing education in school or other learning settings;
(b) provide services or products in connection with the provision of education;
(c) collect children’s data whilst they are learning;
(d) use education data, education data sets or pupil data to develop services and products;
(e) build, train or operate AI systems and models that impact children’s learning experience or outcomes;
(f) are public authorities that process education data, education data sets or pupil data.
(6) The Commissioner must prepare a report, in consultation with the EdTech industry and the other stakeholders set out in subsection (4), on the steps required to develop a certification scheme under Article 42 of the UK GDPR, to enable the industry to demonstrate the compliance of EdTech services and products with the UK GDPR, and conformity with this Code.
(7) Where requested by an education service, evidence of compliance with this Code must be provided by relevant providers of commercial products and services in a manner that satisfies the education service’s obligations under the Code.
(8) In this section—
“EdTech” means a service or product that digitises education functions, including administration and management information systems, learning and assessment, and safeguarding, including services or products used within school settings and at home on the recommendation, advice or instruction of a school;
“education data” means personal data that forms part of an educational record;
“education data sets” means anonymised or pseudonymised data sets that include education data or pupil data;
“efficacy” means that the promised learning outcomes can be evidenced;
“learning setting” means a place where children learn, including schools, their homes and extra-curricular learning services, for example online and in-person tutors;
“pupil data” means personal data about a child collected whilst they are learning which does not form part of an educational record;
“safety and security” means that it has been adequately tested;
“school” means an entity that provides education to children in the UK, including early years providers, nursery schools, primary schools, secondary schools, sixth form colleges, city technology colleges, academies, free schools, faith schools, special schools, state boarding schools, and private schools.”
Member’s explanatory statement
This amendment proposes a statutory Code of Practice on Children and Education to ensure that children benefit from heightened protections when their data is processed for purposes relating to education. Common standards across the sector will assist schools in procurement.
Without slipping into issues that we will debate in a moment, it is important to record here that on page 19 of the AI consultation the Government have proposed including works created by children in the course of their education—for example, essays, art, science products and musical creations—as part of their proposal to make IP-protected works freely available to AI web scrapers and other AI interests under their data mining exception rules. My understanding of the proposal is that they mean freely available in both senses. Those who specialise in the area of education are very shocked by this suggestion. One wrote to me and said:
“children go to school for the state to enable their right to education, not to enable their exploitation for data mining. This is an absolute no”.
I want the Minister to explain whether this is now the price of a school-based education. Is this a decision the Government have made?
I cannot see a reasonable way for a child to opt out of such an arrangement, which is at odds with current advice that, I note, was updated only last week and says:
“It is recommended that personal data is not used in generative AI tools”;
and that
“Schools and colleges must not allow or cause students’ original work to be used to train generative AI models unless they have permission, or an exception to copyright applies … Exceptions to copyright are limited, and settings may wish to take legal advice to ensure they are acting within the law”.
This is advice to teachers that they could not possibly implement. It is a giveaway of a child’s right to privacy, and with it their safety and their autonomy. I have been inundated on this exact point, so I really would be grateful if the noble Lord would explain the extent to which this is going to happen and what boundaries the Minister sees to this outcome.
Would the Minister also say whether he is confident that the ICO’s consultation on edtech will be meaningful? I say this in light of the Online Safety Act, which has enraged dozens of organisations, large and small, that provided extensive evidence and opinion, all of which has been summarily dismissed by Ofcom, which has watered down codes in spite of evidence to the contrary and failed to act on provisions agreed in this House. I really hope to be pleased and proud that the Government have chosen to have an edtech code, so I would like some reassurance on these points.
Finally, I just want to say to the House that I was, by chance, on a call with children from all across the world on the weekend, and their primary concern was that technology, including AI, was shaping their world for the worse. Children are asking that school be a place of security, safety and freedom, without the extractive or pushy qualities that characterise tech in the rest of their lives. I hope the Minister is willing to commit to that when he responds. I beg to move.
Amendment 44 withdrawn.
Clause 94: Analysis of performance
Amendment 44A
Moved by
44A: Clause 94, page 119, line 1, at end insert—
“(1) In the 2018 Act, in section 139, after subsection (2) insert—“(2A) The report must include an assessment of the Commissioner’s performance of the duties assigned to it by regulations under section (Enforcement) of the Data (Use and Access) Act 2025.””
These amendments set out how a copyright regime could work. Amendment 61 would ensure that all operators of web crawlers must comply with UK law if they are marketed in the UK. Amendments 62 and 63 would require operators to be transparent about their identity and purpose, and allow creatives to understand if their content had been stolen. Amendment 64 would give enforcement powers to the ICO and allow for a private right of action by copyright holders. Amendment 44A would require the ICO to report on its enforcement record. Finally, Amendment 65 would require the Secretary of State to review technical solutions that might support a strong copyright regime. These are practical, sensible amendments that could support a valuable industry while looking forward to new technical efficiencies as they emerge.
To all the Members of the other place who talked warmly last night about the creativity of their own communities, of individual artists and small companies, or of centres of excellence, I ask the question: how will they survive if their livelihood is dependent on chasing after AI bots that have scraped their opted-out work? Can they survive if the only way to own their own work is to opt out of the primary arena of distribution, sales and archiving? What do they think about galleries and museums: should they also opt out to protect their artists? If so, what does that do to tourism, of which, as the Minister was so proud to announce last night, the arts are the primary driver?
Before I sit down, I will quickly mention DeepSeek, a Chinese bot that is perhaps as good as any from the US—we will see—but which will certainly be a potential beneficiary of the proposed AI scraping exemption. Who cares that it does not recognise Taiwan or know what happened in Tiananmen Square? It was built for $5 million and wiped $1 trillion off the value of the US AI sector. The uncertainty that the Government claim is not an uncertainty about how copyright works; it is uncertainty about who will be the winners and losers in the race for AI. In rejecting the amendments, which could, in a matter of weeks, protect the income of the UK’s second most valuable industrial sector, the Government are pushing us into an uncertain and unsustainable future in which anyone with $5 million is to be entitled to the spoils of the UK creative industry.
I must finish, but I have to thank Paul McCartney, Elton and David, Lord Lloyd-Webber, Jeanette Winterson, Kate Mosse, Sir Simon Rattle, Richard Osman, Kate Bush and all the other 40,000 artists, musicians, writers and supporters who have added their names to this fight. I thank the News Media Association and the overwhelming number of creative arts organisations that have voiced their support. On their behalf, I ask the Government what their plans are to support all of us when we no longer have a living. What is the root of the Government’s soft power when they have to confront the diminishing returns of synthetic material that will be in direct competition to UK creative industries but could not be built without our IP? What is the Government’s answer to the young people who say they cannot have a creative life because there is no prospect of an income stream?
These amendments were urgent today, but the Government’s consultation has given permission to continue large-scale theft of property rights. The spectre of AI does nothing for growth if it gives away what we own so that we can rent from it what it makes. I beg to move.
I speak as a career journalist and TV producer who has seen a systematic theft of media content by tech companies. Six years ago, my noble friend Lady Kidron and I were on the then Communications Committee. We were investigating the use of British journalistic content by social media companies to aggregate news on their platforms without giving users the provenance of the content or paying the media companies for republication of their information.
Now, with the arrival of retrieval-augmented generation, or RAG, the AI technique that can give models access to live news and information, the tech companies can browse the web to extract valuable content from journalistic websites and respond to users’ questions with the most up-to-date information. Once again, the tech companies are, in most cases, not paying for the use of data to train their models, nor giving users any idea of the provenance of that information. I understand that some AI companies have done deals with publishers but that most, when offered the opportunity to do so, have refused to license content. They see it as another cost of business that they should not have to incur.
This is a continuation of years of piracy and theft by the tech companies against British content makers. Some big media companies have taken on the tech companies for breach of copyright, but it is very expensive. In the first nine months of its lawsuit against OpenAI and Microsoft for copyright infringement in training their AI models, the New York Times spent $7.6 million, a sum way beyond the resources of many content creators in the UK. To compound the theft, the tech companies say they use so much data in training their models that it is often not possible to identify the provenance of that data. However, in many cases the source of the data is being deliberately obscured by the AI companies.
In a US case against Meta for pirating content to develop its Llama model, the allegation is that not only did it use the book piracy website Library Genesis— I suspect many noble Lords will be as surprised as I am that such a site exists—but internal Meta emails show that the tech company went to some lengths to
obscure the origin of the data. One Meta employee suggested removing copyright headers and document identifiers, including any lines containing “ISBN”, “copyright” and “all rights reserved”. Meta emails suggested that removing such metadata would “reduce legal complications”.
I know that this Government are desperate to bring the AI revolution to this country and see it as a source of huge economic growth, but if the tech companies are deliberately refusing to license data or, worse still, obscuring the data they have used, the opt-out suggestion in the AI consultation is going to be useless—and worse than that, as many other noble Lords have said.
These amendments are needed to ensure that the AI companies adhere to the copyright law and, in the process, ensure the future of our world-beating creative industries. If the noble Baroness, Lady Kidron, calls a vote, I will be voting for Amendment 44A, and I encourage other noble Lords to join me in the “Contents” Lobby.
I do not underestimate how hard the challenge is to chart this course. As my noble friend just said, it is important that we remain balanced, because we do not want to turn our back on the technology, but it needs to be transparent, and there needs to be a clear market and enforcement. All those things are in these amendments.
If someone owns a host of bricks lying at the end of the street and I use them to build a house, I should pay for those bricks. I honestly believe that, if electricity were available for free without meters, the big tech companies would use that electricity without paying. It is only because we have a means to force them to pay—to be clear about what they are using and to make sure that there is a trading market—that they are not. These amendments do that and, as my noble friend said, they do it in a way that is not prescriptive. I urge the Government to listen to the genuine cross-party support for these amendments.
From what I understand, chatbots—I have asked a few—do not need to have copyright modern literature in their training sets to be able to learn natural languages, or even narrative structure. Image and music generators do not need to be trained on copyright artistic works to be able to create images, designs or sounds for a user.
If a user wants to use generative AI as part of their creative process, as we have heard about on several occasions this evening, they can give a model ring-fenced access to their own works or to specific works that they have permission to use, just as a scientific researcher can do with the data that they have access to. The model does not need to have been trained on copyright material beforehand.
We have had a couple of mentions of DeepSeek already this evening. Another lesson that has become very clear with the launch of that model is that what drives the future is innovation and creativity. From what I know about DeepSeek, it is the creativity of its inventors that has allowed it to set new benchmarks for efficiency. It is creativity and innovation that is put at risk by failing to protect creators through copyright, IP and data protection laws.
There are many potential uses for generative AI—we are in a period of early exploration—but I ask the Minister to think long and hard before giving up the protection of creatives and innovators with respect to their ideas and their works in the service of the claims of need from generative AI manufacturers. We need to interrogate why they really need copyrighted works in their training sets, and what service they are really going to deliver as a result of having them. If it is a matter of technical difficulties around not being able to differentiate copyrighted works, that is a problem to
be solved, not a reason for abandoning copyright protection. And the people who solve problems are the creatives—the people whose livelihoods are under threat.
The technical nature of these issues and their potential impact on rights holders and AI developers—we have heard this expressed very clearly—means that we need to consider them carefully. We also need to ensure that our approaches are compatible with, and indeed help shape, international solutions on transparency, access controls and metadata. That is why we are asking about all these elements in the consultation.
The question of how to achieve enforcement and compliance with any new approach is also of great importance. This is the subject of another amendment put forward by the noble Baroness, Lady Kidron.
The Government do not believe in making changes to the status quo unless they are confident that any new approach will work in practice. Appropriate measures for compliance and enforcement are a crucial part of that. We are open-minded about how exactly they should be achieved and we welcome responses as part of the consultation process.
The noble Earl, Lord Devon, raised important points about copyright. As we grapple with these difficult questions in the UK, we cannot and should not ignore the position in other countries. Japan and Singapore view web crawling and data mining as “non-consumptive” of a copyrighted work, and so impose few, if any, restrictions on them. The USA considers “fair use” on a case-by-case basis, and multiple lawsuits are being considered with no clear pattern emerging. The EU also has an opt-out system, but one that is now having to evolve to incorporate exactly the types of transparency and ease of use that we have committed to as part of our consultation.
As a consequence of what is happening already, models are being trained on UK-owned content in other countries and this is likely to continue. We could legislate to make the UK’s approach to copyright the strictest in the world, but it would not change this reality worldwide. What it would do is make it harder to develop AI technology in the UK and models developed in other countries would no longer be available here. At the same time, many rights holders would still be unable to control use of their work or seek payment for it. We would have no ability to influence other approaches, such as the EU’s, or to shape international rules and standards.
We acknowledge that the EU’s approach does not currently meet our objectives and that further work is needed on transparency, standards and other areas. But new technologies and standards are in rapid development, and the international rules are already being shaped by others. We need to be at the table to make sure that they work for our creatives and AI industries in the UK.
Rest assured, the Government understand the very strong and legitimate concerns creators and rights holders have about how their content is used by the AI sector and how powerless they often feel. We want to create stronger, practical ways for them to control the use of content and greater transparency over how it is used, as well as creating the right conditions for AI innovation. There need to be workable solutions—workable for the creators as well as the AI companies.
I accept the important point raised by the noble Baroness, Lady Freeman, on the need for high-quality data in order to get the best outcomes. We are committed to addressing these challenges, as demonstrated by the detailed consultation we published before Christmas. Legislating on transparency, web crawlers, watermarks or other issues without evidence on their impact or the type of technologies, oversight and enforcement needed to make them work would be premature.
Of course there have been assessments of the impact. Indeed, an initial impact assessment was published alongside the consultation, but we absolutely recognise that more evidence is required. That is one of the calls we have made.
A point was made about jobs. Earlier this week, I attended the launch of the Pissarides report on the impact of AI on jobs across all sectors. This provides an extraordinarily potent and important piece of information to take into account as we take this forward. I will say in passing that the figures provided in the assessment of the AI industry came from a number of different sources, and we have used many different approaches to understanding the impact of the AI sector. The specific one asked about by the noble Lord, Lord Holmes, was from Public First, and the methodology is in the public domain.
To further show our commitment on this issue, I will be pleased to update the House on progress and to set out next steps very soon after the consultation has closed on 25 February. Noble Lords may be aware that I called for early clarity on this matter in my 2023 review for the previous Government. I hope that we can move swiftly to reach much-needed certainty and fairness. I hope that noble Lords will allow us the chance to properly conclude our consultation process and bring forward comprehensive proposals as a result. As such, I ask the noble Baroness to withdraw her amendment.
Clause 101: Annual report on regulatory action
Amendment 45 not moved.
Amendment 46
Moved by
46: After Clause 104, insert the following new Clause—
“Review of court jurisdiction
Within one year of the day on which this Act is passed the Secretary of State must review the impact that transferring the jurisdiction of courts that relate to all data protection provisions to tribunals would have on—
(a) the complexity of the appeals system, and
(b) legal barriers to representation and redress.”
Amendment 46 withdrawn.
Amendment 47
Moved by
47: After Clause 107, insert the following new Clause—
“Data use: defences to charges under the Computer Misuse Act 1990
(1) The Computer Misuse Act 1990 is amended as follows.
(2) In section 1, after subsection (3) insert—
“(4) It is a defence to a charge under subsection (1) to prove that—
(a) the person’s actions were necessary for the detection or prevention of crime, or
(b) the person’s actions were justified as being in the public interest.”
(3) In section 3, after subsection (6) insert—
“(7) It is a defence to a charge under subsection (1) in relation to an act carried out for the intention in subsection (2)(b) or (c) to prove that—
(a) the person’s actions were necessary for the detection or prevention of crime, or
(b) the person’s actions were justified as being in the public interest.””
Member’s explanatory statement
This amendment updates the definition of “unauthorised access” in the Computer Misuse Act 1990 to provide clearer legal protections for legitimate cybersecurity activities.
Amendment 47 withdrawn.
Amendment 48 not moved.
Clause 109: Interpretation of the PEC Regulations
Amendment 48A
Moved by
48A: Clause 109, page 139, line 19, at end insert—
““service message” means a communication necessary for an administrative or servicing purpose, including the performance of a contract to which the recipient is party, or in order to take steps at the request of the recipient prior to entering into a contract, which does not contain any direct marketing content;
“regulatory communication” means a communication necessary for compliance with a legal obligation or legislative measure, including those provided by a statutory regulator, which aims to improve customer outcomes and avoids active promotion or encouragement where possible, following careful assessment of the risk of harms caused, or likely to be caused, to the recipient;”
Amendment 48A withdrawn.
Schedule 12: Storing information in the terminal equipment of a subscriber or user
Amendment 48B not moved.
Amendment 49
Moved by
49: After Clause 112, insert the following new Clause—
“Use of electronic mail for direct marketing by charities
(1) Regulation 22 of the PEC Regulations (use of electronic mail for direct marketing purposes) is amended as follows.
(2) In paragraph (2), after “paragraph (3)” insert “or (3A)”.
(3) After paragraph (3) insert—
“(3A) A charity may send or instigate the sending of electronic mail for the purposes of direct marketing where—
(a) the sole purpose of the direct marketing is to further one or more of the charity’s charitable purposes;
(b) the charity obtained the contact details of the recipient of the electronic mail in the course of the recipient—
(i) expressing an interest in one or more of the purposes that were the charity’s charitable purposes at that time; or
(ii) offering or providing support to further one or more of those purposes; and
(c) the recipient has been given a simple means of refusing (free of charge except for the costs of the transmission of the refusal) the use of their contact details for the purposes of direct marketing by the charity, at the time that the details were initially collected, and, where the recipient did not initially refuse the use of the details, at the time of each subsequent communication.”
(4) After paragraph (4) insert—
“(5) In this regulation, “charity” means—
(a) a charity as defined in section 1(1) of the Charities Act 2011,
(b) a charity as defined in section 1(1) of the Charities Act (Northern Ireland) 2008 (c. 12 (N.I.)), including an institution treated as such a charity for the purposes of that Act by virtue of the Charities Act 2008 (Transitional Provision) Order (Northern Ireland) 2013 (S.R. (N.I.) 2013 No. 211), and
(c) a body entered in the Scottish Charity Register, other than a body which no longer meets the charity test in section 7 of the Charities and Trustee Investment (Scotland) Act 2005 (asp 10),
and, in relation to such a charity, institution or body, “charitable purpose” has the meaning given in the relevant Act.”
Member’s explanatory statement
Regulation 22 of the PEC Regulations prohibits the transmission, by means of electronic mail, of unsolicited communications to individual subscribers. This amendment creates an exception from the prohibition for direct marketing carried out by a charity for charitable purposes.
Amendment 49 agreed.
Amendments 50 and 50A not moved.
Clause 123: Information for research about online safety matters
Amendment 51
Moved by
51: Clause 123, page 153, line 14, leave out “may by regulations” and insert “must, as soon as reasonably practicable and no later than 12 months after the day on which this Act is passed, make and lay regulations to”
Member’s explanatory statement
This amendment removes the Secretary of State’s discretion on whether to lay regulations under Clause 123 and sets a time limit for laying them before Parliament.
Ministers here and in another place need to remember that, while they are in Parliament, they are making laws to control their successors as well as their departments. They are not really sitting in here as part of their departments. They may be heads of those departments, but the laws they pass are to control their departments and themselves, and we need to remember that from time to time. What happened over the Digital Economy Act was shameful, and the Executive should hang their heads in shame and pass something such as this to show their goodwill in the future.
Amendment 51 withdrawn.
Amendments 52 to 56 not moved.
Consideration on Report adjourned until not before 8.07 pm.