Artificial Intelligence and Large Language Models

Response to the call for evidence for the post-legislative scrutiny inquiry of the UK Lobbying Act 2014

 

Introduction

This written submission is offered to the UK House of Commons Public Administration and Constitutional Affairs Committee with regard to its post-legislative scrutiny inquiry of the Transparency of Lobbying, Non-party Campaigning and Trade Union Administration Act 2014 (“the Lobbying Act”).

This evidence is submitted jointly by Westminster Foundation for Democracy and POPVOX Foundation.

Westminster Foundation for Democracy (WFD) is the UK public body dedicated to strengthening democracy around the world, established by the UK government in 1992. WFD guides parliaments, political parties, civil society organisations, community leaders and electoral management bodies in building democratic systems and practices in developing countries. Its aim is to contribute to the establishment of stable, inclusive, and pluralistic democracies that respect human rights and uphold democratic values. For over 30 years, WFD has been instrumental in fostering democratic change, ensuring that the voices of citizens are heard, and promoting a culture of human rights and gender equality in numerous countries.

POPVOX Foundation is an American nonprofit organisation with a mission “to inform and empower people and make government work better for everyone.” This includes reimagining the concept of civic infrastructure and providing new ways for government to share information and engage the public, with an emphasis on diverse participation and rebuilding public trust. The organisation recognises the critical importance of an informed citizenry and endeavours to provide platforms and resources that enable voices from all segments of society to be heard. Through its initiatives, the foundation aims to make the legislative process more accessible, understandable, and interactive for the public, while also aiding legislators in receiving clear and organised feedback from their constituents.

Both organisations are participants in a newly established multi-disciplinary working group on “Artificial Intelligence and Parliaments” and collaborate with colleagues around the world to share information and best practices on how representative bodies can prepare for challenges and opportunities that may arise with the widespread adoption of automated tools such as large language models (LLMs).

Related to this work, we are honoured to share insights on how the rise of Artificial Intelligence (AI) technologies, especially LLMs, may affect lobbying activities and transparency, particularly in relation to a potential reassessment of the UK Lobbying Act 2014.

Rapid Rise of AI and Calls for Regulation

The rapid increase in publicly available automated technologies over the past year is provoking considerable uncertainty and raising concerns that these tools may contribute to the spread of disinformation in a manner similar to the disruptions fuelled by unregulated social media.

As governments begin to contemplate potential guard rails for the development of these technologies, regulators will be challenged to balance risks and opportunities. Several relevant bills are under discussion in national parliaments, and in some cases national legislatures have passed laws for specific AI applications such as autonomous systems, drones and facial recognition. In April 2021, the European Commission brought forward a proposal for the Artificial Intelligence Act, which has recently been approved by the European Parliament. In 2020, the Parliamentary Assembly of the Council of Europe adopted a set of practical proposals, in the form of resolutions and recommendations, to counterbalance the risks posed by the application of AI to democratic rights and freedoms.

The UK government has put forward its own proposals on the future regulation of AI. The UK House of Lords Select Committee on Artificial Intelligence adopted a comprehensive report whose recommendations included the development of new approaches to auditing the datasets used in AI. The Science, Innovation and Technology Select Committee of the UK House of Commons is currently examining the effectiveness of AI governance and the Government’s proposals.

Beyond general regulation of the technologies, governments will face questions about how these new tools should be incorporated into their governing systems, including the legislative process. Given the capabilities of these and other emerging technologies, a reconsideration of the UK Lobbying Act 2014 to address the use of automated technologies directly may be in order.

One can distinguish three categories of AI tools and mechanisms. First, there are recognition-focused AI tools, such as text recognition, sound recognition, face recognition, and translation services. Second, there are AI-enabled recommendations, forecasts, situational awareness, and automated decision-making. Third, there is generative AI or “GenAI,” such as text generation (ChatGPT and other LLMs), sound generation, and image and video generation.

Given the proliferation of GenAI since December 2022, it may be that 2022 is the last year in human history in which we can be certain that new texts were written by humans. While the addition of GenAI tools to the field of lobbying and advocacy may not be problematic in itself, these tools do raise new questions that were not contemplated at the time of the original drafting of the 2014 Lobbying Act.

In a March 2023 post in the MIT Technology Review, scholars Nathan E. Sanders and Bruce Schneier cautioned that LLMs could "pose significant threats to democratic processes" as they enable "high quality political messaging quickly and at scale." Combined with targeting, this "could become a powerful accelerant of an already sprawling and poorly constrained force in modern democratic life: lobbying."

Without question, LLMs will allow rapid and automated production of lobbying materials. As Stanford legal scholar John Nay describes, LLMs can already "systematically assess bills, explain bill impacts, and draft lobbying letters with minimal training data." Their capabilities will only improve.
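To illustrate how low the technical barrier has become, the sketch below shows, in schematic Python, the shape such automation could take. It is a hypothetical illustration only: `complete` stands in for any commercial LLM text-generation API, and the prompt, names, and parameters are our own assumptions rather than any existing tool.

```python
# Hypothetical sketch: "complete" stands in for any commercial LLM
# text-generation API; no real vendor library is referenced here.

def draft_lobbying_letter(complete, mp_name: str, bill: str, position: str) -> str:
    """Generate a unique-looking constituent letter from a one-line prompt."""
    prompt = (
        f"Write a 200-word letter to {mp_name}, in the voice of a concerned "
        f"local constituent, urging them to {position} the {bill}."
    )
    return complete(prompt)

# Looped over thousands of recipients, bills, and phrasings, a campaign
# could produce distinct letters at negligible marginal cost.
```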

The introduction of new automated technologies into the public consultation ecosystem may also produce positive results. As Bridget C. E. Dooling and Mark Febrizio of the Brookings Institution write, “Generative AI might help people structure their information and views into public comments that have a better chance of influencing [government] decisions.” This sort of AI-enabled editing might help individuals better format their input to government and level the playing field for public engagement opportunities.

However, it is also likely that LLMs will be used to overrun government systems. Recent advances in computer vision and machine learning have “dramatically increased” the ability of bots to solve CAPTCHA* tests (Searles et al., 2023). Dooling and Febrizio explain that new automated technologies such as LLMs will make it far easier to create large volumes of comments (both genuine and fake) that could flood government consultation processes:

Combining generative AI with mass comment campaigns could lead to mass comments that look less duplicative in nature, as the same idea could be expressed in many different ways with some support from an AI tool. Agencies currently have access to language processing tools that allow them to group comments based on the similarity of their text. This helps agencies meet their burden under the law to consider and respond to all significant comments. More varied comments will strain the current set of tools and likely lead to increased agency resources dedicated to comment analysis. That could further slow the already cumbersome rulemaking process, as agencies figure out how to cope with large and overwhelming volumes of differentiated and ostensibly substantive comments. For advocates looking to gum up the works, this could be an appealing tactic.
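To see why varied comments strain similarity-based grouping, consider the minimal Python sketch below. It uses only the standard library’s difflib and is our own illustrative assumption about how such a tool might work, not a description of any agency’s actual system; the threshold and example comments are invented.

```python
# Minimal sketch of similarity-based comment grouping (illustrative only).
from difflib import SequenceMatcher

def near_duplicates(a: str, b: str, threshold: float = 0.8) -> bool:
    """Flag two comments as likely copies of one campaign template."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

template = "I urge the department to withdraw this rule immediately."
copy_paste = "I urge the Department to withdraw this rule immediately!"
llm_paraphrase = "Please reconsider and scrap the proposed regulation."

print(near_duplicates(template, copy_paste))      # True: caught as a duplicate
print(near_duplicates(template, llm_paraphrase))  # False: same idea, evades grouping
```

An LLM can produce thousands of such paraphrases, each expressing the same position while scoring low on textual similarity, which is precisely the strain Dooling and Febrizio anticipate.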

In the United States, researchers at Cornell University have already reported generating “32,000 policy advocacy emails to over 7,000 state legislators around the country, on six issues: guns, reproductive rights, education, health care, policing, and taxes.” The stated purpose was to test “the extent to which machine-generated content can pass as constituent sentiment.” By all measures, the lawmakers’ offices assumed the machine-generated content had been legitimately written by humans.

Because the Lobbying Act covers only direct consultant lobbying, LLM-enabled indirect lobbying could occur without transparency or accountability.

* CAPTCHA tests are intended to help web hosts distinguish between human and automated access to websites.

Does the 2014 Lobbying Act Cover AI-generated Communications?

The Act makes provision for establishing and maintaining a register of persons** carrying on the business of consultant lobbying and for requiring those persons to be entered in the register. The Act also specifies that the relevant communications are oral or written communications made personally to a Minister of the Crown or a permanent secretary.

If AI-generated communications form part of oral or written communications made by lobbyists on behalf of clients, one can make the case that the Act’s registration requirements should apply, regardless of how the communications were produced. The key question is whether the communication is intended to be directed to a minister or permanent secretary and relates to government policy, legislation, the award of contracts, grants, licences, or similar.

The underlying principle of the Lobbying Act is to enhance transparency and accountability. If AI-generated communications are used, it could be argued that they should still be subject to transparency requirements, regardless of the method of generation.

However, legislation sometimes lags behind technological advancement, and applying existing legislation to emerging technologies like AI can be challenging. It therefore seems prudent to ensure that laws and regulations are adapted, or interpreted in new ways, to accommodate these developments. There is a case for ensuring that UK national lobbying legislation covers the use of AI more explicitly.

** As a matter of statutory interpretation, ‘persons’ would cover natural and legal persons (e.g. companies, firms). The key question, which any revision to the Lobbying Act could address, is not whether AI is a “person” but whether a “person” is legally accountable for all relevant tools, techniques, and lobbying modalities deployed on their behalf or on behalf of their clients.

Policy Recommendations

We therefore recommend that the Public Administration and Constitutional Affairs Committee urge the government to:

  • Expand the Lobbying Act’s scope to cover AI-assisted lobbying communications more explicitly and require transparency disclosures for them.

  • Consider extending the revised legislation to cover a wider range of government officials, regulators, and potentially legislators, reflecting the fact that AI enables a much wider range of stakeholders to be targeted.

  • Expand the register to include disclosure of lobbying-related income and expenditure (including on AI).

  • Promote broader in-person civic participation and participatory opportunities, ensuring that constituent voices continue to be expressed through direct contact and counterbalancing any rise in LLM-enabled lobbying.

  • Take steps to ensure that the government can handle the increase in correspondence volume, and the associated cyber security risks, likely to follow the widespread adoption of automated technologies.

  • Develop inclusive online security checks for public consultations that prevent manipulation by AI/LLM tools while remaining accessible to civic actors and the public, and consider introducing penalties for impersonating constituents without their consent, so that AI tools cannot be used unknowingly on their behalf.

We appreciate the opportunity to share this evidence and welcome any future opportunity to contribute to discussions about the potential impact and opportunities regarding emerging technologies in relation to government and parliamentary systems.

Sincerely,

Anthony Smith, Chief Executive, Westminster Foundation for Democracy

Marci Harris, Executive Director, POPVOX Foundation

Cited Research and Articles
