Recognising community power as a pillar of AI governance

There is no responsible AI without community power.

Good AI governance involves more than algorithmic transparency and shipping responsible products; it also requires monitoring, understanding and mitigating the impacts of technologies as they continue to play out in the world.

Governments and industry have an important part to play in AI governance, but they cannot do it alone. Sensing, understanding and mitigating technological impacts is a fundamentally social action and responsibility, and the role of communities and civil society in co-ordinating and driving this must be formally recognised and funded.

Fig. 1: Diagram showing the role of Community Power in AI governance

Artificial intelligence is not a singular thing with a singular impact; it is a diverse set of technologies that power all kinds of products and services. These products, in turn, each have a wide range of social, economic, environmental and political impacts that interact with one another and the world around them in diverse ways, creating both predictable and unanticipated outcomes. This means it is essential that AI governance does not end at deployment; it must be ongoing and responsive. Achieving this is a complex, fundamentally social process that relies on a vibrant ecosystem of actors and interventions. 

In October 2023, Promising Trouble convened the AI and Society Forum, bringing together 150 civil society changemakers to shape better outcomes for AI. The day's agenda was a mirror image of the subsequent discussions at Bletchley Park, shining a light on the very tangible social, political and economic impacts of AI, including discussions of automated decision-making in the immigration, justice and welfare systems; the role of LLMs in normalising and scaling racism; algorithmic oppression in education; the impact on work and workers; and the use of AI in warfare. The richness of these conversations demonstrated that AI governance is not simply a technological or technocratic undertaking, but a social process that must be designed to curb real-world harms and produce more beneficial outcomes. After all, automation is an intensifier: as well as bundling efficiencies and convenience, it amplifies existing power imbalances, often in ways that are difficult to interrogate.

Dr Abeba Birhane addressing the AI and Society Forum (credit: Alexandra Vanotti)

Community power is a vital part of this system, coming into being through a combination of informal social organisation and formal interventions, often flexing in response to the matter at hand. Current governance activities performed by communities include sensing and understanding emerging harms and opportunities; policy influencing at local and international levels; and organising collective action and litigation. This work is largely self-supported, relying on diverse income streams including membership fees, philanthropic funding for discrete projects, and social investment – none of which are anywhere near as abundant as the VC money that flows into AI development or as reliable as taxpayer-funded governmental activity.

In spite of its financial precarity, community power is not a “nice to have” in a robust democratic system; it is an essential part of responsible governance – one that thrives on multiplicity while also requiring independence from corporate influence. Closing this funding gap is a prerequisite for functional AI governance.

The recent announcement by VP Harris of a $200m philanthropic fund that will “protect democracy and the rights and freedoms of all people” and “leverage AI to innovate in the public interest” is a step towards this, but the risk of such a structure is that it will over-index on North American priorities. A representative and diverse community ecosystem is essential because the societal impacts of Silicon Valley values play out differently in different territories, where different cultural and legal norms prevail. Exporting a US-centric view of what good looks like is not sufficient mitigation for the new colonialism of Big Tech, and it will not create platforms for the communities most under-represented in global power and governance.

Moreover, domestic uses of AI by governments and government bodies in the delivery of public services require local scrutiny and accountability. UK government technology policy has an increasingly precarious relationship with human rights, which will not be sufficiently addressed by a focus on multi-stakeholder governance and international agreements; the UK needs a resilient, responsive set of communities and community organisations that can move at the speed of government policymaking to shine a light on abuses, advocate for alternatives, and support people in search of redress. This is currently low on the list of most funders’ priorities as the cost of living crisis continues, but the continued roll-out of the digital hostile environment in the UK will exacerbate inequalities for minoritised and vulnerable communities in ways that may take years to unpick.

Lastly, good governance must not solely focus on curbing the excesses of big tech companies or governments; it should also help articulate what good looks like. In an under-resourced ecosystem, the focus for community power tends to be on course correction, but innovation does not only happen in academic research labs and start-ups – they are just more likely to be funded than community organisations. Earlier this year we published “A thousand Cassandras”, a report for Open Society Foundations highlighting that digital civil society organisations are often under-estimated by industry and policymakers alike; meanwhile, the vibrant Community Tech ecosystem we support with Power to Change shows how much innovation and creativity is being applied to communities’ biggest challenges, from the green energy transition to affordable social care, access to good food and social inclusion.

Community power must be at the heart of AI governance. It’s imperative that social investors and philanthropists recognise this, and pull together to offer significant funding to power more just and more equitable outcomes for AI; after all, communities cannot counter big tech or government policymaking machines if they are constantly anxious about their funding and resources. The cost of not investing is too significant.

Let’s invest in community power and make AI work for eight billion people, not eight billionaires.

Background research for this blog post was conducted by the Promising Trouble research team.
