Regulator tells charities to consider having an internal artificial intelligence policy

02 Apr 2024 News


Charities may need to consider having an internal artificial intelligence (AI) policy, the Charity Commission has suggested in a newly published blog.

It notes that the 2023 Charity Digital Skills report found that 35% of charities were already using AI for certain tasks and that a further 26% planned to do so in the future.

The blog on charities and artificial intelligence reads: “Whether you are a trustee already using AI, are planning to do so or don’t yet know how it might be useful, it is important that you are aware of the opportunities and risks involved.

“The key consideration is that AI should be used responsibly in a way that furthers your charity’s purposes.”

Currently, the regulator does not anticipate producing specific new guidance on the use of AI, preferring “to encourage trustees to apply our existing guidance to new technologies as they emerge”. 

It will update guidance where appropriate to reference examples of new technology, the regulator added.

‘Think about the advantages and risks’

AI can help charities free up time spent on resource-intensive tasks, and so make more hours available for high-priority areas, the blog states.

For example, generative AI, which creates written and visual content from human prompts, is among the fastest-growing areas of the technology.

“Some charities are finding the writing tools helpful for fundraising materials, bid writing, speeches or drafting policies, while ‘speech to text’ tools take meeting minutes. There are also emerging opportunities to use AI directly in service delivery,” it says.

The blog states charities should consider how they might use AI, and whether the options available are right for the charity.

“Think about the advantages and risks – and how these would be managed – in the context of your trustee duties and charity objectives,” the blog states.

“That could involve looking at what gaps can be filled or insights generated by an AI tool, what skills are needed to use these tools to your charity’s advantage and if people within the charity’s trustees, staff or volunteers have those skills.

“You can also consider how staff or volunteers may already be using AI.”

‘Proceed with caution’

As the use of AI develops and more applications become available, the Commission recommends that charities consider whether an internal AI policy would be beneficial, so that it is clear how and when AI can be used in governance, by employees in their work, or in delivering services to beneficiaries.

It adds that charities need to “proceed with caution as there are risks involved”, some of which are inherent to the way AI is built, operates and continues to learn.

“AI is a work in progress so won’t always get things right. It is not yet sophisticated enough to give accurate legal advice, for example, and Generative AI models can confidently produce inaccurate, plagiarised, copyright infringing or biased results without any awareness that the results it has offered may be problematic.” 

The blog notes trustees remain responsible for decision making and “it is vital this process is not delegated to AI” alone. 

“We will expect trustees and others in charities to ensure that human oversight is in place to prevent material errors, and also as the human touch is key to the way many charities operate and interact with their beneficiaries,” it adds.

It also warns of external risks and reputational damage arising from the misuse and recirculation of AI, such as fake news or deep fakes.

The blog concludes “this evolving technology seems daunting to many” but there are opportunities for charities to engage with the technology now it is more widely available.

