Campaign launched to address concerns over charities’ use of AI

20 Jan 2026


Charity Digital has expressed concerns about some charities’ use of artificial intelligence (AI) and launched a campaign to promote good practice.

The organisation is concerned that economic motivations may outweigh ethics in the sector and beyond.

Speaking to Civil Society, Charity Digital chief executive Maria Timon Samra said charities working with people in vulnerable situations in particular need to understand the risks they take when using AI tools.

She said: “We’re concerned that charities could be unwittingly making decisions based on information ecosystems featuring, and increasingly built with, AI slop, entrenched bias, and misinformation.

“We’re concerned about disintermediation, with service users relying on unreliable AI outputs rather than direct charity sources.

“And we’re concerned about people compromising personal data by misusing AI.”

In response to these concerns, Charity Digital launched its Conscious AI campaign this month to help charities use the technology ethically and to inform its development for the best public benefit.

The campaign includes informational articles, webinars, podcasts, videos, masterclasses, academies, and events that challenge preconceptions about AI.

It will also involve advocating for regulation that reduces AI’s risks to the public at the point of technological development.

“Our AI Conscious campaign is about securing the responsible use of AI across our sector and beyond, ensuring AI serves the good of our society,” Timon Samra said.

According to last year’s Charity Digital Skills Report, more than three-quarters of charities are now using AI tools.

Last year, the Charity AI Task Force, on which Charity Digital sits, called on the government to engage more with the sector on its plans for the technology.
