
Rise in AI ‘poverty porn’ hurting charity sector, research warns

06 Mar 2026 News

An AI-generated image from UEA's report


Artificial intelligence (AI) is increasingly being used to fabricate images of poverty in charity campaigns, which researchers argue reinforces harmful tropes and undermines the sector's credibility.

A study from the University of East Anglia (UEA) revealed a rise in AI images, with poverty as the dominant theme, depicted in photorealistic form to enhance campaigns.

In its Artificial Authenticity report, UEA found that AI was being used to recreate "harmful tropes associated with poverty, race and vulnerability" rather than contest them.

It analysed 171 AI-generated images from 17 voluntary organisations ranging from large INGOs to small grassroots charities. 

Poverty was the most common theme, flagged in 51 images. "Poverty emerges as the overwhelmingly dominant theme, accounting for almost one-third of all documented images," the report stated.

This comes after a Guardian investigation in October 2025, which revealed aid agencies and charities were increasingly using AI images of extreme poverty, children and sexual violence survivors in their campaigns. 

UEA said the charity sector would suffer from the loss of public trust stemming from unchecked AI image use.

“The ethical implications extend beyond individual organisational practice to the sector’s collective credibility,” the report stated.

“As [AI] adoption accelerates, this lack of standardisation poses risks not only to individual organisational credibility but to donor trust in the sector as a whole.”

The study also highlighted that AI was used to create 35 environmental images and 32 relating to human rights, both falling within the theme of vulnerability.

“The data shows that AI imagery is overwhelmingly deployed in contexts traditionally associated with human vulnerability and humanitarian need.”

As a result, discourse shifted away from charities' humanitarian causes and towards debates about technology and trust in AI.

“The introduction of AI fundamentally alters audience attention,” the report argued. 

“Discussion frequently shifts away from the humanitarian mission toward the medium, ie technology, its ethics, and execution.”

Some 141 comments in the research, sourced from public social media threads where AI charity images were shared, focused on AI ethics and authenticity concerns.

The next most common category was technical execution and visual quality, raised in 122 comments.

Only 80 comments, less than 20%, engaged with the humanitarian issue itself, research found.

Dangers to the charity sector

UEA argued that the increased use of AI to depict human vulnerability in charity campaigns could hurt the sector as a whole.

On the effect of a rise in such AI use, the report stated: “This is significant, given the sector’s decades-long struggle with critiques of poverty porn and exploitative representation.”

The result of charities using AI to show poverty, rather than images of systemic solutions, suggested a “persistent belief” that poverty remained essential for audience engagement, the report warned.

Lack of transparency when using these AI images was another concern outlined.

The report found: “More than one in 10 AI-generated images circulating in charity communications were presented without acknowledgement of their synthetic nature.”

UEA stated that this non-disclosure represented a “direct breach” of transparency.

It added: “The lack of transparency about generative AI usage becomes particularly concerning when layered against the thematic findings.”

The report argued that AI images of poverty could prompt audiences to donate under false pretences: giving money based on fabricated evidence they believe to be documentary.

A similar report, which includes guidance for charities using AI images, was produced by filmmaking agency The Saltways and published last month.

It found: “Trust is the foundation of charity fundraising, and once eroded it’s extremely difficult to rebuild.

“One charity’s misuse becomes the sector’s problem, as media coverage rarely distinguishes between individual organisations.”

The Saltways report added that most charities it spoke to were concerned about supporter reactions to AI use.
