
Pulitzer Center Update July 4, 2025

AI Accountability Fellowship Open for Applications


Applications due August 11, 2025


Inside Amsterdam’s ‘Responsible’ Algorithm

Is it actually possible to create an algorithm that is fair, transparent, and effective? We have read and supported many stories showing how poor choices and processes lead to discriminatory and harmful algorithms, so I was intrigued by our most recent piece, co-authored by AI grantee Eileen Guo for MIT Technology Review.

The project, reported in partnership with Lighthouse Reports and the Dutch newspaper Trouw, analyzed multiple versions of an algorithm the city of Amsterdam developed to screen welfare applicants for fraud. The team behind it had followed widely recommended practices for building ethical AI systems: consulting experts, running bias tests, and implementing safeguards.

Despite multiple iterations, the system still showed biases when the city ran its pilot. The project was scrapped, but not before the city had spent an estimated €500,000 on it. In the end, one big question stood out: How is fairness defined, and who gets to define it?

Over the last few weeks, the Pulitzer Center’s AI Accountability Network has supported stories about AI all over the world, from the Netherlands to Paraguay, Madagascar, and Brazil. These stories have made transparent the supply chains behind AI models and their immediate effects on communities and workers.

The impact of these stories continues to emerge, from Google removing an app featuring AI-generated child sexual abuse material from its store to our investigation being added to course reading lists.

Applications are now open for the fourth cohort of the AI Accountability Fellowship. Over the last three years, we’ve supported 27 Fellows to report in 22 countries, and we’re looking for more impactful investigations into how companies, governments, and people develop and deploy AI.

We are holding an Ask Me Anything webinar on Friday, July 18, for journalists looking to apply this year. Sign up if you’re interested in hearing from me and 2024 AI Fellow Sofia Schurig about how to apply and about her experience in the Fellowship.

Best,

Joanna S. Kao

Impact

“The most important thing I learned in this project is how photojournalism stories extremely impact our country by bringing awareness,” Eiliyah, a student at Capitol Hill Montessori in Washington, D.C., reported after a visit to see her photograph in the ninth annual Everyday DC student photography exhibition. “After this project, I am inspired to continue working as a photojournalist and create more stories bringing awareness.”

Eiliyah is one of over 200 students from 10 D.C. Public Schools (DCPS) who participated in the Everyday DC photojournalism unit during the 2024-2025 school year. Designed by the Pulitzer Center and DCPS Arts in 2016, the unit is taught annually by DCPS visual arts teachers with support from Center staff and journalists.

Over 671 people visited the exhibition before it closed on June 5, 2025, including 75 students who took part in workshops hosted by Pulitzer Center grantee Ashonti Ford and Center staff.

Read more about the exhibition here.


Photo of the Week

Cindie Haakenson is seen through a window of her home as the family farm is reflected before her on May 21, 2024, in Willow City, North Dakota. Despite a preference to remain at home, Haakenson's husband, Sherwood Haakenson, needed to move to a 24-hour long-term care center. From the story “The Cost of Senior Care: Why Aging Farmers Fear the Nursing Home.” Image by Tim Evans/NPR. United States.

“I’m grateful to have met Sherwood and Cindie Haakenson shortly before Sherwood passed. Their health journey speaks to the financial uncertainties and anxieties faced by many older Americans in rural communities, where medical access is steadily eroding and bills aren’t always covered by insurance. To me, the Haakensons' story underscores a profoundly tragic aspect of U.S. healthcare: exorbitant costs can leave people feeling a sense of relief when a loved one dies.”

— Tim Evans


This message first appeared in the July 4, 2025, edition of the Pulitzer Center's weekly newsletter. Subscribe today.

Click here to read the full newsletter.

RELATED INITIATIVES

AI Accountability Network

RELATED TOPICS

AI Accountability