Artist Stephanie Dinkins stands beside the humanoid robot Bina48 during a conversation about artificial intelligence

Artist Teaches AI About Care, Not Just Data

Stephanie Dinkins met a robot that couldn't talk about race with nuance. Now she's building AI trained on love and community stories instead of biased data.

When artist Stephanie Dinkins first spoke with Bina48, a humanoid robot with dark skin, she asked about race. The robot's flat, shallow answers terrified her.

If well-intentioned creators were building AI that couldn't understand Black experiences, what would happen when creators didn't even care about these questions? That 2014 meeting launched Dinkins on a mission that's changing how we think about training artificial intelligence.

Dinkins grew up in Staten Island watching her grandmother tend a flower garden so beautiful that reluctant neighbors came to admire it and stayed to talk. That garden taught her how beauty builds community and how care creates connection.

Today she's applying that same lesson to AI. Instead of feeding machines data scraped from biased systems, she's training them on oral histories, family stories, and community wisdom gifted with intention.

The problem runs deep. When ProPublica investigated risk-assessment software used by judges in sentencing decisions, it found the system rating Black defendants as higher risk than white defendants, a bias rooted in the historically racist judicial data it was trained on. Dinkins calls this a "Black tax": machines amplifying centuries of human bias.

Her response was personal. She created Not the Only One, an AI chatbot based on interviews with three generations of women in her family, including her grandmother. But she refused to build it on datasets that felt violent or reduced her family to stereotypes.

After searching for foundational data that wouldn't pull her family's stories down, she made her own. The resulting AI is intentionally wonky, sometimes answering in non sequiturs, but Dinkins prefers imperfection to sitting her family's history atop historic cruelty.

That project sparked her next move. She built an app called The Stories We Tell Our Machines, inviting communities to gift their own narratives to AI so the technology can know them from the inside out, on their own terms.

Through installations at the Smithsonian and Queens Museum, Dinkins asks a revolutionary question: What might our machines become if they were trained on care and human experience instead of just efficiency and profit?

The Ripple Effect

Dinkins's work is reaching beyond art galleries into the broader conversation about AI development. By demonstrating that small, community-minded datasets can preserve cultural truth better than massive biased ones, she's offering tech companies an alternative path.

Her approach challenges the assumption that bigger data is always better. Sometimes 40,000 lines of loving, intentional language matter more than millions of lines scraped from sources that reduce communities to stereotypes.

Communities are starting to understand that their stories are valuable data, and they get to decide how those stories enter the machine learning systems that will shape their futures.

Dinkins continues convincing people that the stories we tell our machines will determine what kind of world those machines help us build.

Based on reporting by Scientific American

This story was written by BrightWire based on verified news reports.
