Why the most radical act in an age of artificial intelligence is insisting on being fully human.

There is a war being waged right now. Not with weapons, but with policy, with rhetoric, with the deliberate erasure of the word “equity” from government websites and the quiet gutting of programs designed to make us see one another. The target isn’t abstract. The target is empathy itself.

And into this moment, with almost absurd timing, comes the most consequential technological shift in human history: artificial intelligence woven into nearly every system we touch. Robots in our warehouses. Algorithms making decisions about our healthcare, our bail, our job applications. A digital nervous system spreading across civilization, built by humans, reflecting human choices — and increasingly, human indifference.

This is either the worst possible moment to abandon empathy as a value. Or the most important.

I believe it is both. And I believe that is precisely the point.

What Is EmpathyTech?

EmpathyTech is an emerging framework — part design philosophy, part ethical imperative, part field of practice — that asks a deceptively simple question: What if we built technology that made us more human, not less?

It sits at the intersection of human-centered design, affective computing, trauma-informed development, and immersive storytelling. It draws from cognitive science research showing that narrative exposure genuinely builds perspective-taking capacity — the measurable neurological ability to inhabit another’s experience. It draws from disability justice and queer theory. It draws from the hard-won wisdom of communities who have always had to make themselves legible to a world that preferred not to see them.

EmpathyTech isn’t sentiment. It isn’t a feelings-first tech bro pivot. It is a rigorous, intentional commitment to building systems that center the interior lives of human beings — especially the ones the default has always excluded.

“Empathy is not a soft skill. It is the architecture beneath every system that has ever actually worked for people.”

Think of it this way: UX design (user experience design) became a mainstream discipline because someone finally asked, what is it like to be the person using this thing? EmpathyTech asks that question at a civilizational scale — and insists the answer include people who have historically been rendered invisible.

The Machine That Doesn’t Feel

Here is what we know about large language models, the engines behind ChatGPT and Claude and the rest: they are extraordinarily good at pattern recognition. They predict. They mirror. They synthesize. What they do not do — cannot do, at least not yet, and perhaps not ever in the way we mean it — is feel.

A language model trained on the internet learns, statistically, what humans say. But the internet is not a representative sample of humanity. It over-indexes on the young, the English-speaking, the connected, the dominant. It carries forward every bias baked into the corpus of human text: the racism, the erasure, the centering of the comfortable. Feed that into a machine, ask the machine to make decisions, and you don’t get neutral intelligence. You get bias at scale, at speed, with the false authority of objectivity.

This is not a hypothetical. It is already happening. Facial recognition systems have been documented to misclassify darker-skinned faces — particularly women’s — at error rates up to 34 percentage points higher than lighter-skinned men’s (MIT Media Lab, 2018). Hiring algorithms trained on historical data perpetuate historical discrimination. Predictive policing models encode systemic racism into “neutral” risk scores. The machine isn’t evil. The machine reflects us — and we have not been honest about who “us” includes.

If we are building minds that reflect human values, then the question of whose humanity we encode is not philosophical. It is the whole ballgame.

The Conservative War on Feeling

Meanwhile, something deeply intentional is happening in the political sphere. The dismantling of DEI programs. The banning of books that center Black, queer, and marginalized voices. The executive orders stripping federal language of terms like “equity,” “inclusion,” and “gender.” The framing of empathy as weakness — or worse, as indoctrination.

This is not a coincidence colliding with the rise of AI. This is a pincer movement. Defund the human programs that cultivate empathy. Simultaneously deploy technology that encodes the pre-empathy status quo at scale. The result: a world that feels modern and efficient while systematically disadvantaging the same people it always has — now with the added insult of algorithmic legitimacy.

The queer community understands this better than most. We have always lived at the intersection of erasure and visibility, of having to make our interior lives legible to a world that wished we didn’t exist. We know what it costs to be left out of the default. We know what it means when the systems that are supposed to serve everyone quietly serve some people much better than others.

That knowledge is not baggage. It is expertise. And it is exactly the expertise that EmpathyTech needs.

Humanism as the Most Radical Act

There is a temptation, when the world moves this fast and the stakes feel this high, to become a techno-pragmatist. To accept that scale wins, that efficiency wins, that the machine is inevitable and we may as well optimize for what we can control. I understand that temptation. I have felt it.

But humanism — the philosophical tradition that centers the value, dignity, and agency of human beings as the foundation of ethics and meaning — is not a relic. It is the load-bearing wall. Remove it and everything else collapses, no matter how elegantly engineered.

The science backs this up. Research in affective neuroscience — the study of the brain’s emotional systems — consistently shows that emotional processing is not separable from rational decision-making. The famous work of neurologist Antonio Damasio demonstrated that patients with damage to the brain’s emotional centers became catastrophically bad decision-makers, not better ones. Empathy is not the opposite of clear thinking. It is a prerequisite for it.

A society that engineers empathy out of its systems does not become more rational. It becomes more dangerous.

What We Build Matters. Who Builds It Matters More.

This is where Queer Reflection lives. This is why the work of immersive storytelling — of placing a person inside an experience they have never had, of making the abstract viscerally real — is not just art. It is infrastructure.

EmpathyTech as a field will grow whether the queer community is in the room or not. The question is whether the people building it will understand what they are building it for. Whether they will have sat with the discomfort of stories that don’t center them. Whether they will have learned, through some form of encounter, that another person’s experience is not a data point but a world.

We can be the ones who teach that. We have been practicing it our entire lives.

Empathy is not the enemy of progress. It is the only version of progress worth having.

Don’t Give In. Don’t Give Up.

I want to say something plainly, to anyone reading this who is tired. Who has watched the news and felt the specific exhaustion of seeing your existence treated as a political football. Who has wondered whether the fight for a more humane world is a fight that can be won when the machinery of dehumanization has become so large and so fast.

The moment we stop insisting on empathy is the moment we hand them the future.

Every storyteller, every designer, every developer who asks “who is missing from this?” is doing EmpathyTech. Every educator who refuses to reduce a student to a test score. Every algorithm auditor who demands that a system be fair before it is deployed. Every queer person who tells their story — not to explain themselves, but to be witnessed — is building the architecture of a more humane world, one encounter at a time.

The robots are coming. The AI is already here. The question is whether it will arrive in a world that has kept its soul or traded it for efficiency.

That is not a question technology can answer. It is a question only we can.

Story is resistance.
Empathy is power.
We have always known this.

Queer Reflection exists because some truths can only be felt before they can be understood.
This is one of them.

QUEER REFLECTION  ·  SAN FRANCISCO, CA