All quotes from the conclusion to this book:

Poverty in America is not invisible. We see it, and then we look away.

The same is true everywhere. Poverty tends to go entirely ignored, even when we pretend we're doing something about it.

Our relationship to poverty in the United States has always been characterized by what sociologist Stanley Cohen calls “cultural denial.” Cultural denial is the process that allows us to know about cruelty, discrimination, and repression, but never openly acknowledge it. It is how we come to know what not to know. Cultural denial is not simply a personal or psychological attribute of individuals; it is a social process organized and supported by schooling, government, religion, media, and other institutions.

This relationship was most certainly acquired from Europe, even as European nations claim to be "social" democracies. It's absolutely necessary to recognise these views as being tied to the values the former colonisers carried with them, particularly the Puritanical bullshit.

It's even clearer as we watch Europeans, who are going through austerity, point at the United States as "worse than them" with regard to surveillance and bigotry. They hide behind their GDPR, and they pretend they're superior while they sweep their bigotries under the rug.

Denial is exhausting and expensive. It is uncomfortable for individuals who must endure the cognitive dissonance required to both see and not-see reality. It contorts our physical geography, as we build infrastructure—suburbs, highways, private schools, and prisons—that allow the professional middle class to actively avoid sharing the lives of poor and working-class people. It weakens our social bonds as a political community; people who cannot meet each others’ eyes will find it very difficult to collectively govern.

It's so easy to see how these denials continually ruin the world around us. It is exhausting.

For example, the Great Railroad Strike of 1877 dramatized not just the suffering of the poor but also their immense political power. Poor and working people’s activism terrified elites and won significant accommodations: a return to a poor-relief system focused on distributing cash and goods and a move away from institutionalization. But almost immediately, scientific charity rose to take its place. The techniques changed—scientific casework focused on investigation and policing rather than containing the poor in quasi-prisons—but the results were the same. Tens of thousands of people were denied access to public resources, families were torn apart, and the lives of the poor were scrutinized, controlled, and imperiled.

So many people ask why things like third places and other community-oriented centers have disappeared. This is why. If we all had ways to interact and build stronger connections, we'd be organising for better circumstances far more often.

When we talk about the technologies that mediate our interactions with public agencies today, we tend to focus on their innovative qualities, the ways they break with convention. Their biggest fans call them “disruptors,” arguing that they shake up old relations of power, producing government that is more transparent, responsive, efficient, even inherently more democratic.

This myopic focus on what’s new leads us to miss the important ways that digital tools are embedded in old systems of power and privilege. While the automated eligibility system in Indiana, the coordinated entry system in Los Angeles, and the predictive risk model in Allegheny County may be cutting-edge, they are also part of a deep-rooted and disturbing history. The poorhouse preceded the Constitution as an American institution by 125 years. It is mere fantasy to think that a statistical model or a ranking algorithm will magically upend culture, policies, and institutions built over centuries.

Like the brick-and-mortar poorhouse, the digital poorhouse diverts the poor from public resources. Like scientific charity, it investigates, classifies, and criminalizes. Like the tools birthed during the backlash against welfare rights, it uses integrated databases to target, track, and punish.

They are disruptors, but not in the ways that have been marketed to us.

No poverty regulation system in history has concentrated so much effort on trying to guess how its targets might behave. This is because we, collectively, care less about the actual suffering of those living in poverty and more about the potential threat they might pose to others.

I think this is... true and not true at the same time? Automated algorithms now enable this guesswork (about things you can't actually predict), but people have always done it in their own assessments. Schools were notorious for this.

But it was still possible to prove people wrong; you can't do that to a machine.

While they are close kin, the differences between the poorhouse of yesterday and the digital poorhouse today are significant. Containment in the physical institution of a county poorhouse had the unintentional result of creating class solidarity across race, gender, and national origin. When we sit at a common table, we might see similarities in our experiences, even if we are forced to eat gruel. Surveillance and digital social sorting drive us apart as smaller and smaller microgroups are targeted for different kinds of aggression and control. When we inhabit an invisible poorhouse, we become more and more isolated, cut off from those around us, even if they share our suffering.

And there it is. The digital poorhouse is part of engineering a society of disconnected people; the old poorhouse's accidental gift of letting us organise is treated as a flaw to be fixed.

The digital poorhouse is hard to understand. The software, algorithms, and models that power it are complex and often secret. Sometimes they are protected business processes, as in the case of the IBM and ACS software that denied needy Hoosiers access to cash benefits, food, and health care. Sometimes operational details of a high-tech tool are kept secret so its targets can’t game the algorithm. In Los Angeles, for example, a “Do's and Don’ts” document for workers in homeless services suggested: “Don’t give a client a copy of the VI-SPDAT. Don’t mention that people will receive a score. [W]e do not want to alert clients [and] render the tool useless.” Sometimes the results of a model are kept secret to protect its targets. Marc Cherna and Erin Dalton don’t want the AFST risk score to become a metric shared with judges or investigating caseworkers, subtly influencing their decision-making.

And too many of us are willing to follow rules that we know hurt people.

Similarly, once you break caseworkers’ duties into discrete and interchangeable tasks, install a ranking algorithm and a Homeless Management Information System, or integrate all your public service information in a data warehouse, it is nearly impossible to reverse course. New hires encourage new sets of skills, attitudes, and competencies. Multimillion-dollar contracts give corporations interests to protect. A score that promises to predict the abuse of children quickly becomes impossible to ignore. Now that the AFST is launched, fear of the consequences of not using it will cement its central and permanent place in the system.

And there it is. It's easier to reverse course with humans.

We all live in the digital poorhouse. We have all always lived in the world we built for the poor. We create a society that has no use for the disabled or the elderly, and then are cast aside when we are hurt or grow old. We measure human worth based only on the ability to earn a wage, and suffer in a world that undervalues care and community. We base our economy on exploiting the labor of racial and ethnic minorities, and watch lasting inequities snuff out human potential. We see the world as inevitably riven by bloody competition and are left unable to recognize the many ways we cooperate and lift each other up.

But only the poor lived in the common dorms of the county poorhouse. Only the poor were put under the diagnostic microscope of scientific charity. Today, we all live among the digital traps we have laid for the destitute.

If we cared more about the poor, we would actually be better at protecting everyone.

As my colleague, Mariella Saba of the Stop LAPD Spying Coalition, always reminds me: it’s vital to keep our eyes on the badge. But the culture of policing wears many uniforms.

And the state doesn’t require a cop to kill a person.

Never has.