An algorithm that grants freedom, or takes it away

SFGate

PHILADELPHIA — Darnell Gates sat at a long table in a downtown Philadelphia office building. He wore a black T-shirt with “California” in bright yellow letters on the chest. He had never been to the state, but he hoped to visit family there after finishing his probation.

When Gates was released from jail in 2018 — he had served time for running a car into a house in 2013 and later for violently threatening his former domestic partner — he was required to visit a probation office once a week after he had been deemed “high risk.”

He called the visits his “tail” and his “leash.” Eventually, the leash was loosened to every two weeks, then to once a month. Gates wasn’t told why. He complained that conversations with his probation officers were cold and impersonal, and that they rarely took the time to understand his rehabilitation.

He didn’t realize that an algorithm had tagged him high risk until he was told about it during an interview with The New York Times.

“What do you mean?” Gates, 30, asked. “You mean to tell me I’m dealing with all this because of a computer?”

In Philadelphia, an algorithm created by a University of Pennsylvania professor has helped dictate the experience of probationers for at least five years.

The algorithm is one of many making decisions about people’s lives in the United States and Europe. Authorities use so-called predictive algorithms to set police patrols, prison sentences and probation rules. In the Netherlands, an algorithm flagged welfare fraud risks. A British city rates which teenagers are most likely to become criminals.

Nearly every state has turned to this new sort of governance algorithm, according to the Electronic Privacy Information Center, a nonprofit dedicated to digital rights. Algorithm...
