Os Keyes on Automatic Gender Recognition (AGR) systems:

"The point is that there’s no such thing as a tool of measurement that merely “measures.” Any measurement system, once it becomes integrated into infrastructures of power, gatekeeping, and control, fundamentally changes the thing being measured. The system becomes both an opportunity (for those who succeed under it) and a source of harm (for those who fail). And these outcomes become naturalized: we begin to treat how the tool sees reality as reality itself."

So we just add more categories and think deeply about the ethics, right?


"So rather than focus on reforming AGR — adding new categories or caveats or consent mechanisms, which are all moves that implicitly accept its deployment — we should push back more generally. We should focus on delegitimizing the technology altogether, ensuring it never gets integrated into society, and that facial recognition as a whole (with its many, many inherent problems) goes the same way. Do not just ask how we resist it — ask the people developing it why we need it. Demand that legislators ban it, organizations stop resourcing it, researchers stop designing it."

@rra I’m pretty sure this technology will misgender cis people as well.

@rra when computers try to measure gender, sometimes they can make a transgirl who's still early in transition smile.

@rra @lertsenem I do not agree with this part: "all technology that measures a thing alters it simply by measuring it". I think the opposite is true most of the time. Is this part trying to say that when a device discretely measures a continuous phenomenon, the loss of information may affect how the phenomenon is perceived, and that if the device is widely used, people may even forget that the phenomenon was continuous in the first place?
