AI Attribute Tracking Sidesteps Facial Recognition Limits, Raises Privacy Concerns

As facial recognition technology faces increasing restrictions, law enforcement agencies are turning to AI-powered attribute tracking. These systems monitor individuals based on characteristics like body size, gender, hair color, and clothing, enabling surveillance without directly using biometric data. The ACLU has identified the technology as the first of its kind used at scale in the U.S., raising significant concerns about potential misuse, particularly amid growing calls to monitor protesters and immigrants.

The rapid adoption of these AI tools by over 18,000 independent police departments highlights the urgent need for regulation and transparency. Companies offer integrated sensor and AI solutions, blurring the lines between efficient policing and pervasive surveillance. Community pushback, as seen in Chula Vista, California, where drone footage was disproportionately used in low-income neighborhoods, underscores these privacy concerns.

ACLU’s Jay Stanley advocates for public hearings, community approval, and transparency before such technologies are implemented. He warns against regulatory loopholes that allow departments or technology vendors to circumvent restrictions, urges communities to demand detailed information about how the technology will be deployed, and emphasizes the importance of independent testing of the AI. The Algorithm newsletter originally reported this story, highlighting the need for policymakers and the public to establish clear boundaries for police powers in the age of artificial intelligence.