
But presumably the AI is just identifying faces, not judging who is and is not a criminal


If the training set is biased, the AI becomes biased. GIGO.
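To make the GIGO point concrete, here is a minimal synthetic sketch (all data and numbers are invented for illustration): a classifier trained on a sample that is 95% group A learns a decision boundary tuned to group A, and group B, whose feature distribution sits elsewhere, pays for it in error rate:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n, offset):
        # Class 0 ("no match") and class 1 ("match"); `offset` shifts group B's
        # whole distribution so its optimal boundary differs from group A's.
        X0 = rng.normal(loc=[0.0 + offset, 0.0], scale=1.0, size=(n, 2))
        X1 = rng.normal(loc=[2.0 + offset, 0.0], scale=1.0, size=(n, 2))
        return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

    # Biased training sample: 95% group A, 5% group B.
    Xa, ya = make_group(950, offset=0.0)
    Xb, yb = make_group(50, offset=1.0)
    model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

    # Equal-sized test sets expose the gap the skewed training set created.
    for name, offset in [("A", 0.0), ("B", 1.0)]:
        Xt, yt = make_group(2000, offset)
        pred = model.predict(Xt)
        acc = (pred == yt).mean()
        fpr = pred[yt == 0].mean()  # non-matches wrongly flagged as matches
        print(f"group {name}: accuracy {acc:.1%}, false positive rate {fpr:.1%}")

Nothing in the training loop is "biased" on purpose; the skewed sample is enough to roughly triple group B's false positive rate here.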


When this software is sold to these departments, it's striking that nobody in the chain seems to ask hard questions about the training set or about performance on particular populations. If you are going to arrest someone or build a case on facial recognition, you would think the department would be prepared to defend its accuracy across a broad range of demographics. Embarrassing failures and mistaken arrests hurt the program, not to mention the money the city loses in lawsuits.
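For what it's worth, the disaggregated evaluation a buyer could demand is not hard to compute, assuming the vendor can hand over verification scores labeled with ground truth and demographics. A sketch (the column names and score distributions below are hypothetical):

    import numpy as np
    import pandas as pd

    def error_rates_by_group(df, threshold):
        # Per-demographic error rates for a verification system. Assumed
        # columns: 'score' (similarity), 'same_person' (bool), 'group'
        # (demographic label). A comparison counts as a predicted match
        # when score >= threshold.
        rows = []
        for group, g in df.groupby("group"):
            impostors = g[~g["same_person"]]
            genuines = g[g["same_person"]]
            rows.append({
                "group": group,
                "n": len(g),
                "FMR": (impostors["score"] >= threshold).mean(),  # false match rate
                "FNMR": (genuines["score"] < threshold).mean(),   # false non-match rate
            })
        return pd.DataFrame(rows)

    # Synthetic illustration: impostor scores for group B sit higher,
    # i.e. distinct group-B faces look more alike to the model.
    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "group": ["A"] * 4000 + ["B"] * 4000,
        "same_person": ([True] * 2000 + [False] * 2000) * 2,
        "score": np.concatenate([
            rng.normal(0.8, 0.1, 2000), rng.normal(0.3, 0.1, 2000),  # group A
            rng.normal(0.8, 0.1, 2000), rng.normal(0.5, 0.1, 2000),  # group B
        ]),
    })
    print(error_rates_by_group(df, threshold=0.6))

At the same threshold, the simulated group B false match rate comes out orders of magnitude higher than group A's, which is exactly the kind of number a department should see before buying, since false matches are what turn into mistaken arrests.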


The answer to this conundrum might be that neither the departments nor the vendors are particularly interested in avoiding bias. Paying lip service is generally sufficient.



