When this software is being sold to these departments, it's amazing that people in the chain don't seem to be talking enough about the training set used or its performance on certain populations. If you are going to arrest someone or build a case on facial recognition, you would think they would be prepared to defend its accuracy across a broad range of demographics. Embarrassing failures and mistaken arrests hurt their program, not to mention the money the city loses in lawsuits.
The answer to this conundrum might be that neither the departments nor the vendors are particularly interested in avoiding bias. Paying lip service is generally sufficient for both.