Civil Rights Group Wants to Ban Feds From Using Facial Recognition

"The FTTF said facial recognition software tends to be inaccurate, especially with people of color, women and children, putting these categories at higher risk of harassment, wrongful arrest, or worse."

Well, that's interesting. So "people of color, women, and children" are more prone to misidentification than white adult males here. I deduce that is the conclusion of the geniuses behind the group complaining here. I actually agree with them that leaving our civil liberties to government programs is asking for trouble, but to say that pattern recognition ONLY has the potential to misidentify the demographics they mention renders any credibility they have null and void. Then again, I have to remind myself these "civil rights" groups are the first to support and vote for politicians who approve of these ever-increasing Orwellian, 1984-style technologies.
 

USAFRet

Titan
Moderator
"The FTTF said facial recognition software tends to be inaccurate, especially with people of color, women and children, putting these categories at higher risk of harassment, wrongful arrest, or worse."

Well, that's interesting. So "people of color, women, and children" are more prone to misidentification than white adult males here. I deduce that is the conclusion of the geniuses behind the group complaining here. I actually agree with them that leaving our civil liberties to government programs is asking for trouble, but to say that pattern recognition ONLY has the potential to misidentify the demographics they mention renders any credibility they have null and void. Then again, I have to remind myself these "civil rights" groups are the first to support and vote for politicians who approve of these ever-increasing Orwellian, 1984-style technologies.
There have been proven instances of facial recognition screwing up when presented with non-white faces.

https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html
https://www.govtech.com/products/Bi...cognition-Microsoft-Hopes-to-Change-That.html
https://www.forbes.com/sites/mzhang...ugh-facial-recognition-software/#2a5b2a0f713d
 
Feb 14, 2019
40
12
35
But I believe, from what I've read, that the issue has more to do with the data (i.e., photos/pictures, etc.) used to train the algorithms. They use considerably more data from white people; they just need to use more images of women and dark-skinned people. My question would be: is an Italian (a lot are dark-skinned) going to have the same issue?
Also, most of this is just a first-pass check; eventually a human would need to compare the faces manually.
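The training-data point above can be sketched in a few lines. This is a toy audit (not any vendor's actual pipeline, and all names and numbers are invented) showing how you might check a training set's demographic composition before training, which is exactly where the skew described above would show up:

```python
# Minimal sketch: audit a (hypothetical) training set's demographic
# composition before training. Group names and counts are invented
# purely for illustration.
from collections import Counter

def composition_report(labels, min_share=0.2):
    """Return each group's share of the dataset, plus a list of groups
    whose share falls below min_share (i.e., under-represented)."""
    counts = Counter(labels)
    total = sum(counts.values())
    shares = {g: n / total for g, n in counts.items()}
    flagged = [g for g, s in shares.items() if s < min_share]
    return shares, flagged

# Invented training-set labels, heavily skewed toward one group.
labels = ["group_a"] * 800 + ["group_b"] * 150 + ["group_c"] * 50

shares, flagged = composition_report(labels)
print(shares)   # group_a dominates at 80% of the data
print(flagged)  # under-represented groups to collect more images for
```

A model trained on such a set simply sees far fewer examples of the flagged groups, which is the mechanism (rather than intent) behind the accuracy gap discussed in this thread.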
 

USAFRet

Titan
Moderator
Regardless of why it happens...it apparently does happen.

In development, you need to construct and test for a LOT of edge cases.
If your dev team and testing sample is all white guys, it may work perfectly, because that's all you train it and test it for. You might not even think it would have issues with other demographics.

They're not doing it on purpose. Simply that they don't even think about it. The thought is foreign.

This happens a lot with monoculture teams.
A team of all women might build something they think is great. But when presented with a user that is a male from a completely different culture, he might think it is the dumbest application in the world.
Or whatever.

Too often, things are not tested in a wide enough range. Be it facial recognition, tiny buttons on a camera, instructions for assembling furniture, whatever.
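The testing point above has a concrete counterpart in evaluation: report accuracy per demographic group, not just overall. A sketch (with made-up results; this is not any real system's test harness) of why an aggregate number can hide exactly the kind of gap being discussed:

```python
# Minimal sketch: break classifier accuracy down by group instead of
# reporting a single overall number. All results below are invented.

def accuracy_by_group(records):
    """records: list of (group, correct) pairs; returns {group: accuracy}."""
    totals, hits = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (1 if correct else 0)
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical match results for two labeled groups of 100 faces each.
results = [("group_a", True)] * 95 + [("group_a", False)] * 5 \
        + [("group_b", True)] * 70 + [("group_b", False)] * 30

print(accuracy_by_group(results))
# Overall accuracy is 165/200 = 82.5%, which looks acceptable; the
# per-group breakdown (95% vs 70%) is what exposes the problem.
```

If the test sample is all one demographic, the per-group breakdown never gets computed, which is how a monoculture team can ship something that "works perfectly" in their own testing.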
 
Feb 14, 2019
Regardless of why it happens...it apparently does happen.

In development, you need to construct and test for a LOT of edge cases.
If your dev team and testing sample is all white guys, it may work perfectly, because that's all you train it and test it for. You might not even think it would have issues with other demographics.

They're not doing it on purpose. Simply that they don't even think about it. The thought is foreign.

This happens a lot with monoculture teams.
A team of all women might build something they think is great. But when presented with a user that is a male from a completely different culture, he might think it is the dumbest application in the world.
Or whatever.

Too often, things are not tested in a wide enough range. Be it facial recognition, tiny buttons on a camera, instructions for assembling furniture, whatever.
Agree, but I believe what these groups are implying is that the developers were intentionally being racist/sexist, etc. At least I've seen those implications in other articles. That's more the issue I have. Being somewhat sloppy with the overall big picture is an easily solvable issue.
 

USAFRet

Titan
Moderator
Agree, but I believe what these groups are implying is that the developers were intentionally being racist/sexist, etc. At least I've seen those implications in other articles. That's more the issue I have. Being somewhat sloppy with the overall big picture is an easily solvable issue.
People say or imply all sorts of things to promote their specific agenda, whether it is real or not.
 

Math Geek

Titan
Ambassador
I laughed when I read a recent article where the author was SHOCKED that all the DMV ID pics are handed over to the feds' facial recognition database.

I wonder why people think that, a few years ago, the various DMV agencies changed how a picture could be taken: no smiling, no head coverings, and all that. It was so the database would have an easier time doing its thing with the pics once they were added.

If you want to get away from the surveillance, you'll have to move to the remote Congo or the Amazon rain forest. Might still be a couple of years before there are even cameras deep in those jungles...