Sunday, October 04, 2020

Study indicates neither algorithmic differences nor diverse data sets solve facial recognition bias

When you combine phony journalism with hot-air science research, you get an article and a paper like this!
Unfortunately, there are tons of published articles like this one suggesting some kind of bias in computer vision! In reality, this says more about the mental bias and narrow-mindedness of the journalists and researchers than about the algorithms!

"The researchers focused on three models — VGG, ResNet, and InceptionNet — that were pretrained on 1.2 million images from the open source ImageNet dataset."
The famous ImageNet dataset was never designed for classifying faces by ethnic group! ImageNet is a general-purpose dataset!
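
To make the point concrete, here is a minimal PyTorch/torchvision sketch (my own illustration, not the paper's actual pipeline) of what "pretrained on ImageNet" means in practice: the pretrained weights come with a 1000-way general object-classification head, which has to be discarded and replaced before the network can classify faces by gender or any other attribute. The choice of ResNet-50 and the fine-tuning setup below are assumptions for illustration only.

# Sketch: loading an ImageNet-pretrained ResNet and swapping its head for a
# downstream face-attribute task. The pretraining itself targets ImageNet's
# 1000 general object categories, not face or ethnicity labels.
import torch
import torch.nn as nn
from torchvision import models

# Weights pretrained on ~1.2M ImageNet images covering 1000 general object classes
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
print(backbone.fc)  # Linear(in_features=2048, out_features=1000) -> ImageNet classes

# To use it for, e.g., binary gender classification, the 1000-way ImageNet head
# must be thrown away and a new task-specific head trained on a face dataset.
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

# Typical transfer-learning step: freeze the generic ImageNet features and
# train only the new head on the face data (hypothetical setup).
for name, param in backbone.named_parameters():
    if not name.startswith("fc"):
        param.requires_grad = False

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)

So whatever face-attribute behavior such a model ends up with comes from the face dataset used for fine-tuning, not from ImageNet itself.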

Study indicates neither algorithmic differences nor diverse data sets solve facial recognition bias | VentureBeat

Understanding Fairness of Gender Classification Algorithms Across Gender-Race Groups, presented at the 19th IEEE International Conference on Machine Learning and Applications (ICMLA 2020)
