Thursday, December 10, 2020

MIT Technology Review: How our data encodes systematic racism

MIT Technology Review features the following opinion piece:
How our data encodes systematic racism | MIT Technology Review
Technologists must take responsibility for the toxic ideologies that our data sets and algorithms reflect.

What a propaganda and demagoguery hit piece! Why does MIT Technology Review publish such junk? The author is zealous in her victimhood and victim mentality! Perhaps these ImageNet-trained models that the author cites were correct about her? Is this author a follower of black supremacy?

"... Non-white people are not outliers. Globally, we are the norm, and this doesn’t seem to be changing anytime soon. Data sets so specifically built in and for white spaces represent the constructed reality, not the natural one. ...
I’ve often been told, “The data does not lie.” However, that has never been my experience. For me, the data nearly always lies.
 ...
Google Image search results for “healthy skin” show only light-skinned women, and a query on “Black girls” still returns pornography. The CelebA face data set has labels of “big nose” and “big lips” that are disproportionately assigned to darker-skinned female faces like mine. ImageNet-trained models label me a “bad person,” a “drug addict,” or a “failure.” ..."
