Ph.D. student Colin Madland was busy exposing a potentially racist feature in Zoom when he may have stumbled upon additional technological racism in Twitter’s new “smart auto-cropping” algorithm for image previews on the mobile app. Madland tweeted an image showing how Zoom kept removing the head of a Black faculty member during video calls, only for Twitter to crop the tweet’s preview so that it showed nothing but his own white face.
When Madland flipped the image to test whether Twitter simply preferred faces on the right side of a photo, the preview again focused on his face, leaving out the Black man.
This issue with Twitter’s auto-cropping feature appears to be part of a larger problem with technology that doesn’t work as well on dark skin. Facial recognition programs have already been shown to have trouble with Black faces, and even older tech like automatic soap dispensers can fail to recognize that Black skin is indeed skin that needs soaping.
As Madland points out, this issue has already led to at least one wrongful arrest and imprisonment.
Curious whether Madland’s experiment was just a fluke, other Twitter users soon ran their own tests using photos of Black and white public figures, stock photo models, cartoon characters, and even dogs.
In a head-to-head test between Barack Obama and Mitch McConnell, Twitter chose McConnell’s face for the preview in both long vertical images.
The same thing happened when the men changed their ties.
Even in an image with two Black stock photo models and one white one, Twitter ended up picking the white man for the preview.
Although many doubters have insisted that factors like tie or background color must explain the phenomenon, users showed time and again that Twitter seems to have a serious preference for white skin.
One of those testers was Twitter’s own chief design officer, Dantley Davis.
At least we know that Twitter is now aware of the problem. Whether they’ll do anything about it is another story.
Although the algorithm clearly has a skin color preference, it doesn’t choose the white individual 100 percent of the time. Some Twitter users have discovered strange anomalies in the system, such as a possible preference for people who wear glasses.
Some are blaming contrast and edge detection for the apparent racial preferences in the cropping algorithm, showing that these programs are much better at detecting edges on white faces than on Black ones. Of course, that’s a whole other problem that all facial recognition technology will have to confront as it becomes increasingly widespread.
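To see why contrast matters to simple edge detection, consider a minimal sketch (this is not Twitter’s actual saliency model, and the function and variable names here are purely illustrative): a gradient-based edge measure responds more strongly to a high-contrast region than to a low-contrast one against the same background, which is the mechanism critics say could disadvantage darker faces in darker scenes.

```python
import numpy as np

def edge_strength(image):
    """Mean gradient magnitude: a crude proxy for how strongly a
    simple edge detector responds to an image patch."""
    gy, gx = np.gradient(image.astype(float))
    return np.hypot(gx, gy).mean()

def make_patch(face_tone, bg_tone=0.9, size=32):
    """Toy grayscale patch: a uniform background with a centered
    square standing in for a face. Only the tone difference between
    square and background varies between tests."""
    patch = np.full((size, size), bg_tone)
    patch[8:24, 8:24] = face_tone
    return patch

high_contrast = make_patch(face_tone=0.2)  # strong tone difference vs. background
low_contrast = make_patch(face_tone=0.7)   # subtle tone difference vs. background

# The same detector yields a larger response for the high-contrast patch.
print(edge_strength(high_contrast) > edge_strength(low_contrast))  # True
```

The point of the toy example is only that edge response scales with local contrast; a cropping model that leans on such low-level cues, rather than being trained and audited for balanced performance across skin tones, could inherit exactly this kind of disparity.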
In response, Twitter has promised to do more analysis of its tech and to let researchers outside the company look into the problem.
“Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing,” said Twitter spokesperson Liz Kelley to Junkee. “But it’s clear from these examples that we’ve got more analysis to do. We’ll open source our work so others can review and replicate.”