There’s no excuse for racist app filters. This hasn’t stopped Snapchat from repeatedly playing dumb by launching “anime,” “Bob Marley,” “white Frida Kahlo,” and “blue-eyed Rosa Parks” filters. (As has been pointed out, even Snapchat’s perennial fave flower crown filter is problematic.) But FaceApp, fresh off the back of a “hotness” filter that basically just lightened skin tone, isn’t even trying to pass off its latest feature as an accident. The new update is literally just a set of ethnicity-change filters. They’re as terrible as you’d expect, and are being received in an equally unsurprising manner.
The FaceApp 2.0 update now gives users the option of applying Black, Indian, Asian, and Caucasian filters to their selfies. FaceApp CEO Yaroslav Goncharov told TechCrunch that the filters were “designed to be equal in all aspects,” which sounds a bit like saying that the filters aren’t problematic because they’re equally racist. “They don’t have any positive or negative connotations associated with them,” Goncharov continued. “They are even represented by the same icon. In addition to that, the list of those filters is shuffled for every photo, so each user sees them in a different order.” Unfortunately, blackface in a different grid position is still blackface.
Filters are just one example of facial recognition software’s race problem. In a roundtable discussion for Lenny Letter last year, tech-policy expert Meena Harris quizzed black female engineers on diversity in Silicon Valley start-ups, and the importance of improving it. Slack engineer Erica Baker cited facial-recognition software in cameras, and its failure to recognize black people, as an example of how flawed the final product can be if it’s only tested on white employees. Sometimes AI algorithms need to be checked for implicit bias. Other times, you should just scrap an idea before you even begin. Luckily, public outcry can help minimize the damage: Goncharov has now told Engadget that the new feature will be “removed in the next few hours.”