Since the creation of the camera, photography has been technologically optimized to capture white people best. Engineers at Google are trying to change that.
At its developer conference, Google I/O, on Tuesday, the company announced that it is reworking the algorithms and training data that power the Pixel camera so it captures people of color more accurately and vibrantly.
Specifically, it is working to light people with darker skin better and to represent skin tone more accurately, and to make silhouettes of people with wavy or curly hair stand out more sharply from the background.
Google isn't the only company facing a technological reckoning with racial bias. Just last month, Snap announced it was reworking its camera software to better represent people of color.
Google is calling its project "Image Equity." Like Snap, the company worked with outside experts in photography and representation to guide the undertaking.
Some of the changes involve training the algorithms that render photos on a more diverse dataset, so that white people and white skin aren't the default definition of "person." Google will also tune the Pixel's auto white-balance and auto-exposure so they render darker skin tones more faithfully.
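Google hasn't published the details of those adjustments, but as a rough illustration of what auto white balance actually does under the hood, here is a minimal "gray-world" white-balance sketch in Python using NumPy. The function name and the gray-world heuristic are illustrative assumptions, not Google's method; tuning, in this framing, means adjusting the per-channel gains (or the model that predicts them) against a more diverse set of faces.

```python
# Illustrative only: a generic gray-world auto white balance, not the Pixel pipeline.
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Rescale the R, G, B channels so their means match the overall mean.

    `image` is an HxWx3 float array with values in [0, 1]. A naive gray-world
    assumption can skew skin tones, which is why these gains need tuning
    against diverse subjects.
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)       # mean R, G, B
    gains = channel_means.mean() / (channel_means + 1e-8)   # per-channel gain
    return np.clip(image * gains, 0.0, 1.0)
```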