Google is using a new way to measure skin tones to make search results more inclusive

Google is teaming up with a Harvard professor to promote a new scale for measuring skin tone in hopes of addressing issues of bias and diversity in the company’s products.

The tech giant is working with Ellis Monk, assistant professor of sociology at Harvard and creator of the Monk Skin Tone Scale, or MST. The MST scale is designed to replace outdated skin tone scales that are biased toward lighter skin. When these older scales are used by tech companies to categorize skin color, it can lead to products that underperform for people with darker skin tones, Monk says.

“Unless we have an adequate measure of differences in skin tone, we can’t really build that into products to make sure they’re more inclusive,” Monk tells The Verge. “The Monk Skin Tone Scale is a 10-point skin tone scale that has been deliberately designed to be much more representative and to include a wider range of different skin tones, especially for people [with] darker skin tones.”

There are many examples of tech products, especially those using AI, that work less well for darker skin tones. These include apps designed to detect skin cancer, facial recognition software, and even the machine vision systems used by self-driving cars.

While there are many ways to program this type of bias into these systems, one common factor is the use of outdated skin tone scales when collecting training data. The most popular skin tone scale is the Fitzpatrick scale, which is widely used in academia and AI. This scale was originally designed in the 1970s to classify how people with paler skin burn or tan in the sun and was only later expanded to include darker skin.
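
An easy way to see how this plays out is to audit the label distribution of a training set: if the examples cluster in a scale’s lighter categories, the gap is visible before a model is ever trained. Below is a minimal Python sketch of such an audit, assuming a hypothetical dataset whose examples carry Fitzpatrick-type labels; the data format is an illustration, not any company’s actual pipeline.

```python
# Count how a (hypothetical) training set is distributed across the six
# Fitzpatrick types. A skew toward lighter types shows up immediately.
from collections import Counter

FITZPATRICK_TYPES = ["I", "II", "III", "IV", "V", "VI"]

def tone_distribution(examples):
    """Return the share of training examples per Fitzpatrick type."""
    counts = Counter(ex["skin_tone"] for ex in examples)
    total = sum(counts.values()) or 1  # guard against an empty dataset
    return {t: counts.get(t, 0) / total for t in FITZPATRICK_TYPES}

# Types V and VI at or near zero flags the bias before training begins.
print(tone_distribution(
    [{"skin_tone": "II"}, {"skin_tone": "II"}, {"skin_tone": "III"}]
))
```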

This has led to criticism that the Fitzpatrick scale fails to capture a full range of skin tones, and may mean that when machine vision software is trained on Fitzpatrick data, it is also biased toward lighter skin.


The 10-point Monk skin tone scale.
Image: Ellis Monk/Google

The Fitzpatrick scale is made up of six categories, but the MST scale extends it to 10 different skin tones. Monk says this number was chosen based on his own research to balance diversity and ease of use. Some skin tone scales offer more than a hundred different categories, he says, but too many choices can lead to inconsistent results.

“Usually, if you get past 10 or 12 points on these types of scales [and] ask the same person to repeatedly choose the same tones, the more you increase the scale, the less people are able to do that,” says Monk. “Cognitively speaking, it becomes really difficult to differentiate accurately and reliably.” A choice of 10 skin tones is much more manageable, he says.
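
In software terms, categorizing a color against a scale like this can be as simple as a nearest-swatch lookup. The sketch below is a minimal illustration: the RGB values are stand-ins for the ten MST swatches (the authoritative values are published at skintone.google), and it ignores the much harder problem of estimating a person’s tone under real-world lighting.

```python
# Illustrative RGB stand-ins for MST-1 .. MST-10; check skintone.google
# for the authoritative swatches before using these for anything real.
MST_SWATCHES = {
    1: (246, 237, 228), 2: (243, 231, 219), 3: (247, 234, 208),
    4: (234, 218, 186), 5: (215, 189, 150), 6: (160, 126, 86),
    7: (130, 92, 67),   8: (96, 65, 52),    9: (58, 49, 41),
    10: (41, 36, 32),
}

def nearest_mst(rgb):
    """Return the MST category whose swatch is nearest in RGB space."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(MST_SWATCHES, key=lambda k: sq_dist(MST_SWATCHES[k], rgb))

print(nearest_mst((100, 70, 55)))  # -> 8 with these placeholder swatches
```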

Creating a new skin tone scale is only the first step, however, and the real challenge is integrating this work into real-world applications. To promote the MST scale, Google has created a new website, skintone.google, dedicated to explaining research and best practices for its use in AI. The company says it is also working to apply the MST scale to a number of its own products. These include its “Real Tone” photo filters, which are designed to work better with darker skin tones, and its image search results.

Google will allow users to refine certain search results using selected skin tones from the MST scale.
Image: Google

Google says it’s introducing a new image search feature that will allow users to narrow searches based on skin tones categorized by the MST scale. So, for example, if you search for “eye makeup” or “bridal makeup look,” you can then filter the results by skin tone. In the future, the company also plans to use the MST scale to check the diversity of its results so that if you search for images of “cute babies” or “doctors”, you won’t just see white faces.
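
Mechanically, that refinement amounts to filtering results on a precomputed tone annotation. Here is a minimal sketch, assuming a hypothetical result format rather than Google’s actual search API:

```python
# Each result carries a precomputed MST annotation (a hypothetical
# stand-in for however Google actually labels its image index).
def refine_by_tone(results, selected_tones):
    """Keep only image results annotated with one of the selected MST tones."""
    return [r for r in results if r["mst_tone"] in selected_tones]

makeup_results = [
    {"url": "look-a.jpg", "mst_tone": 3},
    {"url": "look-b.jpg", "mst_tone": 8},
    {"url": "look-c.jpg", "mst_tone": 9},
]
# A user selecting the darker end of the scale sees only looks b and c.
print(refine_by_tone(makeup_results, selected_tones={8, 9, 10}))
```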

“One of the things we do is take a set of [image] results, understand when those results are particularly consistent across a few tones, and improve the diversity of results,” Tulsee Doshi, Google product manager for responsible AI, tells The Verge. Doshi stressed, however, that these updates are at a “very early” stage of development and have not yet been rolled out across the company’s services.
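
One way to read Doshi’s description is as a re-ranking pass: measure how concentrated the top results are across tones, then promote results from underrepresented ones. The greedy strategy, annotations, and threshold below are assumptions made for the sake of the sketch, not Google’s implementation.

```python
# Reorder (hypothetically annotated) results so that no single MST tone
# takes more than max_share of the top-k slots.
from collections import Counter

def diversify(results, top_k=10, max_share=0.5):
    """Greedily reorder results to cap any one tone's share of the top-k."""
    reordered, counts, pool = [], Counter(), list(results)
    while pool and len(reordered) < top_k:
        # Take the highest-ranked result whose tone is not yet
        # overrepresented, falling back to plain rank order once
        # every tone is saturated.
        pick = next(
            (r for r in pool if counts[r["mst_tone"]] < max_share * top_k),
            pool[0],
        )
        pool.remove(pick)
        counts[pick["mst_tone"]] += 1
        reordered.append(pick)
    return reordered + pool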

That caution is warranted, not just for this specific change but for Google’s approach to addressing bias in its products more generally. The company has a patchy history with these issues, and the AI industry as a whole has a tendency to promise ethical guidelines and safeguards and then fail to follow through.

Take, for example, the infamous Google Photos error that led to its search algorithm tagging photos of Black people as “gorillas” and “chimpanzees.” This error was first noticed in 2015, yet Google confirmed to The Verge this week that it still hasn’t fixed the underlying problem but has simply removed those search terms. “While we’ve improved our models significantly based on feedback, they’re still not perfect,” Google Photos spokesperson Michael Marconi told The Verge. “In order to avoid this type of error and potentially causing additional harm, the search terms remain disabled.”

Introducing these kinds of changes can also be culturally and politically fraught, reflecting broader difficulties in integrating this sort of technology into society. In the case of filtering image search results, for example, Doshi notes that “diversity” can look different in different countries, and that if Google adjusts image results by skin tone, it may need to vary those results by geography.

“What diversity means, for example, when we show results in India [or] when we show results in different parts of the world, is going to be inherently different,” says Doshi. “It’s hard to necessarily say, ‘oh, this is the exact set of good results we want,’ because that will differ per user, per region, per query.”

Introducing a new, more inclusive scale for measuring skin tones is a step forward, but much trickier questions involving AI and bias remain.

