Repeat after me: an AI is only as fair as the data it learns from. And all too often, the world is unfair, with its myriad gender, race, class, and caste biases. Now, fearing one such bias might affect users and rub them the wrong way, Google has blocked Gmail's Smart Compose AI tool from suggesting gender-based pronouns.

Introduced in May, Gmail's Smart Compose feature automatically completes sentences as users type them. You can see an example below, with Gmail's suggested words in gray. If you still haven't enabled the feature in Gmail, you can learn…
This story continues at The Next Web
from The Next Web https://ift.tt/2r6a5zp