AI is increasingly a feature of everyday life. But with its models based on often outdated data and the field still dominated by male researchers, AI is perpetuating sexist stereotypes even as its influence on society grows.
As you might expect, the answer is no, but the data it’s trained on probably is.
Relatedly, remember last year when Google told Gemini to make generated people racially diverse, and it started producing Black Nazis, Black popes, and Asian Vikings?
And this is why anyone who suggests that machine learning systems can be used to make social decisions of any kind should be laughed out of the room. Even when the system "programs itself," its goals are set by people and it is trained on data that people generated and selected.