The ‘CEO’ is a man: How Chinese artificial intelligence perpetuates gender biases

As artificial intelligence becomes increasingly integrated into modern society, evidence suggests that algorithms often reproduce the cultural biases they are meant to help eliminate, and Chinese companies are no exception.

A report published on Monday (Oct 1) by the Mana Data Foundation, a Shanghai-based public welfare foundation, and UN Women, found systematic prejudices against women in many programmes.

For example, on major Chinese search engines like Baidu, Sogou and 360, words like “engineer”, “CEO” or “scientist” returned mostly images of men.

Furthermore, searching keywords like “women” or “feminine” often returned derogatory videos and photos, or links to content such as “a woman’s sexual techniques” and information about vaginas.

The report’s purpose was to provide concrete evidence of gender discrimination in AI algorithms so that companies can recognise the problem and fix it.

It documented cases of gender discrimination in new media, search engines, open-source code, employment algorithms and consumption models.

Globally, artificial intelligence is often accused of perpetuating cultural biases. For example, US authorities have used artificial intelligence to predict recidivism rates, or the likelihood that a convicted criminal will reoffend.

When ProPublica analysed a product called COMPAS, it found that black people were twice as likely to be labelled “high risk” as white people with the same criminal background.

The Chinese report found that deep learning algorithms often missed offensive content directed towards women, such as phrases like “besides giving birth, women are useless”, but were good at catching pornography and violent imagery.

Online advertisers also use algorithms to target their campaigns and boost sales, but these can easily objectify women. On an unnamed e-commerce platform, searching for beer led to an advertisement from a beer company featuring a semi-pornographic picture of a woman, the report said.

Kuang Kun, an expert on the project, said: “Gender discrimination exists in algorithms because the data collected to train AI reflects the discrimination that exists in the human world, and because algorithm engineers lack awareness, they do not include solutions.”

A survey in the report found that 58 per cent of respondents working in AI-related fields did not know gender discrimination existed in algorithms, and 80 per cent did not know how to solve the problem.

However, change is possible. In one case, Baidu, the largest search engine in China, linked anti-domestic-violence resources to 11 keywords and phrases typically searched by people in abusive relationships.

The company also revised 176 job descriptions that discriminated against women during an internal clean-up.

Citing occupational gender segregation, the report said Chinese society should provide more educational opportunities for women, and that companies should give men and women equal training and promotion opportunities.

In 2019, 89.4 per cent of computer programmers were men and 10.4 per cent were women, according to the China Internet Information Centre.

As for the algorithms themselves, the report said companies need to reduce developer bias and be transparent about what an algorithm does and how it uses data.

Companies can also create mechanisms such as feedback channels, and rely on human judgment rather than an algorithm when making important decisions.

In 2018, a paper from MIT and Stanford University also examined race and gender in facial recognition technology.

The team examined three commercially released facial-analysis programmes and found an error rate of 0.8 per cent for light-skinned men versus 34.7 per cent for dark-skinned women.

In April 2020, Google announced that it was dedicating energy towards fixing gender biases in its translation tool.

For example, the tool had incorrectly identified Marie Curie as a man during the translation process. Google is building a data set to help improve Translate’s machine learning regarding gender biases.
