Ethics and artificial intelligence: women and LGBT+ people are a minority in AI teams

When developing algorithms, it is important to ensure that the data feeding these systems is diverse. Otherwise, the artificial intelligence built from that information can replicate and even amplify existing biases. Hence it is increasingly important to take the ethical dimension into account when creating this kind of technology.
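To make the mechanism concrete, here is a minimal, hypothetical sketch in Python (not drawn from any study cited in this article) of one simple check a development team might run: the demographic parity gap, the difference in positive-prediction rates between groups in a model's outputs. The dataset, column names, and numbers are invented for illustration.

    import pandas as pd

    def demographic_parity_gap(df: pd.DataFrame, group_col: str, pred_col: str) -> float:
        # Difference between the highest and lowest positive-prediction
        # rates across groups; 0.0 would mean every group is treated alike.
        rates = df.groupby(group_col)[pred_col].mean()
        return float(rates.max() - rates.min())

    # Hypothetical outputs of a hiring-screening model: 1 = "advance candidate".
    predictions = pd.DataFrame({
        "gender":     ["woman", "woman", "man", "man", "man", "woman", "man", "woman"],
        "prediction": [0,       1,       1,     1,     1,     0,       0,     0],
    })

    gap = demographic_parity_gap(predictions, "gender", "prediction")
    print(f"Demographic parity gap: {gap:.2f}")  # men 0.75 vs women 0.25 -> 0.50

A non-zero gap does not by itself prove discrimination, but it is the kind of signal that diverse, ethics-trained teams are better positioned to look for and question before a system is deployed.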

The problem is that, although companies today are aware of this situation, certain obstacles and limitations have yet to be overcome before more inclusive technologies can be achieved.

According to a recent study by IBM, conducted in conjunction with Oxford Economics, 68% of organizations surveyed acknowledge that having a diverse and inclusive workplace is important to mitigating bias in AI.

Nevertheless, the findings indicate that AI teams remain substantially less diverse than their organizations’ overall workforce: 5.5 times less inclusive of women and 4 times less inclusive of people from the LGBT+ community, among other groups.

“As many companies today use AI algorithms in their business, they face increasing internal and external demands to design these algorithms to be fair, secure, and trustworthy; however, there has been little progress in the industry in incorporating AI ethics into its practices,” said Jesús Mantas, Global Managing Partner at IBM Consulting.

The report, which is based on a survey of 1,200 executives across 22 countries and 22 industries, does not detail why companies build less diverse AI teams, but the finding can be read in light of other data on the topic.

According to a UNESCO report, only 33% of women in higher education worldwide choose scientific and technological careers. The international organization specified that only 3% of female students entering higher education choose information and communication technologies, and 5% choose natural sciences, mathematics and statistics.

Another 8% of female students opt for engineering, manufacturing and construction, while 15% choose careers related to health and well-being, such as medicine or nursing.

“There are numerous obstacles associated with these educational paths, from the stereotypes girls face to family responsibilities and the prejudice women face when choosing their field of study,” UNESCO highlights in the report.

Stereotypes are built through advertising, publications, and the stories repeated across society and the media. Several researchers have pointed out that the advent of the personal computer in the 1980s, and its promotion in advertising as a product belonging to the world of men, created patterns of exclusion for women.

Marketing studies conducted during the 1980s and 1990s found that boys were more likely than girls to be encouraged by teachers and family members to study math, science, and technology. Computing became tied to that segment, and a narrative took hold that technology “was a thing for men,” leading fewer and fewer women and members of other communities to devote themselves to these disciplines.

Furthermore, a study conducted by researchers from universities in Michigan and Philadelphia concluded that LGBT+ professionals in STEM (science, technology, engineering, and math) disciplines were more likely than their non-LGBT+ peers to experience career limitations, harassment, and professional devaluation. They were also more likely to report intending to leave STEM fields as a result.

“The stereotypes we consume create a horizon of possibilities and determine what is expected of each gender. From these images we define how we perceive others and ourselves, and they even predispose us to adopt certain attitudes and make certain decisions,” wrote Laura Mangifesta, a communicator specializing in technology, in an article analyzing the gender gap in the field.

“One of the causes of algorithmic bias is the lack of diversity in the teams that develop AI; the other is those teams’ lack of training in gender issues. Often it is not enough to add more diversity: it is also necessary to train developers so that they are aware of this situation and take a more inclusive approach,” says Cecilia Danesi, a professor and researcher in artificial intelligence and law, in conversation with TechMarkup.

And she adds: “It is important to highlight that training in social sciences, ethics and diversity should be included in degree programs linked to the development of this type of technology, because algorithms impact the life of society in multiple ways.”

What to do about this situation

The first step toward change is to acknowledge the inequality that exists in AI teams, something that almost 7 out of 10 companies surveyed already recognize, as mentioned above. At the same time, 88% of Latin American leaders recognize the importance of ethical AI.

The next step is to take concrete initiatives to add more diversity to the teams developing algorithms that can condition different aspects of life. To that end, the IBM study recommends several actions for business leaders, including the following:

1. Take a multidisciplinary and collaborative approach: ethical AI requires a holistic approach and comprehensive skill set from all parties involved in the process. C-suite executives, designers, behavioral scientists, data scientists, and AI engineers each have a different role to play on the journey to trustworthy AI.

2. Establish AI governance to operationalize AI ethics: take a holistic approach to incentivize, manage and govern AI solutions throughout their lifecycle, from establishing the right culture to foster responsible AI, to practices and policies for products.

3. Reach beyond the company itself to partner: broaden the approach by identifying and engaging key AI-focused technology partners, academics, startups, and other ecosystem partners to establish “ethical interoperability.”
