Abstract
This paper contributes to existing research on digital sustainability and social inclusion, with a focus on gender-diverse minority groups in society. Its aim is to shed light on the misrepresentation, misinterpretation and mislabeling of gender and sex data in the data sets used to fuel decision-making systems that impact society. The paper focuses on the diversity of gender outside the cisgender male/female binary, and on how machine learning algorithms act as a vehicle for reigniting negative bias towards the LGBTQI+ community even though civil rights have moved forward. A virtual focus group and an online survey questionnaire were used to carry out this research. The paper highlights that gender is personal, a sense of self, and cannot be algorithmically identified by mining data on the internet; attempting to do so violates the recognition and human rights that the LGBTQI+ community has obtained in society. The findings also suggest that there is a gender diversity gap, beyond cisgender females and males, among those working in IT and AI roles. This paper contributes to the IFAC discourse on diversity, inclusion and structural marginalization in science and engineering.
| Original language | English |
|---|---|
| Pages (from-to) | 117-122 |
| Number of pages | 6 |
| Journal | IFAC-PapersOnLine |
| Volume | 55 |
| Issue number | 39 |
| DOIs | |
| Publication status | Published - 01 Oct 2022 |
| Event | 21st IFAC Conference on Technology, Culture and International Stability, TECIS 2022 - Kosovo. Duration: 26 Oct 2022 → 28 Oct 2022 |
Keywords
- algorithmic bias
- artificial intelligence
- digital sustainability
- genderqueer
- social inclusion