Artificial intelligence has a gender-bias problem - just ask Siri
If you would like to obtain a copy of this Research Output, please contact Hanlie Baudin at firstname.lastname@example.org.
All the virtual personal assistants on the market today come with a default female voice and are programmed to respond to all kinds of suggestive questions. Does their design as stereotyped females suggest that, in the midst of a global technological revolution, women remain trapped in the traditional roles and personas of the past?
Related Research Outputs:
- Whose right is it anyway? Equality, culture and conflicts of rights in South Africa
- Gender inequality persists in artisan employment in South Africa
- Social impact assessment of development projects
- Book review: Bradby, H. & Hundt, G.L. (eds). 2010. Global perspectives on war, gender and health: the sociology and anthropology of suffering. Surrey: Ashgate Publishing. 157 p. ISBN 9780754675235
- Gender equality and Curriculum 2005
- Men, take a stand
- The great leap sideways: gender, culture and rights after 10 years of democracy in South Africa
- Gender equity in South African education 1994-2004: conference proceedings
- Correcting gender inequalities is central to controlling HIV/AIDS
- Work value change in South Africa between 1995 and 2001: race, gender and occupations compared
- Education and health services (including HIV/AIDS and gender)
- A class act - mathematics as filter of equity in South Africa's schools
- Gender and HIV vaccine trials: ethics and social science issues
- The role of the chapter 9 institutions in the promotion and protection of gender equality in South Africa
- Teacher education and the challenge of diversity in South Africa
- Masculine bodies, feminine symbols: challenging gendered identities or compulsory femininity?
- HIV/AIDS and 'othering' in South Africa: the blame goes on
- Gender mainstreaming: a research ethics issue?
- Gender, development and transport in rural South Africa: methodological, policy and implementation challenges