Help us make these practical and urgent recommendations for inclusive algorithms a reality.
We call on governments, the private sector, and civil society organizations to:
1. Advocate for and adopt guidelines that establish accountability and transparency for algorithmic decision making (ADM) in both the public and private sectors.
2. Take clear, proactive steps to include women and girls, in all their intersectional variety and in equal numbers, in the creation, design, and coding of ADM.
New technologies offer new opportunities, including the creation of genuinely new structures that demand new ideas and new teams. Yet gender roles that are being dismantled in the real world are being baked into new ADM systems, along with old and stereotypical conceptions and associations of gender, race, and class. Innovative and inclusive thinking is necessary, and the imagination and skill can be provided by the largest untapped intellectual resource on the planet: women and girls.
Gender balance in AI decision making: Gender balance should be put on the official agenda of everyone involved in the funding, design, adoption, and evaluation of ADM.
Gender balance in design teams: Employ a robust range of intersectional feminists in the design of ADM systems; this will trigger and assist greater innovation and creativity, as well as the detection and mitigation of bias and harmful effects on women, girls, and those traditionally excluded.
Disclosure and incentives: Require companies to proactively disclose and report on gender balance in research and design teams, including upstream when applying for grants, and incentivize teams that are balanced and multi-disciplinary.
Research fund: Create a research fund to explore the impacts of gender and AI, machine learning, bias, and fairness, taking a multi-disciplinary approach that goes beyond the computer science and economics lens to include new ways of embedding digital literacy and to study the economic, political, and social effects of ADM on the lives of women and those traditionally excluded from rule making and decision taking.
3. Pursue international cooperation and an approach to ADM and machine learning grounded in human rights.
Mass-scale correction of skewed data will require multilateral and international cooperation to ensure we leave no one behind.
A review across UN agencies of the application of existing international human rights laws and standards to ADM, machine learning, and gender: This can guide and provoke the creative thinking needed for an approach grounded in human rights that is fit for purpose in the fast-changing digital age.
Development of a set of metrics for digital inclusiveness: To be urgently agreed, measured worldwide, and detailed with sex-disaggregated data in the annual reports of institutions such as the UN, the International Monetary Fund, the International Telecommunication Union, the World Bank and other multilateral development banks, and the OECD.