On Feb 21, our CEO/Founder Caitlin Kraft-Buchman participated in the Global Digital Compact consultation hosted by Finland and the Generation Equality Action Coalition on Technology & Innovation for Gender Equality.
Remarks by Caitlin Kraft-Buchman
CEO / Founder, Women At The Table / <A+> Alliance for Inclusive Algorithms
Many thanks for inviting us here today and hosting this important conversation. Women At The Table and the <A+> Alliance for Inclusive Algorithms embrace the positive multi-purpose uses of AI and new emerging technologies, but we also acknowledge the unintended consequences of all technologies, from the clock to the printing press, the cotton gin to the nuclear bomb. To quote Marshall McLuhan: “We shape our tools, and thereafter our tools shape us.”
And so, we must register a loud alarm regarding algorithms wiring historic bias, inequality and discrimination into our new economic, governance, and social systems.
It is widely known that most algorithms are trained on incomplete, corrupt, or biased data; machines learn and codify these ingrained bias patterns; and neural networks, which detect even more complex patterns, proxy old stereotypical assumptions – thus deeply entrenching inequalities at global scale, with extraordinary speed. All of this is a threat to peace and stability.
Efforts to remove – or, more accurately, suppress – algorithmic bias (because machine learning cannot unlearn bias; it can only overwhelm it with more information), when successful, happen only one language at a time. So basically, English.
That means that lower-resourced languages (which here would include all official languages of the UN, not to speak of Hindi, Swahili, Yoruba, Bengali, or Thai) will still have gender roles that are being dismantled ‘in real life’ wired into new technology, with old associations of gender, race, class, and caste embedded in the code. This amounts to the worst of an old world order being reasserted.
Therefore, we recommend:
- A stand-alone section on gender and women & girls in the Global Digital Compact (GDC).
- The creation of new datasets that focus not only on quantity, but on quality, for public use with the UN leading construction.
- A mandate for the use of Algorithmic Impact Assessments that include Human Rights Impact Assessments, both ex ante and post facto.
- The requirement of human involvement in all automation systems where there is a risk of discrimination or bias.
- The right to know when algorithmic decisions affecting a person have been made, including consent to and contestability of those systems.
- Capacity building through mandatory training for anyone who uses AI-informed decision-making in their work, or who develops or inputs data into automation systems.
- The use of public procurement (which accounts for more than 30% of GDP) to jumpstart new industries and training, expanding definitions of what ‘expertise’ is so that social scientists and those with lived experience can influence the design and deployment of new tech.
- A meaningful seat for civil society at the table.
- The creation of a research fund to explore the impacts of gender and AI on the lives of women and all those traditionally excluded from rule making and decision taking.