
The world is increasingly ruled not by governments or individuals, but by computer algorithms informed by huge amounts of data. Almost every question can be answered, purchase made, or date planned online, driven by data and complex technologies; in short, by AI. And this technology is increasingly involved in assessing who gets to vote, who gets credit, who is insured, and who is healed.

But there are serious inequalities in the data and algorithms underlying these technologies. Face recognition, for example, which is used in a variety of law enforcement and medical scenarios, does not work equally well for everyone. White men have a high chance of being correctly identified; women and people with a skin tone other than Caucasian have a much lower chance of being recognized correctly, and women with dark skin have the lowest recognition rate of all. To assess the risk of recidivism, law enforcement uses software that rates the risk of light-skinned people committing a new violent crime as significantly lower than that of non-white perpetrators. And software used to detect melanomas or other skin cancers has a significantly lower chance of identifying skin cancer on darker skin than on lighter skin.
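Disparities like these are typically surfaced by evaluating a model's accuracy separately for each demographic subgroup rather than as a single aggregate number. The following is a minimal, hypothetical sketch of such a disaggregated evaluation (the group labels and toy data are invented for illustration, not taken from any real audit):

```python
from collections import defaultdict

def accuracy_by_group(records):
    """Compute per-group accuracy from (group, predicted, actual) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        correct[group] += int(predicted == actual)
    return {g: correct[g] / total[g] for g in total}

# Toy data: a single aggregate accuracy (75%) would hide the fact
# that the model works far better for one subgroup than the other.
records = [
    ("group_x", 1, 1), ("group_x", 0, 0), ("group_x", 1, 1), ("group_x", 0, 0),
    ("group_y", 1, 0), ("group_y", 0, 0), ("group_y", 1, 0), ("group_y", 0, 0),
]
print(accuracy_by_group(records))  # {'group_x': 1.0, 'group_y': 0.5}
```

Reporting the per-group breakdown, rather than one headline number, is what allowed audits such as Joy Buolamwini's to demonstrate the gap described above.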

These biases have different origins. An important issue is that the datasets the software is trained on are not representative of the global population. As Joy Buolamwini has shown, this leads to a skewed ability to correctly recognize faces, particularly women's faces, and specifically those of Black women. Secondly, there are biases in the algorithms themselves: because the underlying work was done mostly on light-skinned patients, melanoma detection software is not trained to detect skin cancer on darker skin. And importantly, underlying cultural or humanitarian assumptions built into the software lead to misrepresentation of whole categories of people. Asking people for their ethnicity and offering only four options leaves large groups miscategorized and uncounted. Asking people to choose between two genders, similarly, does not allow people who are transgender or non-binary to be identified as such.
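The miscounting problem above follows directly from the data schema: once a form fixes a closed set of options, everyone who does not fit an option is either forced into a wrong bucket or dropped from the counts entirely. A small, hypothetical sketch (the option labels here are invented placeholders):

```python
from collections import Counter

# Hypothetical closed schema: only four allowed options on the form.
ALLOWED_OPTIONS = {"A", "B", "C", "D"}

def tally(responses):
    """Count responses; anything outside the schema lands in 'uncounted'."""
    return Counter(r if r in ALLOWED_OPTIONS else "uncounted" for r in responses)

responses = ["A", "B", "E", "F", "C", "E"]
counts = tally(responses)
print(counts["uncounted"])  # 3 people the schema simply cannot represent
```

Any downstream statistics computed from `counts` systematically misrepresent the three respondents the schema could not express, which is exactly the failure mode described for coarse ethnicity and binary gender fields.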

It is clear that change is needed. We need to radically diversify the group of people who build the software and gather the data that drive our interactions with commerce and society. That means new people doing new research, bringing different viewpoints to start from. African doctors working with African AI to detect melanomas in their patient populations would be a great start. We need people who are not men to build software for users who are not assumed to be (only) male. We need more scientists and more companies building more tools, making diverse contributions to the global economy.

Elsevier has been involved in a number of efforts in Africa to help support such change, including providing free and low-cost access to our medical and scientific content through Research4Life, a UN–publisher partnership. We have also been proud to provide key expertise and infrastructure support to the NEF's Scientific African Journal, launched in 2018 to create an African-driven, large-scale open access journal. And through the Elsevier Foundation we support many research and health projects, including Amref's Innovate for Life health tech solutions accelerator and MSF/Epicentre's African research center in Niger. But one of the partnerships we are proudest of celebrates talented women scientists in the developing world, such as Dr. Chao Mbogo, 2020 winner of the OWSD-Elsevier Foundation Award for early-career women scientists in developing countries. Dr. Mbogo, a computer scientist, Dean, and lecturer at Kenya Methodist University, was celebrated for her work designing techniques that help students in resource-constrained environments learn computer programming on mobile devices. Her research finds ways to circumvent limitations, such as small screens and small keypads, that make it difficult for students to use mobile phones for programming in areas where computers are not easily available. In Dr. Mbogo's words: "[…] designing technological tools that support learners is important and timely work, especially for students in developing countries who may not have much access to information or opportunities. This award has acted as a strong reminder to me to never stop holding the ladder up for others."

In short, the next Einstein could be someone from the African continent who finds a way to detect bias in systems and corrects it. As we move towards a global AI-driven economy, we hope that our support of the Next Einstein Forum and initiatives like it can help make the world we live in a more equitable and, ultimately, a more interesting one.