“Algorithms of Oppression: How Search Engines Reinforce Racism” by Safiya Umoja Noble examines how the algorithms used by sites like Google and Yelp shape and reinforce race and gender relations. The book focuses on Google’s search engine and the algorithm behind it, but Dr Noble makes it apparent that the same problems arise in other uses of algorithms. Dr Noble is an Associate Professor at the University of California, Los Angeles (UCLA) in the Department of Information Studies. She approaches the topic of using algorithms to access information as an expert in information studies as well as a Black feminist. This focus allows her to show how the problematic nature of Google’s search algorithms is part of a history of biased information categorization that has harmed racialized and gendered minorities. Dr Noble argues that the problems with Google’s algorithms are not problems of math or logic; they are human problems with human solutions.

Algorithms of Oppression book cover

As a future computer professional, I found this book both engaging and essential. It is a few years old at this point (it was published in 2018, and even then Google’s search engine had begun to change, eliminating many of the problematic racist and sexist search results), so it now functions more as a history of the algorithms used than as a critique of the algorithms Google currently uses. Even so, the arguments and examples Dr Noble raises are essential to think about: they highlight the need for human oversight in the deployment of new technologies, and the danger of assuming that just because an algorithm ‘works’, that is good enough.

Dr Noble highlights the issues with trusting a private corporation, as we do with Google, to distribute information on the internet. Google’s primary interest is not in providing its users with accurate knowledge, but in increasing its profit margins. Because of this, the search results Google provides have historically favoured the interests of advertisers. Yet most people trust that Google will return accurate information related to their requests. Both Google and its users have treated the algorithm behind Google searches as if it were some mysterious or magical thing whose results Google has no responsibility for or control over. Throughout the book, Dr Noble shows that this is not true, and that Google’s refusal to take responsibility for problematic search results, such as returning links to porn sites when searching for ‘black girls’, is not acceptable. Developers of technology must take responsibility for their creations. Even if an algorithm produces outcomes that were not intended, the developers and the companies behind them must take responsibility and commit to improving their algorithms.

Throughout the book, Noble reminds the reader of the human element: real people are affected by these algorithms, and there are real people behind them. Computer professionals must remember that the algorithms we use and develop are no more infallible than we are. People often view algorithms as revealers of objective truth. This idea of algorithms as truth-seekers has led to the belief that the first page of a Google search contains information that is true, relevant, and a reflection of society. If a search for ‘Jew’ returns a list of anti-Semitic webpages, the user might believe that the world must be full of anti-Semites, or that the beliefs of the anti-Semites are correct. This reliance on the supposed objectivity of algorithms plays into the myth that the internet is a place where objective truth prevails, a myth that is even more persuasive when a search returns information that agrees with ideas already held by the majority. We must encourage people to think critically about the use of algorithms in society, and to that end it is essential to educate people on what algorithms are and what their shortcomings are.

Dr Noble points out that when a search for ‘black girls’ returns links to pornography, the search engine is not just returning misleading and inappropriate information; it is misrepresenting a group of people. The issue of representation is a central theme in the book. When search results for racialized and gendered minorities return inappropriate and misleading information from sources outside of those communities, we rob the people who belong to those communities of a chance at self-representation. Because so many people trust the algorithms behind search engines, they do not go beyond the first page. If that first page does not return links to the people it is supposed to represent, the search engine is silencing those voices. Noble presents many examples of how this lack of representation has harmed people both as individuals and as groups.

One of the most intriguing observations Dr Noble makes is that most of the problems she raises regarding Google’s search algorithms are nothing new to the information sciences. The same issues are found in library indexing systems, which show a distinct bias toward the dominant majority while underrepresenting minorities, and the same lack of representation could be found in television and movies. Dr Noble shows that there is a need in the tech industry for library and information science experts, as well as for members of underrepresented minorities. These experts can point out flaws in the systems we use that might never have occurred to us.

In the end, she calls for greater regulation of tech companies such as Google that control the flow of information on the internet, and an end to the monopoly that the tech giants hold in that information space.

Dr Noble’s book highlights the importance of considering other points of view on the technology we create. The algorithms we use can and do have wide-reaching implications for the lives of many. We are not always aware of the effects of new technology, and those effects can be dire. Something that seems simple, like a search engine, can change how people access information and influence the kind of information that is available.

As developers, we may not always be able to consult experts in the social sciences to vet our products or advise us on how they may affect people. Still, it is essential to be empathetic: think about how the algorithms we are using might affect others, encourage criticism from the public, and listen to the criticism presented. Dr Noble’s book is an indispensable read that reminds us to stay empathetic and remember the human element. The algorithms we use and create will affect others; we should make sure that the effect they have is a positive one.