Daily Archives

Week 3

Posted by Carlton Yuan on

Algorithms of Oppression by Safiya Noble is about how algorithms and technology have promoted racism and sexism. The author discusses problems with Google’s search engine to support her argument. I found this book excerpt really interesting because I had never considered Google’s search engine to be racist or problematic. One example of the problem with Google’s algorithms was African Americans being tagged as apes or animals. Another example was that searching the word N*gger on Google Maps returned the White House as a result. Another problem the author discussed was whether the information on Google can be trusted. Often, when searching for something on Google, a paid advertisement is the number one result. I agree that this is a problem because most people click the first link believing it is the most popular and trusted result, when it is actually a paid ad.

I wanted to learn more about Google’s search engine, so I watched a YouTube video of Google’s CEO Sundar Pichai responding to Congress. One congresswoman asked why, when searching the word “idiot” under Images, a picture of Donald Trump comes up. Pichai said that the search engine matches the keyword against millions of online pages. Google determines which pictures best match the keyword based on relevance, popularity, and how other online users are using the word. Pichai also said that no one manually chooses pictures to match with keywords. I believe the problem is not only Google’s algorithms but also the people who use the internet. Google handles about 3.5 billion searches in a single day, so it would be difficult to monitor racist or sexist search results. It would also be hard to program the search engine to differentiate what is racist from what is not, since it is not human.

Geetangalie’s Week #3 Response

Posted by Geetangalie Goberdan on

Star Trek: The Next Generation’s episode “The Measure of a Man” is based around an argument over whether an artificial intelligence deserves human-like rights. Lieutenant Commander Data is an android, an automaton made to resemble a human being, built by Dr. Noonien Soong. Commander Bruce Maddox has come to take Data so he can study him and learn how to create more beings like him. Since Data is the only one of his kind, Maddox would need to take him apart and risk not being able to put him back together successfully. On Captain Picard’s advice, Data decides to resign so he will not be subjected to the procedure. This sets off a debate over whether Data is the property of Starfleet Command and therefore does not have the right to resign. Opposing the ruling, Captain Picard requests a hearing, at which Commander Maddox regards Data as non-sentient. When asked how to define someone as sentient, Maddox says they must possess three qualities: intelligence, self-awareness, and consciousness. Picard argues that there is no way to prove Data does not have even the smallest amount of all three qualities. The Starfleet court ultimately rules that Data has the right to choose. This case is a difficult one to form an opinion on, since artificial intelligences are generally created for their intelligence, not to mimic humans. But an android specifically is made in the image of a human, and as androids develop human-like emotions, their desire for rights becomes understandable.

Algorithms of Oppression by Safiya Noble is an eye-opening work because it sheds light on a topic many people are ignorant of. The book centers on racial discrimination against people of color, and in particular women of color, in technology. Noble explains how Google’s search engine is a prime example of this racism and sexism. Google has, almost unnoticed, used its search engine to shape our minds and lives by making us associate particular words with negative racist and/or sexist connotations. In turn, the results have been biased toward the white community, giving it privilege by always portraying it in a “perfect” light.
