Category Archives

12 Articles

Week #3 Response

Week 3

Posted by Gabriel Almonte on


The main point of this episode was to discuss ethics in technology. The episode featured a futuristic robot with many characteristics and functions similar to a human's; the robot could think and had feelings as well. From my understanding, everyone involved worked for the same organization, but people in different positions of power and different branches disagreed about what was fair to the robot and about the future of technology. One side fought for the robot to be left alone because he thought of the robot as a person; the robot had developed into what it was right in front of him, and he had grown a sentimental love for it. A developer from another branch viewed it as a piece of technology, so he had no problem with the risks that working on the robot posed; he was only worried about the future of technology. I agree with both sides. If developers can't work on new technology, it will be hard to keep making the same technological advances that are occurring right now. At the same time, people who develop these new advances can feel an attachment to a product they spent so much time on. I also agree that at some point some technology will be considered human because of how similar it is to normal people. One character compared the other branch's unethical beliefs to racism. I agree, because at a certain point these robots can become almost identical to humans, the only difference being the way each group is made. The question that must be resolved is at what point these robots are considered human.

Week #3 Response

Week 3

Posted by Carlton Yuan on

Algorithms of Oppression by Safiya Noble is about how algorithms and technology have promoted racism and sexism. The author examined problems with Google's search engine to support her argument. I found this book excerpt really interesting because I had never considered Google's search engine to be racist or problematic. One example of the problem with Google's algorithms was that African Americans were being tagged as apes or animals. Another was that searching the word N*gger on Google Maps would return the White House. The author also discussed whether the information on Google is real. A lot of the time when searching for something on Google, a paid advertisement is the number one result. I agree that this is a problem, because most people will click on the first link believing it is the most popular and trusted, when it is actually a paid ad.

I wanted to learn more about Google's search engine, so I watched a YouTube video of Google's CEO Sundar Pichai responding to Congress. One congresswoman asked why, when searching the word idiot under images, a picture of Donald Trump comes up. Pichai said that the search engine matches the keyword against millions of online pages. Google determines which pictures best match the keyword based on relevance, popularity, and how other online users are using the word. Pichai also said that no one manually chooses pictures to match with keywords. I believe the problem is not only Google's algorithms but also the people who use the internet. Google handles about 3.5 billion searches in a single day, so it would be difficult to monitor racist or sexist search results. It would also be hard to program the search engine to differentiate what is racist and what is not, since it is not human.

Week #3 Response

Geetangalie’s Week #3 Response

Posted by Geetangalie Goberdan on

Star Trek: The Next Generation's episode "The Measure of a Man" is built around an argument over whether an artificial intelligence deserves human-like rights. Lieutenant Commander Data is an android, defined as an automaton made to resemble a human being, built by Dr. Noonien Soong. Commander Bruce Maddox has come to take Data so he can study him as research on how to create another being like him. Since Data is the only one of his kind, Maddox would need to take him apart and risk not being able to put him back together successfully. On Captain Picard's advice, Data decides to resign so he will not be subjected to the procedure. This prompts a debate over whether Data is the property of Starfleet Command and therefore has no right to resign. Opposing the ruling, Captain Picard requests a hearing, where Maddox argues that Data is non-sentient. Asked how to define someone as sentient, Maddox says they must possess three qualities: intelligence, self-awareness, and consciousness. The counterargument is that there is no way to prove Data does not have even the smallest amounts of all three qualities. The Starfleet court ultimately rules that Data has the right to choose. This is a difficult case to form an opinion on, since artificial intelligences are generally created for their intelligence, not to mimic humans. But an android is specifically made in the image of a human, and as androids develop human-like emotions, their desire for rights is understandable.

Algorithms of Oppression by Safiya Noble is an eye-opening work, as it sheds light on a topic many people are ignorant of. The book centers on racial discrimination against people of color, in particular women of color, in technology. Noble explains how Google's search engine is a prime example of this racism and sexism. Google has quietly used its search engine to shape our minds and lives by making us associate particular words with negative racist and/or sexist connotations. It has, in turn, been biased toward the white community, granting it privilege by always portraying it in a "perfect" light.

Week #3 Response

Weekly #3

Posted by Jaspreet Jaswal on

In the text Algorithms of Oppression by Safiya Noble, Noble analyzes the mechanics of algorithms and how their embedded values shape social outcomes, including racial profiling. Noble first introduces the idea that algorithms confer a kind of visibility, which society then treats as decision-making. She gives examples of those involved in this "decision making," describing professions such as bankers and real estate agents and how they manufacture inequalities in their fields. Noble states that "people of color are more likely to pay higher interest rates or premiums just because they're black or Latino" (Noble 1). This shows how such algorithms demarcate a border of inequality rather than allowing everyone, no matter their creed or race, to flourish. Noble continues her argument by stating that even on the internet, racial discrimination is evident in the way technology is coded. Personally, this hit me, since it's honestly crazy how people find ways to spread these unethical statements as anonymous users. As technology advances, artificial intelligence will also absorb these statements. When Noble describes the workplace later, another aspect that stood out to me was Google's wage gap between men and women. Prior to reading Noble's text, I knew wage gaps existed, but I didn't know that a multibillion-dollar company really discriminates against its workers based on gender. As Google advances its technology, it's honestly shocking that some Google workers supported the insane claims made by James Damore that women are inexperienced and inferior at software programming. Using search engines to promote racial slurs is honestly disgusting and should be stopped right away.

 


Week #3 Response

Sameer’s Response #3

Posted by Sameer Kunwor on

Conversations about ethics and racism, A.I. and algorithmic ideologies

As Martin Luther King, Jr. said, "The ultimate measure of a man is not where he stands in moments of comfort and convenience, but where he stands at times of challenge and controversy." I related that quote to this awesome episode of TNG, 'The Measure of a Man'. What struck me was the way Data initially reacts to being told he has no rights. He takes what would, for any man, be a reason for outrage and instead approaches the situation purely with logic. He has strong opinions on the matter, but he doesn't get upset, because that's outside the scope of his ability to react. His reaction is based solely on the logical argument for his self-protection and his uniqueness. And at the end, after he has won, he holds no ill will toward Maddox; indeed, he can sort of see where Maddox is coming from. Overall, I agree that all beings are created, but that does not necessarily make them the property of their creator, and since no one is owned, everyone has the right to make their own decisions regarding their life.

Safiya Umoja Noble's Algorithms of Oppression is interested in the consequences of digital technology's cultural transition from visionary promise to the framework of everyday life. Although Noble focuses mostly on Google and its parent company, Alphabet, her argument applies equally to Amazon, Facebook, Twitter, and WordPress. Noble explains how Google replicates a harsh material history against black women through the mechanisms that operate its search engine and advertising policies. Although her primary focus is the relationship between Google and black American women, her argument always keeps an eye on other historical forms of oppression. It may be that the monopoly of Google and similar infotech corporations has grown large enough to creep into all public life, or it may be that racial, gendered, sexual, and class oppression in America has targeted black women so particularly that there are very few oppressive tactics that have not been applied to them at some point. In any case, Noble always addresses a wider public while she attends to the historical particulars of black American communities.

References:

https://via.hypothes.is/https://writing4engineers2019.commons.gc.cuny.edu/wp-content/blogs.dir/6105/files/2019/01/SAFIYA-NOBLE.pdf

Week #3 Response

Reem’s Week #3 Response

Posted by Reem Malek on

When Data resigns his commission rather than be destroyed for examination by an insufficiently skilled researcher, a formal hearing is convened to decide whether Data is property without rights or a sentient being. When Data's rights as a sentient individual are put on trial, Starfleet forces Riker into a position where he must prove that Data is only an android. I believe that androids shouldn't have any rights, since they are not people, can be unpredictable, and could overwhelm us humans. Data is treated as property of the fleet that cannot resign, on the grounds that he is a robot, not a human, and has no rights. A person can be considered someone who has the attributes of intelligence, self-awareness, and consciousness.

What I read in the excerpt demonstrated to me that we are feeding racism into the minds of the young through the web. The web has become a place where there is no filter on anything. Likewise, it isn't right that the people building programs for web search engines have the power to bake in sexist and racist attitudes. The episode begins with the crew members playing a game of cards; reading facial expressions is critical to card games. Judging from the title alone, I assumed the episode was about testing what man can achieve. Before finishing the episode, I read the excerpt about algorithms in search engines such as Google. I believe that one day computers will take over the world and bring humankind to extinction. Data creeps me out with his artificial tone, trying to sound like someone with no feelings in his voice. According to what Data said, artificial intelligence learns from experience, not from books; I believe they will never have genuine feelings like us living creatures.

 

Week #3 Response

QianXing’s Weekly Response #3

Posted by QianXing Ou on

The episode "The Measure of a Man" is mainly about a conflict over the rights of a machine. In this episode there is a robot named Data, a highly intelligent robot who is very similar to a human. Because of that, a commander wants to make Data his experimental subject: he wants to use Data to develop and recreate many robots just like him, but Data might be destroyed in the process. The captain of the starship disagrees, because he considers Data his crew member, not a robot. After a debate, the captain wins and Data is not killed.

Even though this episode implies that technology such as robots has rights just like human beings, and that not treating them well is like treating them as slaves, I don't really think that is realistic in the real world. First of all, machines are not slaves, even though we use them. They don't have a life, feelings, or anything else a normal human has. Also, they are created by human beings to make our lives easier; people do not create them in order to make more people or more lives. I also don't think humans would treat their machines with violence, since they are really expensive to get (at least for me), though updating them is reasonable. No one wants to spend 100 hours just waiting for a computer to restart. Therefore, comparing them with slaves, as the episode does, is not really appropriate. Lastly, it is not reasonable for machines to have freedom and do everything they wish, because even humans have unequal rights and freedom. People are discriminated against based on their race, color, or culture. If machines are granted rights, then what about those humans who are not? Does that mean they are even worse off than the machines? Thus, I really think that if we can't even secure our own rights, there's no point in granting rights to machines.

After reading the excerpt from Algorithms of Oppression, I feel really depressed. I never noticed that search engines are one of the weapons people use to attack other people. The discrimination online is much greater than I thought. I never knew people used words like "sex" to associate with black girls, and I feel really bad for them. I think we are all humans with nearly identical genes (other than skin color), so why would people discriminate against them? I am one of the victims of discrimination too: when I first came to America, I was discriminated against by my classmates because I didn't speak English. I know how it feels, so I don't want it to happen to anyone. There is also constant sexism online. I have seen many comments such as "girls should stay home," and I just don't understand it. A lot of girls are smarter than boys; girls and boys are the same apart from biological structure. Without girls, life would be a lot more difficult today. Therefore, please treat girls with care.

Week #3 Response

Week 3 response Ming Hin Cheung

Posted by Ming Hin Cheung on

Week 3 response

In Algorithms of Oppression, Safiya Umoja Noble points out that search engines like Google don't offer an equal playing field for all forms of ideas, identities, and activities. Noble suspects that some developers are willing to promote sexist and racist attitudes openly at work through search algorithms and architecture. One of the examples I saw in the reading shows that when you type "professor," the results are mostly males and not females, which shows that data discrimination is a real social problem. Noble argues that the combination of private interests in promoting certain sites, along with the monopoly status of a relatively small number of internet search engines, leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color. In my opinion, people shouldn't use color or sex to categorize certain people or groups; most search sites show certain results not only because of the search algorithms but also because people are searching for them. People should start to accept others for what's on the inside and not the outside. We need to understand that we are all equal, no matter our race or gender.

Week #3 Response

Hakeem Leonce Week 3 Response

Posted by Hakeem Leonce on

After reading the excerpt from Safiya Noble's Algorithms of Oppression, I came to a deeper understanding of the "numerical" influences on social grouping and their powerful impact on the things we see, believe, and, most importantly, follow. On a general scale, the digital decision-making that society calls fair, and free of race or sex bias, is and always will be created by men and women who carry the very biases they are attempting not to encode. So to believe that they can create this vacuum of ideal prediction without any prejudgments is false. These codes, and the underlying ethics of the technology, come from minds that are asked to predict the future. The only way to make a proper hypothesis is to study the past and present and make an educated guess about the future. If that is true, you cannot simply blur out the information of all the "others" of civilization, no matter the time period.

 

In the excerpt there was a part that stated, "At the very least, we must ask when we find these kinds of results, Is this the best information? For whom?" This short line captures everything algorithms represent. To ask a program to take into consideration all people, near and far, before making a decision is impossible. So ultimately, these outcomes will not only be for a specific group; they were also made by a specific group, adding to their inaccuracy for all types of individuals across the huge spectrum of men and women, which itself has blurred lines of identification.

 

Week #3 Response

Weekly Response #3

Posted by Tanvir Youhana on

In the Star Trek episode, they debate whether Data, an android, can be seen as a person with the same rights as a human. At the beginning of the argument, I believed that androids don't have any rights, but the person defending Data changed my mind. He convinced me that Data can have rights because he is intelligent, self-aware, and has a consciousness. He proved to me that all these traits can be seen in Data, and made another strong statement: humans give birth to other humans that also have rights, so why can't androids, which are also created by humans, have the same rights? These are two great points made in the show, but I still believe that even though AIs are great, they have the power to take over humans, because they are able to think many steps ahead of us and learn faster than we do. On the other hand, the excerpt from the book tells us that the internet isn't a safe place because of how it was designed and programmed, with algorithms that promote "sexist and racist attitudes openly at work and beyond."

-Tanvir Youhana

 
