
Digital Receipt #3

Posted by Tanvir Youhana on

The show starts off with the crew members playing cards.

Reading facial expressions is key to card games.

From the title alone, I guessed the episode would be about testing what man can accomplish.

Took a break and read the excerpt about algorithms in search engines such as Google.

I believe that one day the computers will take over the world and bring humanity to extinction.

Data creeps me out with his artificial tone, trying to sound like someone without any emotion in their voice.

According to what Data stated, AIs learn from experience, not from books.

I believe that AIs will never have true emotions like us humans.

Rights to AIs?

  • I believe that AIs shouldn’t have any rights because they are not humans and can be unpredictable and can overpower us humans.

Data is seen as property of the fleet and cannot resign because he is a robot, not a human, and doesn’t have any rights.

The attachment that some of these Starfleet officers have to the AI is blinding them from seeing that AIs are not humans and are unpredictable.

Data is a machine that is made by humans, so it doesn’t have any rights that a human does.

The person defending Data makes a good point: humans are made by other humans, yet we can’t just say that those humans have no rights.

It seems that Data has found love as an AI, which surprises me and makes me rethink my first train of thought about AIs.

A person can be seen as someone who has the traits of intelligence, self-awareness, and consciousness.

What I have read from the excerpt showed me that we are feeding racism into the minds of the young through the internet. The internet has become a place where there is no filter on anything. Also, it is not right that the people making programs for search engines have the power to feed sexist and racist attitudes.

-Tanvir Youhana



Sambeg’s Weekly Response#3

Posted by S Raj on

Sambeg Raj Subedi
ENGL 21007-S
Prof. Jesse Rice-Evans
Weekly Assign#3
02/18/2019

“The Measure of a Man,” Star Trek episode 9, was interesting to watch as it mainly focuses on an issue relating humans and technology. Data, a unique character in this episode, is a machine that looks completely human and is constructed in such a way that it is capable of learning, responding, and making decisions. These capabilities drew the attention of Maddox and his friend Riker, who aimed to disassemble the creature and study it in order to recreate more such machines. The episode revolves around a central idea, whether or not to allow machines the freedom to exercise rights, and finally ends up granting them rights, allowing Data to refuse the proposal to disassemble and experiment on him.
Though in this episode a machine was given the full freedom of its rights, I don’t believe that in the real world we should allow machines to make decisions of their own. These machines are operated by software and algorithms, which are developed by humans, so it’s not necessarily true that an operation performed by a machine will be 100% correct and reliable. For example, a newspaper article published in 2016 mentioned that PredPol, an algorithm-driven software program used to predict when and where crimes would take place, was found to be targeting black communities. This shows how software, algorithms, and AI can be sexist and racist. Now the AI system has conquered almost every place. Developers use algorithms in software to silently withdraw personal data and information from the user and use it in an AI system for their own interest. For example, we might sometimes wonder how Facebook is able to display ads matching our interests, such as the shoe, phone, and bag brands we like the most. It’s all because of an AI system that is able to read our minds, and displaying such ads unknowingly leads the user to buy that product.


Kayla’s Week 3 Response

Posted by Kayla Ye on

In season 2, episode 9 of Star Trek: The Next Generation, Captain Picard is faced with an issue regarding the android he considers to be his shipmate, Commander Data. In this episode, Data is forced to give up his body, so to say, in the name of science. While he himself and his fellow crewmates alike do not want to give up this valuable asset to the team, cyberneticist Commander Maddox insists that since Data isn’t sentient, he is not a being and is therefore the property of the ship Enterprise. The episode brings up the question of whether or not artificial intelligence, like Commander Data, deserves rights. This episode, although aired years ago, could come to be more relevant today than it was back then. That is because as society advances, science accelerates even faster, which means that soon, androids like Commander Data could become a reality, no longer science fiction. That too brings up the question: do these “machines” that are capable of acting like humans deserve rights like humans? Are they even to be considered human, or are they a race of their own? In the readings from Safiya Umoja Noble’s Algorithms of Oppression, Noble raises the issue of racism being built into such systems. Due to the fact that these androids have to be programmed by someone, there will be an inevitable source of racism, sexism, or some other type of bias.
