Reem’s Week #6 response

This episode of Star Trek is about a human-like android, Lal, who was created by Data, a fellow android. Data regards the android as his own child, and Lal even calls Data father. Lal begins learning human behavior by observing the crew on the ship. Eventually, a malfunction develops in Lal's brain and causes the android to fail. Data discovers that he cannot save Lal, so he deactivates the android. Before this, Lal had been developing more human qualities by expressing the love she felt for her father, Data. This raises the question of whether it was ethical to deactivate Lal.

Creating an android like Lal could be extremely hazardous, and it raises the question of whether it is ethical to make such a creation at all. I am studying electrical engineering, and one part of the IEEE code of ethics is to not create anything that would be harmful to society. Although this android was not a danger to society, one day someone could make an AI android that could take over the world. I believe that as long as you know what you are doing and have the experience, it is ethical to create an android like Lal. Even though Data was not able to save Lal, the creation of Lal is important for the advancement of technology.

It could be said that Lal is the progeny of Data, since he was the one who made Lal. It could also be said that a child lives with the parent until they can fend for themselves, and the parent teaches them everything about life and how the world works. However, the commander does not see it that way. He is trying to break this bond between parent and child, and I believe that is not ethical. He has no right to separate them, since they are connected, and he of all people should be understanding on this subject because he is a father himself. If androids were ever able to replicate themselves, the world might end up with a new population of a different kind.
