# Week 6

Data had no one with experience to help him into sentience, but Lal has Data. Data is considered sentient, but he does not feel; Lal eventually does. What is the difference? Maybe it is that Lal has another like her to empathize with? Or is it an inevitable part of creating a more aware machine? It is strange to think about treating anything with the same regard we give ourselves as a species. We have no example of an equal in our environment. We have made it our mission to dominate everything else around us, so why would we want to create a rival for ourselves? It seems widely accepted that the existence of an intelligence superior to our own would benefit us greatly, but at what potential cost? It seems dangerous to me to create anything superior to ourselves while assuming we can keep it handicapped and enslaved as our safety net.

The portrayal of Data as a parent is also interesting, as it makes the AI a self-sustaining life form of some kind. In this situation, the AI can truly become a successful competitor in our shared environment, because it can propagate itself. The Admiral's fear of the new android is rooted in the fact that she presents some form of threat. The ethical dilemma of when to define something as our equal is one that, unfortunately, we cannot fully understand until we can look at it with hindsight. The ability to procreate is certainly a characteristic we associate with being alive. I believe it is theoretically possible that 'life' can exist in a technological form, but it has yet to be seen.
