Morality of Clones

mamboking053

Well-Known Member
While autoing through some battles, I realized that if human drone technology (biological drones) were ever created, this would likely be one of its most used applications: they would probably be sent into war in a similar fashion, if not set to cutting rocks on Mars or something.

Given some of the debate surrounding abortion, I think this hits the same buttons. What is considered "living," and is it moral to subject a living being to an opportunist or utilitarian cultural or sub-cultural philosophy? (i.e., would it be moral to use clones for warfare, or to create clones at all as commodities and tools rather than living beings?)
 

Super Catanian

Well-Known Member
There is a video on YouTube made by Kurzgesagt that somewhat covers that subject.

Personally, I believe that something is considered "living" when it has the full potential to gain consciousness. I mention this to include unborn babies. Although (in my eyes, anyway) they can't think freely, as they are just a cluster of cells, they have the potential to gain consciousness once they are born and learn more about their world. Robots can't do that, since they are merely programmed with knowledge but cannot think freely by themselves, and probably never will. That is what separates the living from artificial intelligence.
If human clones were to be created, they probably would have consciousness, since they are exact carbon copies of a regular human being.
 

mamboking053

Well-Known Member
This is my personal opinion...

But I don't think we think freely, either. I think the only real difference between current robots and humans is that our neurological circuitry has millions of years of evolution on its side. Robots are programmed with a limited number of logical pathways. Humans are too, but we have many, many more of them, and our greater range of options may give the illusion of free choice rather than simply more choices.

For instance... imagine a robot that is only programmed to move north, south, east, or west. Tell it to follow a rabbit, and no matter how fast the robot can go, it will have a difficult time, because the rabbit has more options of movement.

Reprogram that robot so it can also travel north-east, north-west, south-east, and south-west along with the original cardinal directions, and it has an easier time. It now has more options, and its movements are less clunky.
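The four-versus-eight-direction contrast above can actually be made concrete with a tiny sketch. On a grid where each move changes a coordinate by one step, a four-direction robot needs Manhattan distance many moves to reach a target, while an eight-direction robot only needs Chebyshev distance many. (The target offset here is a made-up example, not anything from the thread.)

```python
# Illustrative sketch: moves a grid robot needs to reach a target offset,
# comparing 4 cardinal directions vs. 8 directions (cardinals + diagonals).

def steps_4_directions(dx, dy):
    # N/S/E/W only: each move changes exactly one coordinate by 1,
    # so the move count is the Manhattan distance.
    return abs(dx) + abs(dy)

def steps_8_directions(dx, dy):
    # Diagonals allowed: each move can change both coordinates at once,
    # so the move count drops to the Chebyshev distance.
    return max(abs(dx), abs(dy))

if __name__ == "__main__":
    dx, dy = 5, 3  # hypothetical offset of the rabbit from the robot
    print(steps_4_directions(dx, dy))  # 8 moves
    print(steps_8_directions(dx, dy))  # 5 moves
```

More directions never make the robot slower, which is the "more options, less clunky" point in a nutshell.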

Similarly, the computers of today can "think" far better than their most distant ancestors. Today's computers are approaching a resemblance of the fluidity of actual biological autonomy, and the more options we program into them, the closer they will get. From my perspective, I ask myself the question: "Were mankind, animals, and all their associated relatives and ancestors always autonomous, or was there a point where they, too, were in a state that would be considered non-organic, then became organic, and then became autonomous? If so, how? And if so, couldn't this process be repeated, especially when humans are effectively expediting the process of evolution by intentionally trying to create things that are autonomous?"
 

mamboking053

Well-Known Member
How about we never create stuff like that, so we never have to confront the answers to the questions you ask. This is a perfect example of "Just because you can, doesn't mean you should." But... "As in the days of Noah..."

Unfortunately, morality has never gotten in the way of innovation before and I doubt it ever will.

"If it can be done it will be done".
 

DeletedUser37617

Yep... Unless they can make more money by not inventing something, that is!
 

DeletedUser37011

Inventions are not always a matter of making money. Governments spent tons of money to invent nuclear weapons. Companies later made tons of money building them, but the first were built without any profit motive in mind. And these are terrible things. As stated earlier, if it can be built, we will build it. This has always happened in the past and will continue in the future unless we go extinct.

As for morality, it is just a human construct. It sounds good, but what is considered moral is very fluid. It used to be moral to burn witches. It was considered immoral to make the masses suffer a drought and famine rather than sacrifice a virgin. Some people consider murder immoral. I disagree, as I was in the military. It does not matter if we murder. Nothing matters. But we do have wants, and to satisfy our wants, we often need community. And this is where law and order are needed.

Most people want their children to have a better life than theirs. We want the next generation to have more food, better housing, etc. And we are social animals that can work as a community toward this direction. This community requires us to be social. Murder is the ultimate social faux pas and can dissolve a community. It only makes sense to outlaw such actions for the sake of the community.

As for clones, the question is not whether they are moral. A better question would be to ask whether they would be beneficial to our community. Hopefully, we will not bring about a Brave New World setting with Alpha, Beta, Gamma, and Delta clones. That may be impossible, as our ability to work in a community depends on both our genetic structure and our environmental experiences. Clones are exact genetic duplicates, yet if they are reared in different ways, they will develop different personalities and traits. Thus, it may be impossible to make the working-class clones of Brave New World.

We are a long way from cloning ourselves. Of more immediate concern are our genetic engineering capabilities. That is happening right now, as is the fast-approaching prospect of artificial general intelligence. We already have specialized AI; generalized AI would most likely become aware. Both of these technologies could be disastrous to our community, or they could be the best thing to ever happen. How do we make sure they follow the latter path?
 