1. Post #121
    RAPISTS ARE OPPRESSED
    mobrockers2's Avatar
    April 2011
    12,403 Posts
    Rights are supposed to be founded on moral principles.
    So?

  2. Post #122
    TheFallen(TF2)'s Avatar
    September 2009
    2,020 Posts
    I have a very strange view of things when it comes to Artificial Intelligence. If a form of artificial intelligence displays traits associated with semi-sentience or full sentience, then I say let them have rights.

    Besides, we don't know what self-aware AI will be like in the future; we don't even know if they will have malevolent or benevolent attitudes towards us humans. One thing we also have to consider is what we humans will be like in the future, as it may impact their views of us.

  3. Post #123
    You should also give two tugs of a dead dog's cock about rights in a moral sense. If you are talking about rights in a 'practical' sense then you're not talking about rights.
    If you think morally, we should do things for a reason. If we do have a purpose (if we were designed as part of a kind of 'master plan' instead of being just an accident), what we SHOULD do is fulfil that purpose.
    purpose is an entirely artificial concept, existing only in minds.

    there aren't little particles with "purpose" charge on them, interacting through some kind of "purpose" field.

    Edited:

    Rights are supposed to be founded on moral principles.
    they aren't "supposed" to do anything.

    they're a human invention, we can ascribe whatever purpose to them as we desire.

  4. Post #124
    Gekkosan's Avatar
    October 2010
    5,668 Posts
    Do you not understand what hypothetically means?
    If you mean pointless babble mostly, yes.

  5. Post #125
    GetOutOfBox's Avatar
    April 2009
    847 Posts
    Anything sentient deserves rights. If a computer is made that is sophisticated enough to qualify as being "alive" in the sense that it thinks of its own accord and has feelings (not necessarily in the same way as humans), it deserves to be treated with respect. After all, as humans, we are nothing more than complicated biological machines; our brains are merely vastly complex biological circuits.

  6. Post #126
    Crom's Avatar
    February 2012
    7 Posts
    The mighty question of "is this thing showing complex self-awareness, that is not us, allowed to mingle within our established base of human rights?" is being boiled down. The only reason this might be objected to in any way is that, unlike another biological creature meeting that same criteria (say we gave an alien these rights), the AI is a hand-made being existing without any doubt of where it came from, which we cannot relate to ourselves.

    The boiled-down question remaining for the opposing party, I would think, is "It has no soul, does it matter if it has a choice?"

    We are used to giving animals this excuse: they don't have the same sentience as us, so we prescribed them as having no soul, which we are projecting onto this new thing that we have made. I am fairly confident in this assumption, despite not standing in that position.

  7. Post #127
    Gold Member
    squids_eye's Avatar
    July 2006
    5,765 Posts
    If you mean pointless babble mostly, yes.
    If you don't want to contribute to the debate, don't come in and post.

    In my opinion, if an AI has been programmed to think exactly like a human, then it should be considered a person and given the rights a person deserves.

    Edited:

    Rights are supposed to be founded on moral principles.
    What moral principles restrict a new form of life from receiving rights?

  8. Post #128
    Satansick's Avatar
    September 2009
    2,291 Posts
    Sentient AI is a pretty retarded idea on its own.

  9. Post #129
    matsta's Avatar
    September 2009
    347 Posts
    purpose is an entirely artificial concept, existing only in minds.

    there aren't little particles with "purpose" charge on them, interacting through some kind of "purpose" field.

    Edited:

    Rights are supposed to be founded on moral principles.
    they aren't "supposed" to do anything.

    they're a human invention, we can ascribe whatever purpose to them as we desire.
    I don't get why it's wrong that rights or purposes are concepts. And, of course, if you don't have moral principles, nothing is "supposed to do anything". But the question of the thread is "SHOULD a self-aware/sentient AI have rights".

    Edited:

    Following your criteria, sentience would also be an entirely artificial concept. You won't find 'sentience' particles either. Actually, you won't find anything regarding 'sentience' directly out there.

  10. Post #130
    The Kakistocrat's Avatar
    November 2011
    1,353 Posts
    Why should they have rights? They were created by us, and therefore serve us. They are not living. Also, because they are programmed, their "thoughts" could be altered somewhat easily, making them not really sentient beings.

  11. Post #131
    matsta's Avatar
    September 2009
    347 Posts
    Anything sentient deserves rights. If a computer is made that is sophisticated enough to qualify as being "alive" in the sense that it thinks of its own accord and has feelings (not necessarily in the same way as humans), it deserves to be treated with respect. After all, as humans, we are nothing more than complicated biological machines; our brains are merely vastly complex biological circuits.
    'Sentient AI' doesn't mean 'living AI'. Also, that a machine is sentient doesn't mean it has feelings. The discussion here is about sentient machines, but not necessarily 'living' machines.

  12. Post #132
    Dennab
    July 2009
    7,650 Posts
    This is one of those questions where we will never know until it happens.

  13. Post #133
    matsta's Avatar
    September 2009
    347 Posts
    What moral principles restrict a new form of life from receiving rights?
    What moral principle says that anything sentient deserves rights? Also, as I said before, we're not talking about something 'living' here.

  14. Post #134
    boy i sure do love it when my title doesnt fit
    LuaChobo's Avatar
    December 2009
    6,417 Posts
    If an AI is fully self-aware, it has life. If it knows it's there for scientific/whatever reasons, it would have minimal rights, like the ability to have conversations and so on.
    If it fully believes it is a human, it should have human rights, as you have basically created life and should treat it as a living person.

    Though that's just my opinion.
    Also, never teach them how to use guns; that right is nulled as soon as they are given network control.

  15. Post #135
    HulaDancer's Avatar
    February 2012
    56 Posts
    This is of course a theoretical question, since there hasn't been one yet, but I feel it should have just as many rights as a human, seeing as we are quite similar: the relationship between mind and brain is similar (if not identical) to the relationship between a running program and a computer.


    My main fear about this, though, is that religious people will most likely try to destroy it due to it being "blasphemy".

    If you haven't, you should really watch the first Ghost in the Shell movie; it tries to handle this specific question.
    Cleverbot is blasphemy

  16. Post #136
    Following your criteria, sentience would also be an entirely artificial concept. You won't find 'sentience' particles either. Actually, you won't find anything regarding 'sentience' directly out there.
    yep

    Edited:

    Also, because they are programmed, their "thoughts" could be altered somewhat easily, making them not really sentient beings.
    our thoughts can be altered easily

  17. Post #137
    sgtshock's Avatar
    March 2010
    268 Posts
    Before we determine whether or not an AI deserves rights, we need to figure out if it's actually conscious. As far as I know, scientists only have theories on what consciousness/self-awareness actually is, so we can't know for sure the difference between a truly self-aware being and a very clever, very convincing, yet not actually conscious machine.

  18. Post #138
    we can't know for sure the difference between a truly self-aware being and a very clever, very convincing, yet not actually conscious machine.
    you need to demonstrate that there actually is a difference first

  19. Post #139
    Gold Member
    Splarg!'s Avatar
    September 2005
    2,429 Posts
    If we ever make very complex AIs that are self aware and have qualia/experiences, they would have to have rights.

    If we had humanoid robots walking around that were not aware, there would be some complicated laws regarding how to treat them properly, since they are someone's property, and that needs to be respected.

  20. Post #140
    matsta's Avatar
    September 2009
    347 Posts
    Oh. I get it. You are one of those guys who believes their entire life is an illusion. What do you even have to say in a debate like this?

    Edited:

    If we ever make very complex AIs that are self aware and have qualia/experiences, they would have to have rights.

    If we had humanoid robots walking around that were not aware, there would be some complicated laws regarding how to treat them properly, since they are someone's property, and that needs to be respected.
    Yes, but how would you know whether the AI has qualia or not? If we think about the development of a sentient machine, there would be a moment when we seriously would not know whether the machine is sentient or not.

    And even if we did know that the machine experiences qualia, we wouldn't know how the machine feels if it isn't human. How do you attempt to know a particular feeling you have never felt? Rights can't be made based on completely uncertain suppositions.

  21. Post #141
    Gold Member
    Splarg!'s Avatar
    September 2005
    2,429 Posts
    Yes, but how would you know whether the AI has qualia or not? If we think about the development of a sentient machine, there would be a moment when we seriously would not know whether the machine is sentient or not.

    And even if we did know that the machine experiences qualia, we wouldn't know how the machine feels if it isn't human. How do you attempt to know a particular feeling you have never felt? Rights can't be made based on completely uncertain suppositions.
    I didn't say if or how we'd be able to, but that seems like where you'd draw the line.

    Also, I'm not quite sure what you're saying in the second bit.

  22. Post #142
    matsta's Avatar
    September 2009
    347 Posts
    I didn't say if or how we'd be able to, but that seems like where you'd draw the line.

    Also, I'm not quite sure what you're saying in the second bit.
    I'm saying that we can't know how some A.I. robot experiences qualia ("what do they feel") if its brain and body structure is different from ours.

  23. Post #143
    ShadowSocks8's Avatar
    November 2007
    2,462 Posts
    If something is capable of begging for its life, not for the sake of functionality, but for the sake of being alive, then I consider it sentient and to be protected similar to humans.

  24. Post #144
    matsta's Avatar
    September 2009
    347 Posts
    If something is capable of begging for its life, not for the sake of functionality, but for the sake of being alive, then I consider it sentient and to be protected similar to humans.
    Then a whole lot of living beings would have to be protected in a similar way to humans, yet they aren't, for a reason.

    This isn't going anywhere. Most of the arguments I see here presuppose that if something is sentient then it deserves rights, but give little definition of what they understand as sentience.

    One could hardly think of some animals as not being sentient, yet we do not treat them as humans. And even if they interact with society in many ways and have rights (in some countries), they are seen as 'something a human could use for their purposes' and not as an end.

    I already argued about why I think human rights exist. There IS a difference between humans and other sentient beings (until now): humans are purposeless. That means that humans don't act one way or another, or define themselves, (just) according to a pre-established nature; they don't "have to". Like every other living being, they are born, they reproduce, they eat, they (normally) preserve themselves, etc., but they are not defined by that (as every other living being, including sentient ones, is).

  25. Post #145
    Oh. I get it. You are one of those guys who believes their entire life is an illusion. What do you even have to say in a debate like this?
    no no no

    I meant that consciousness isn't a fundamental property, not that it doesn't exist.

  26. Post #146
    Gold Member
    lil timmy's Avatar
    February 2006
    2,291 Posts
    you have no idea how hard it is not to pepper this thread with HAL 9000 dialogue clips.

    Personally, assuming that they do have what we would define as full or partial sentience, then yes, artificial intelligences should be afforded rights to some degree/the same degree as humans depending on the situation, but then again...
    it's a complicated issue I doubt we'll be able to answer until the advent of what we're talking about.

  27. Post #147
    matsta's Avatar
    September 2009
    347 Posts
    no no no

    I meant that consciousness isn't a fundamental property, not that it doesn't exist.
    A fundamental property of what?

  28. Post #148
    A fundamental property of what?
    the Universe

    eg quarks are fundamental things while atoms are not

  29. Post #149
    mustachio's Avatar
    March 2010
    50 Posts
    It really depends on what level of intelligence they have. You could give the two smartest people on the planet more rights than your average slobby teen and have no problems. I mean, would you let the fry cook from Burger King run the mission to Mars?

  30. Post #150
    Gold Member
    Nikita's Avatar
    April 2005
    1,927 Posts
    It really depends on what level of intelligence they have. You could give the two smartest people on the planet more rights than your average slobby teen and have no problems. I mean, would you let the fry cook from Burger King run the mission to Mars?
    A fry cook is perfect for the mission to Mars, because he'd die a fiery death that would show the smart people what to do better.

  31. Post #151
    Gold Member
    Archonos 2's Avatar
    June 2005
    1,582 Posts
    -snip-

    I'm a week behind on this one, sorry.