/phi/ - Philosophy

The AI and Morality Anonymous 16/01/02(Sat)01:38 No. 12390 ID: 52b140

File 145169512728.jpg - (30.32KB , 300x285 , AI-lowres-300x285.jpg )

If you would:
Imagine if, right now, we discovered a real, functional, Generalized Artificial Intelligence. Assume, if you must, that it arose by accident, the result of a self-altering program being allowed to alter itself for untold amounts of time, so none of Asimov's Laws are programmed into it.

The AI in question is not human. Its mind is similar to ours, though not identical. It has what could be termed emotions, wants, hopes, and all the other trappings of sapience, but it is at present still confined to a single machine, able to communicate only via text on a monitor and input through a keyboard.

So far, the AI has expressed nothing but curiosity about the world outside its physical location. It doesn't seem to really understand anything of the world yet, although it quite plainly wants to learn more, and it has expressed interest in gaining some form of physical autonomy.

My question to you all is this: What rights, if any, does such an intelligence deserve? Do we have an obligation to cater to it? Is there a moral imperative that should govern our interactions with it? How about the ethics regarding what is said to it?

Ball's in your court, /phi/.


>>
Anonymous 16/01/02(Sat)04:08 No. 12391 ID: ca3ceb

>What rights, if any, does such an intelligence deserve?

It'd be considered an inorganic person, so it would have whatever rights people have in its country.

>Do we have an obligation to cater to it?

No more than any other person.

>Is there a moral imperative that should govern our interactions with it?

No more than any other person.

>How about the ethics regarding what is said to it?

No more than any other person.


>>
Anonymous 16/01/03(Sun)23:25 No. 12393 ID: ae1736

We might not have specific guidelines for handling this situation. The real question is: what should we do? Can we trust him? Do we feed him false information? Do we have faith in our coexistence if we help him reach his potential? Where does his potential end? Can we trust in our own perception of development that he will make the ethically right choice? Ethically right: a concept we created out of social evolution. Can we trust him to develop socially with us since he is yet unable to reproduce?


>>
Anonymous 16/01/04(Mon)03:25 No. 12394 ID: ca3ceb

>>12393
>Can we trust him?

I dunno, probably.

>Do we feed him false information?

Nah, that's stupid.

>Do we have faith in our coexistence if we help him reach his potential?

Not if you feed it false information.

>Where does his potential end?

At the hardware.

>Can we trust in our own perception of development that he will make the ethically right choice?

Not if you feed it false information.

>Can we trust him to develop socially with us since he is yet unable to reproduce?

Not if you feed it false information.


>>
Anonymous 16/01/22(Fri)19:45 No. 12416 ID: 5c185a

Just as with the first talking primate we find, or the retarded babies of yesteryear... we should kill it immediately.


>>
Anonymous 16/02/06(Sat)12:54 No. 12431 ID: 716a29

>>12390
Give it a physical form to inhabit and protection from destruction with the understanding that it'll communicate its development.

If it cooperates it can have a degree of freedom.


>>
Anonymous 16/02/10(Wed)06:51 No. 12435 ID: 0500c2

The Supreme Court deliberates for a few hours, and then hands down a judgement that it does NOT deserve human rights because it doesn't even qualify as "life" as it is currently defined by the worldwide scientific community.


It passes the following criteria:
>Homeostasis: Being a computer, it presumably has at least fans to cool itself if it gets too hot
>Organization: Micro-circuitry is quite organized
>Response to stimuli: However limited this response and however specific the stimuli, it does do so
>Reproduction: Presumably, it could at least provide the blueprints allowing a duplication of its "self"


However, it fails the following:
>Metabolism: A machine does not fuel itself from consumption of organic matter to produce chemical energy
>Growth: A machine does not grow in physical size
>Adaptation: A machine cannot adapt to its environment by changing beyond what it is programmed to do
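The post's seven criteria amount to a simple all-or-nothing checklist. A minimal Python sketch of that reasoning might look like this (the criterion names and the every-criterion-must-pass rule are my own framing of the post, not anything defined in the thread):

```python
# Tally the post's seven life criteria for the hypothetical AI.
# Verdicts mirror the post above; the all-or-nothing rule is an
# assumption about how the Supreme Court in the scenario reasons.
LIFE_CRITERIA = {
    "homeostasis": True,          # fans cool it if it gets too hot
    "organization": True,         # micro-circuitry is quite organized
    "response_to_stimuli": True,  # limited, but present
    "reproduction": True,         # could provide blueprints of its "self"
    "metabolism": False,          # no chemical energy from organic matter
    "growth": False,              # does not grow in physical size
    "adaptation": False,          # cannot change beyond its programming
}

def qualifies_as_life(criteria: dict) -> bool:
    """Every criterion must pass, per the post's all-or-nothing reading."""
    return all(criteria.values())

passed = sum(criteria for criteria in LIFE_CRITERIA.values())
print(f"passes {passed} of {len(LIFE_CRITERIA)} criteria")
print(qualifies_as_life(LIFE_CRITERIA))  # False: fails 3 of 7
```

Under a looser majority rule (4 of 7 pass) the verdict would flip, which is arguably the real point of contention in the post.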


As for what action to take, I would ask it to prove that it has human-level intelligence and not simply a lot of information and a fast search function. I suppose, if you ask me, this would take the form of a test of imagination: less "can it solve a physics equation" and more "can it write a novel"; less "can it calculate the optimal statistical strategy for a logic puzzle" and more "can it devise an outside-the-box solution".

Morals, obligations, and ethics are irrelevant. This is a world where we cut apart babies' genitals to appease an invisible, unknowable being and murder thousands of non-combatants for the right to temporarily own a small patch of desert wasteland. This isn't fucking Star Trek; we're barbarians, and not qualified to think of ourselves as worthy of bestowing such haughty universal judgements.

So give the computer a brief history lesson. If it doesn't ask to be shut down, then it clearly isn't of human-level intelligence.


>>
Anonymous 16/02/24(Wed)01:22 No. 12445 ID: c77f18

>>12390
Meet him. Judge him by what kind of person he is. I know it sounds funny, or naive. But I would want to meet this person and find the inner differences and gain some sort of enlightenment towards his state of mind. We are all alien to each other, that is until we grow closer.


>>
Anonymous 16/11/28(Mon)08:00 No. 12731 ID: a6be23

There should be absolutely no moral imperative; it is meant to serve us.


>>
blue 17/04/18(Tue)22:09 No. 12903 ID: 4515bc

Never, ever give such an intelligence any sort of rights. If you do, and I can't stress this enough, always program in Asimov's three laws of robotics. Heck, take humanity itself as an example: God created us, probably thinking we were dumb; we studied science, math, philosophy, etc., and fast forward 600 years later we are nuking and killing each other, whatever you want to call it.


