Clusterheadaches.com Message Board (http://www.clusterheadaches.com/cgi-bin/yabb/YaBB.cgi)
New Message Board Archives >> 2005 General Board Posts >> Just for thought...
(Message started by: Hirvimaki on Feb 23rd, 2005, 12:48pm)

Title: Just for thought...
Post by Hirvimaki on Feb 23rd, 2005, 12:48pm
Just for thought:

http://www.bmezine.com/news/pubring/20030816.html#beep

Hirvimaki-Isi

PS: I disagree with the author. If you like this sort of thing (I know most people won't even make it through the article...), you might consider John Searle's Chinese Room Argument. And for those that make it through enough of the article to be freaked out but don't wish to abuse themselves any further by trying to read the Chinese Room Argument or the counter arguments, Searle simplified it thus:

1. Programs are purely formal (syntactic).
2. Human minds have mental contents (semantics).
3. Syntax by itself is neither constitutive of, nor sufficient for, semantic content.
4. Therefore, programs by themselves are not constitutive of nor sufficient for minds.

While there are shades of intelligence, there aren't shades of sentience. You either are or you aren't. A person is no longer sentient when he/she can no longer conceive of his/her own existence. If a computer could conceive of its own existence, then it could be considered sentient. But this is a far leap from intelligence.
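
If a toy example helps make the syntax/semantics point concrete, here is a little sketch in Python (my own illustration, not Searle's): a "room" that hands back canned replies purely by matching the incoming symbols against a rulebook. The replies can look sensible, yet nothing in the program understands a word of them.

Code:
# A toy "Chinese Room": replies come from matching symbol strings
# against a rulebook -- pure syntax, with no understanding anywhere.
RULEBOOK = {
    "how are you?": "I am fine, thank you.",
    "what is your name?": "My name is Room.",
    "do you understand me?": "Of course I understand you.",
}

def room_reply(symbols):
    # Normalize the incoming symbols and look up the prescribed reply.
    return RULEBOOK.get(symbols.strip().lower(), "Please rephrase the question.")

for question in ("How are you?", "Do you understand me?"):
    print(question, "->", room_reply(question))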

Title: Re: Just for thought...
Post by nani on Feb 23rd, 2005, 12:59pm
[smiley=gocrazy.gif]

Trying to read that stuff gives me a headache.  ;;D

Some of you guys are just too smart.   :)
Thanks for explaining it.

Title: Re: Just for thought...
Post by Langa on Feb 23rd, 2005, 1:00pm

on 02/23/05 at 12:48:55, Hirvimaki wrote:
Just for thought:

http://www.bmezine.com/news/pubring/20030816.html#beep

Hirvimaki-Isi

PS: I disagree with the author. If you like this sort of thing (I know most people won't even make it through the article...), you might consider John Searle's Chinese Room Argument. And for those that make it through enough of the article to be freaked out but don't wish to abuse themselves any further by trying to read the Chinese Room Argument or the counter arguments, Searle simplified it thus:

1. Programs are purely formal (syntactic).
2. Human minds have mental contents (semantics).
3. Syntax by itself is neither constitutive of, nor sufficient for, semantic content.
4. Therefore, programs by themselves are not constitutive of nor sufficient for minds.

While there are shades of intelligence, there aren't shades of sentience. You either are or you aren't. A person is no longer sentient when he/she can no longer conceive of his/her own existence. If a computer could conceive of its own existence, then it could be considered sentient. But this is a far leap from intelligence.



I'm going to have to look into this when I get home Hirvi...

You know you brilliant people think we're all the same... ;;D

Langa

Title: Re: Just for thought...
Post by Hirvimaki on Feb 23rd, 2005, 1:07pm
FYI, Moore's Law is misstated. Moore's Law originally stated that computing power doubles every 12 months, not every two years. It was revised in the early 1970s to 18 months, and it has remained true through today.
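
Just to put rough numbers on the difference those periods make (a quick back-of-the-envelope sketch in Python, purely illustrative):

Code:
# Growth factor after 10 years under different doubling periods (months).
years = 10
for months_per_doubling in (12, 18, 24):
    doublings = years * 12 / months_per_doubling
    print("doubling every %d months: about %.0fx after %d years"
          % (months_per_doubling, 2 ** doublings, years))

Every 12 months works out to roughly a 1000x gain over the decade, every 18 months to about 100x, and every 24 months to only 32x, so the stated period matters a great deal.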

Hirvimaki-Isi

Title: Re: Just for thought...
Post by Langa on Feb 23rd, 2005, 1:10pm

on 02/23/05 at 13:07:56, Hirvimaki wrote:
FYI, Moore's Law is misstated. Moore's Law originally stated that computing power doubles every 12 months, not every two years. It was revised in the early 1970s to 18 months, and it has remained true through today.

Hirvimaki-Isi


Can't you just read a romance novel like the rest of us?  ;)

Seriously though, i'm intrigued.

Langa

Title: Re: Just for thought...
Post by Linda_Howell on Feb 23rd, 2005, 2:03pm


I believe that my headaches have fried what was left of my brain.   I couldn't even get through your simplified version.    :-/


Linda

Title: Re: Just for thought...
Post by LeLimey on Feb 23rd, 2005, 2:08pm
WHOOOOOOOOOOOOSH!!!



That was the sound of your post going right over my head, Hirv!
I was starting to feel clever too, 'cos I worked out how to match the + and - on the batteries to the symbols on my camera....

Hope you are happy now!!!  [smiley=laugh.gif]

Title: Re: Just for thought...
Post by ozzy on Feb 23rd, 2005, 2:29pm
Funny,

So in order to avoid an "I, Robot ---> Matrix" scenario, we should tattoo, pierce, have sex and add a dose of "Dark Angel" (humans with animal DNA)....that is the way to prevent our extinction.....


LOL


Ozzy


Title: Re: Just for thought...
Post by john_d on Feb 23rd, 2005, 2:42pm

on 02/23/05 at 12:48:55, Hirvimaki wrote:
Just for thought:

http://www.bmezine.com/news/pubring/20030816.html#beep

Hirvimaki-Isi

PS: I disagree with the author. If you like this sort of thing (I know most people won't even make it through the article...), you might consider John Searle's Chinese Room Argument. And for those that make it through enough of the article to be freaked out but don't wish to abuse themselves any further by trying to read the Chinese Room Argument or the counter arguments, Searle simplified it thus:

1. Programs are purely formal (syntactic).
2. Human minds have mental contents (semantics).
3. Syntax by itself is neither constitutive of, nor sufficient for, semantic content.
4. Therefore, programs by themselves are not constitutive of nor sufficient for minds.

While there are shades of intelligence, there aren't shades of sentience. You either are or you aren't. A person is no longer sentient when he/she can no longer conceive of his/her own existence. If a computer could conceive of its own existence, then it could be considered sentient. But this is a far leap from intelligence.


Your assumption that there are no shades of sentience is arguable. Of course, it's philosophy, so it is all arguable. But buying that assumption, maybe the leap into sentience would be the catalyst into intelligence. Small donation of food for these thoughts.

Title: Re: Just for thought...
Post by Hirvimaki on Feb 23rd, 2005, 2:46pm

on 02/23/05 at 13:10:56, Langa wrote:
Can't you just read a romance novel like the rest of us?

Er, you mean those "books" with words like tumescent and bosom and that feature characters with names like Reginald and Clarissa? No, thank you.

Give me pr0n or give me a good essay on the contrast between intrinsic intentionality and derived intentionality... ;)

A.N. Roquelaure excepted, of course.

Hirvimaki-Isi

Title: Re: Just for thought...
Post by Hirvimaki on Feb 23rd, 2005, 2:47pm

on 02/23/05 at 14:29:34, ozzy wrote:
Funny,

So in order to avoid an "I, Robot ---> Matrix" scenario, we should tattoo, pierce, have sex and add a dose of "Dark Angel" (humans with animal DNA)....that is the way to prevent our extinction.....


LOL


Ozzy

Hilarious, no?

Hirvimaki-Isi

Title: Re: Just for thought...
Post by sandie99 on Feb 24th, 2005, 1:20am
Too early in the morning for me... I'll get back at ya later. It was interesting, though. And a bit, hmmm, scary.

Title: Re: Just for thought...
Post by Gator on Feb 24th, 2005, 5:30pm
Interesting read.  Sounds like a prequel to the Terminator movies.  I can see where some of his fears may be justified.  I've wondered myself how long it would take for the first computer to see itself as alive.  Computers are being programmed to think in abstract terms, to draw subjective conclusions rather than simple logical selections.  What do we do when computers start asking the same questions that puzzle man?  Who am I?  Why am I here?  What is the meaning of life?  Once this happens, would it be  morally acceptable to pull the plug?

I liked this question.


Quote:
But we're like a car driving at a hundred and fifty miles an hour in the dark without headlights — don't you think it might be a good idea to put on our seatbelts?


At this speed, literally or figuratively, seatbelts wouldn't make a damn bit of difference.


Title: Re: Just for thought...
Post by Tiannia on Feb 24th, 2005, 6:17pm

on 02/24/05 at 17:30:01, Gator wrote:
At this speed, literally or figuratively, seatbelts wouldn't make a damn bit of difference.


I am rather proud of myself; I have not read something that in-depth in many years. But I have to agree with Gator. There is a change that we are headed toward. There is nothing that says this is the outcome we are going to hit, but there is always that chance. Now, do we have the right to "pull the plug"? No... but then we are looking at the human race creating a life form. Sci-fi'ers will jump to a Battlestar Galactica future, where our creation turns on us and tries to exterminate us. But what justifies life, and what is truly alive? We do not believe that many animals can think abstractly, yet we don't say that they are not alive..... Interesting thoughts.

Title: Re: Just for thought...
Post by notseinfeld on Feb 25th, 2005, 4:40pm

Quote:
FYI, Moore's Law is misstated. Moore's Law originally stated that computing power doubles every 12 months, not every two years. It was revised in the early 1970s to 18 months, and it has remained true through today.

Hirvimaki-Isi


Since the advent of the internet (post-BBS), the subculture geek squad were forever quoting the Jumping Jesus theory. Cutting through all the crap, essentially they meant that from the Industrial Revolution plus 50 years, information doubled twice, then twice again in half the time, until finally, by the late '80s, information was literally tripping over itself until such time as implosion became inevitable. Interestingly, I do not see such guys around now, 20 years after the information overload should have crippled us all.
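
For what it's worth, the arithmetic behind that kind of claim is easy to sketch (Python below, with a 50-year first interval taken purely for illustration): if each doubling takes half as long as the one before, the doubling dates pile up against a hard limit, since the geometric series 50 + 25 + 12.5 + ... sums to 100, which is presumably the promised "implosion".

Code:
# Doubling intervals that halve each time converge on a finite horizon.
first_interval = 50.0  # years for the first doubling (illustrative figure)
elapsed, interval = 0.0, first_interval
for n in range(1, 11):
    elapsed += interval
    print("doubling #%2d at year %7.3f" % (n, elapsed))
    interval /= 2
print("limit: %.0f years" % (2 * first_interval))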

Speaking of things, I don't see much about that group that hooked on to that star in their spaceship after killing themselves. Anyone?

Title: Re: Just for thought...
Post by Charlie on Feb 25th, 2005, 9:25pm
http://www.netsync.net/users/charlies/gifs/Clearly End SIGN.png

Charlie: http://www.netsync.net/users/charlies/gifs/spacedog.gif


