"You can't outrun Death forever.
But you can make the Bastard work for it."
--MAJOR KORGO KORGAR,
"Last of The Lancers"
AFC 32
Educate your self in the Hazards of Fire Breathing STAY SAFE!
Written by: quiet
You say that if UTOR is true, we've never witnessed a program taking place - but that's absurd. Of course we have.
Written by: quiet
Let's suppose that you're right about this: that we've never witnessed a program taking place. In that case, your entire project is scuppered, because your theory takes some basic examples - programs running on hardware - and works up from them to a finished theory. IF your theory is correct, then the premisses on which the theory is built are false. Hence, again by reductio, the argument fails.
Written by: quiet
I'd be interested to hear what you have to say about my syntax vs semantics challenge
Written by: quiet
I think that my objection holds even if you're using examples as steps on a ladder to understanding. Metaphorically, if you've never actually seen those steps, how can you use them to gain understanding? You're taking the example of a program running on a computer, abstracting away from that to form a model of 'the mind as a program', and then denying that you can see programs running in the first place. So when you say 'the mind is a program', what you mean by 'a program' is determined by the computer analogy. But if there's nothing to talk about in the computer example, then the whole analogy falls down. Does that make sense?
Written by: quiet
Here, you've misread me. My point is that 'consciousness' isn't equivalent to 'a program', it's equivalent to 'a running program', or 'an instantiated program'. Why? Because consciousness involves thinking about things, and thinking about things involves transformations: one state turning into another, or causing another to come about, etc. This is completely disanalogous to 'pi as the result of an iterative process . . . '.
I'm NOT saying that programs can't exist without causal activity (although I'm a bit sceptical about that, also). I am saying that you can't run a program without causal activity - at any rate, you can't run programs in 'the timeless unchanging mathematical realm' - AND that, furthermore, the mind isn't a program, it's a RUNNING program.
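The program/process distinction quiet is drawing can be put concretely in a short Python sketch (purely illustrative; the names and the Leibniz-series example are mine, not anything from the thread): the text of a pi-approximating program is a static object that exists whether or not anyone executes it, whereas *running* it is a causal process, one state producing the next, in time.

```python
# The *program*: a static piece of text. As an abstract object it
# "exists" whether or not anything ever executes it.
PI_PROGRAM = """
def approx_pi(n):
    # Leibniz series: pi = 4 * (1 - 1/3 + 1/5 - 1/7 + ...)
    total = 0.0
    for k in range(n):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total
"""

# The *running* program: executing the text is a causal process,
# each step bringing about the next, unfolding in time.
namespace = {}
exec(PI_PROGRAM, namespace)
print(namespace["approx_pi"](100000))  # converges toward 3.14159...
```

On quiet's view, consciousness corresponds to the second half of the sketch, not the first.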
Written by: quiet
Incidentally - and I'll wait for your response to ^^ that post before getting into this in any detail - I *strongly* recommend that you take a look at:
J. Searle's 'Chinese Room' argument, from his 1980 paper 'Minds, Brains, and Programs' [do a google search for it, I'm sure you'll find either the original paper or a decent summary thereof]
There's also an amazing paper by Hilary Putnam which changed my mind about the whole AI business. If you buy externalism about mental content, then his paper is the final nail in the coffin for the Turing Test.
Written by: quiet
Further problems (hang on: I've just remembered another killer point)
1. Programs are simply syntactic: they specify manipulations of symbols, such as A x B = B x A.
2. Syntax is independent of content.
but
3. The mind has content (that is, semantics)
hence
4. The program cannot describe the mind fully.
and therefore
5. The mind cannot be a mere program.
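Premise 1 can be made concrete with a hedged sketch (the token names are arbitrary, which is exactly the point): an inference rule like modus ponens operates purely on the shape of the symbols, and it behaves identically whatever 'p' and 'q' are taken to mean.

```python
def modus_ponens(facts, conditionals):
    """Purely syntactic inference: from 'p' and ('p' -> 'q'), derive 'q'.

    The rule inspects only symbol identity, never meaning.
    """
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in conditionals:
            if antecedent in derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

# Exactly the same manipulation, under two different readings of the tokens:
print(modus_ponens({"p"}, [("p", "q")]))                # derives 'q'
print(modus_ponens({"raining"}, [("raining", "wet")]))  # derives 'wet'
```

Nothing in the function distinguishes the meaningless tokens from the meaningful-looking ones; that is the sense in which the syntax is independent of content.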
Written by: quiet
Granted, it doesn't matter what it's running on: it could be a bit of squishy grey matter, a complex computer, or a bit of Swiss cheese, if you could set the Swiss cheese up appropriately.
Written by: from UTOR site
What is consciousness?
One of the assumptions that the theory uses is that consciousness is basically information/data: that 'mind' is fundamentally a program.
This will be easier to accept for those of you who believe that artificial intelligence is, in theory, possible; similarly for those who believe that mind is solely a function of the material brain.
For those who believe that artificial intelligence is not possible, or that consciousness is not merely a function of the brain, it will be harder to accept.
Written by: quiet
Now I'm *not* claiming that syntax and semantics (that is to say, form and content, or symbols and meaning, etc) are mutually exclusive: the claim is just that syntax isn't sufficient for semantics. Syntax can't result in semantics, since syntax is just a set of rules for manipulating formal symbols (like "p, if p then q, so q"). It doesn't tell you anything about the content of these symbols: 'p' and 'q' could mean pretty much anything. The point here is that the mind isn't just a product of brain states: it's a product of brain states interacting with the world in a certain way. That's how our thoughts get their content.
Written by: quiet
iii) Put another way: the syntax takes an input, and generates an output. But the input and output are symbols, things like 1/0 or 'p' or 'q'. And these symbols could mean anything. Unless you fix what they mean, you don't have meanings - that is, you don't have semantics.
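To make (iii) concrete with a hedged sketch (the transformation and both interpretations are invented for illustration): the same syntactic rule over '1'/'0' strings gets a 'meaning' only from an interpretation imposed from outside, and two incompatible interpretations fit the syntax equally well.

```python
def transform(bits):
    """A purely syntactic rule: flip every symbol in the string."""
    return "".join("1" if b == "0" else "0" for b in bits)

output = transform("1010")  # '0101' -- just marks on a page, so far

# The symbols mean nothing until an interpretation is supplied.
# Two incompatible interpretations fit the same syntax equally well:
as_logic = {"1": True, "0": False}    # reading: truth values
as_lights = {"1": "on", "0": "off"}   # reading: switch states

print([as_logic[b] for b in output])   # [False, True, False, True]
print([as_lights[b] for b in output])  # ['off', 'on', 'off', 'on']
```

The `transform` step is all the program itself does; which of the two readings (if either) is *the* meaning is fixed by nothing in the syntax.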
Written by: onewheeldave
Let's forget the 'running' program; consciousness is not a running program.