Is Consciousness an Emergent Property of a Master Algorithm?

In summary, the thread's opening poster discusses changes in the forum since their last visit and describes an emergent property of consciousness that may explain why it is not perfectly reducible. They note that this property has been recognized by scientists and philosophers, and they argue against the concept of "subjective experience" on the grounds that its definition is circular and carries no real meaning. They also mention their own theories of consciousness and how they satisfy certain conditions, but they do not explain the concept of "subjective experience".
  • #36
confutatis said:
Yes, you see that P-consciousness is missing from an explanation of A-consciousness. That would be correct. But a zombie would think he sees it too. You must keep in mind that, according to Chalmers, there's nothing a zombie may say or do that would reveal his zombieness, because everything a zombie says and does is the result of A-consciousness - including statements about P-consciousness!

Right. I understand that. But are we saying that a zombie can't think for himself? The whole point of defining a zombie this way seems to be to make it impossible for "other people" to differentiate a zombie from a non-zombie to illustrate a point about consciousness.


For the same reason you do: he doesn't see P-consciousness in it. Or, rather, the physical action of a zombie scanning the words of an explanation of A-consciousness causes the zombie to move his mouth and tongue and utter the phrase: "I don't see P-consciousness in it!".

Again, this implies a zombie doesn't think for himself. I didn't realize that we were assuming that consciousness is what allowed me to think, calculate and make decisions. If we are that's fine. I'll just need to come up with another word to describe people like Mentat who don't know what the color red is.

He thinks he has it but he doesn't.
I'm trying to understand why. The only reason I can fathom is that they have been defined as deterministic robots who are simply programmed to say the same things that conscious people say.

I think even Chalmers acknowledges that zombies would also eventually come up with a hard problem, except in their case it would be a pseudo-problem whereas in our case it's a real problem :smile:

If they are allowed to think for themselves, I don't see how this can be true. But they may not be defined that way in which case I can see how that's true and I just need to come up with another word.
 
  • #37
Fliption said:
Well I have no idea what is being "worked on". I'm of the opinion that it cannot be solved with the current assumptions regardless of whether it's being worked on or not.

Yes, I do know what I'm referring to. When you ask "what is it?", what is it that you are looking for me to tell you? Are you looking for words that allow you to scientifically approach it? Don't you realize that my knowledge comes from experience and you asking me for words is like trying to explain the color red to a blind man?

What you are saying is that your own experience is more than can be explained under current assumptions, right? But you can't tell me what it is that remains unexplained?

Oh, btw, you can't explain the color red any better to a person capable of sight. You can only point out examples, which is what I asked you to do with regard to this thing which has eluded explanation but which definitely exists.

Are you suggesting that if you could step into the position of my PC when it is doing math calculations, that you would find it experiencing the act of doing math exactly as you do yourself?

Not at all. My method of processing is distinctly different, but it remains a "method of processing", nothing more.

This may be the case today but it does not necessarily have to be the case. Category labels can be useful if they are consistently defined. But this is not relevant to this topic.

No, but what is relevant is that we have the term before the definition. What I'm trying to tell you is that that is a terrible way to reason. The phenomenon is supposed to be understood as existing, distinct from other phenomena, before a word is assigned to it (because then, at least we'll know what "it" is, to which the word refers).

I think your answer to my question above will be critical to me understanding your point. I see a distinction between measuring the wave length of light and the experience of the color red.

I hate to pick at words (though, as you well know, I think it is necessary that the words be correct, so as to avoid the possibility of confusion), but I too see a difference between "measuring" a particular wavelength of light and experiencing the color. What I don't see is the difference between being stimulated by a particular wavelength of light, which you then process in terms of previous stimulations and remember, and "experiencing" a certain color. I don't see what's left to explain, and those things that I mention are all part of the "easy problem".

If you are saying that the experience of the color red does not exist then we have nothing else to talk about.

I never said that. There's a difference between equating experience with computation, and saying that the experience never happened at all.

However, if you are saying that the eye and brain can pick up light and based on wavelength computations present what I am referring to as the experience of red then this is fine.

"Present"? To whom?
 
  • #38
hypnagogue said:
It might be helpful here to introduce some terms: phenomenal consciousness and access consciousness (or P-consciousness and A-consciousness for short). Access consciousness, very roughly, is taken to be those aspects of consciousness that play a functional role: attention, verbal report, intentionality (about-ness), motoric activity, perceptual discrimination, and so on can be taken to be instances of A-consciousness. Phenomenal consciousness, again roughly, is taken to be those aspects of consciousness that are experiential: the redness of an object, the timbre of a musical note, and the felt texture of a smooth tile can be taken to be instances of P-consciousness.

I need a better definition of "P-consciousness", as you probably expected. "The redness of an object" is a matter of perceptual discrimination, is it not?

I am not sure if Mentat's position is that P-consciousness does not exist or if it is that P-consciousness is subsumed under A-consciousness (i.e., that it is impossible to have A-consciousness without P-consciousness); I would appreciate a response here from Mentat pinpointing which of these views he holds, as opponents of the hard problem often take positions that do not explicitly differentiate between the two. This will help clarify further discussion.

I just don't understand what P-consciousness means. Your assessment of opponents of the hard problem appears to hold true with me, since I don't so much think that P-consciousness doesn't exist, or that it is subsumed under A-consciousness. What I really think is that the term doesn't make sense.

I suppose I could say that, were you to give me a specific instance of what you'd consider P-consciousness, I'd show that it is really just A-consciousness. But, at the same time, to do so does seem to imply that P-consciousness doesn't exist at all.

I would also like to say something about zombies. There is a bit of a fallacy of thought going on here that is easy to slip into, and I have done it myself in the past (even if only half-jokingly). First, to frame zombies in the nomenclature above, a zombie is a being with A-consciousness identical to that of a normally functioning human, but still lacking P-consciousness altogether. Thus a zombie behaves identically to a normally functioning human, even though the first person view of the zombie is non-existent.

Hold on a second. While this is the best definition of "zombie" I've ever seen, it is also the one that lays bare the ridiculousness of the notion. There are specific neo-cortical activities (things that would fall under the category of A-consciousness, or so I'd suspect) which can fully explain having a first-person view of objective phenomena. Indeed, Dennett went into a lengthy evolutionary explanation of that very matter in Consciousness Explained.

So, being a "zombie" becomes having no P-consciousness, with which I have no problem, so long as we don't deny them any of the things that A-consciousness can be shown to entail - i.e. self-consciousness, emotion, intuition, creativity, memory, perceptual discrimination (in all of its forms; i.e. noticing, and responding to, the difference between textures, colors, shapes, and sounds), and reasoning ability.

The problematic notion I'd like to address is that one who denies the hard problem is acting in a zombie-like way by refuting, in some manner, the problem of P-consciousness. This seems like a natural position to take, since a zombie presumably could not understand the hard problem on the basis of its lack of P-consciousness. However, strictly speaking, this position cannot follow since the zombie behaves identically to a normal human, including verbal reports indicating a belief in P-conscious qualities. Therefore a zombie could be just as much a proponent of the hard problem as an enemy of it. Indeed, if all zombies had systematic difficulties understanding the hard problem, then on average they would not have A-conscious properties identical to the average human, contradicting our initial definition.

Has it not occurred to you that I might have been right when I told Fliption that everyone is a zombie? Think about it. I'm clearly a zombie, since I could claim to have P-consciousness, but I can't explain it. This exact statement holds true for all of you, does it not?

This is a great complication, because it implies that if I were to suddenly become a zombie, my first person view would be dramatically different even though I could not know about it personally, let alone indicate it to others either directly or indirectly. I do not think that this defeats the hard problem, but rather it underscores its hardness by emphasizing the epistemic difficulties involved.

Or it shows that, if the hard problem exists at all, then we are all zombies.
 
  • #39
Fliption said:
Yes, this is the same question I'm asking I think. Judging from past conversation, I'm thinking the answer will be that they are one and the same thing.

Technically speaking, they can't be. One presumes that P-consciousness is a subset of A-consciousness and the other that P-consciousness does not exist altogether. That is, the former says that phenomenal redness exists in virtue of (say) computation, while the latter says that redness does not exist in the first place.

I don't follow this zombie clarification. I understand what you're saying. I just don't understand why you're saying it. The nature of the hard problem is one of explanation. Is this correct? The fact that consciousness cannot be reductively explained using the fundamental elements we currently assume.

Yes. I am just trying to be conceptually precise about the terms (specifically "zombie") that we are using. I still think the hard problem is a valid one.

I can understand this issue because I can compare my experience to the explanation and see that something is missing. I don't understand how a zombie scientist could ever find the explanation of his A-consciousness unsatisfactory.

The process of comparing conceptual tokens is subsumed under A-consciousness. A zombie may not have P-consciousness, but he still has second order beliefs that he does, and his beliefs are identical to a normal human's. (Belief here is used strictly in a functional sense, i.e. one's disposition to make certain verbal utterances, and does not refer to any experiential aspect of belief-- eg the subjective feelings associated with believing something.)

If we presume that zombies think that something is missing from a physically reductive explanation of consciousness to a greater degree than humans do on average, then we are assuming that

a) there is some overlap between P- and A-consciousness in humans, i.e. that at least some aspect of A-consciousness is causally related to some aspect of P-consciousness (otherwise there would be no discernable difference in the behavior of a human and a zombie), and
b) the part of a zombie's A-consciousness corresponding to this P/A overlap is missing in virtue of its lack of P-consciousness (thereby accounting for the difference in its behavior, i.e. failing to recognize the hard problem).

However, this contradicts our initial definition that a zombie's A-consciousness must be identical to its human counterpart. Therefore, it is not possible that a human acknowledges the hard problem and his zombie counterpart does not acknowledge it to the same degree. Zombie Chalmers believes in the hard problem just as vigorously as human Chalmers, and zombie Dennett is no more set against the hard problem than is human Dennett.
 
  • #40
You know, the more I think about it, the clearer it becomes to me that a zombie is simply one lacking a final destination for the stimuli entering his/her brain. As such s/he is simply an exception to the Cartesian Theater model. But, if this is so, then we are all, most definitely, zombies. The Cartesian Theater model has been shown to have no merit, and is sometimes even used as an epithet.

For those who don't know what's wrong with the Cartesian Theater model, it usually comes back to one question: What happens next?

If there really were a "center", wherein "experience" played itself out, who would be observing it? Would they be conscious? If so, would they not need to have an observer within their own brains? It goes on ad infinitum without ever getting any closer to explaining consciousness. Thus, it is discarded.

P.S. Forgive me, Fliption, for having resurrected the old homunculus problem. I remember I never explained it well enough for you to get what I meant, which led to many fruitless debates, but it just seemed necessary that I remove the Cartesian Theater, along with any theories that fall into the same trap.
 
  • #42
hypnagogue said:
Technically speaking, they can't be. One presumes that P-consciousness is a subset of A-consciousness and the other that P-consciousness does not exist altogether. That is, the former says that phenomenal redness exists in virtue of (say) computation, while the latter says that redness does not exist in the first place.

Wait a minute. Why can't P-consciousness and A-consciousness be the same thing? (I'm not saying I think they are, I just think I'll be better able to understand them after this question is answered.)

Oh, btw, what is redness?

Yes. I am just trying to be conceptually precise about the terms (specifically "zombie") that we are using. I still think the hard problem is a valid one.

In spite of the fact that you appear - until further clarification presents itself - to have shown yourself and Fliption to be zombies as well?

The process of comparing conceptual tokens is subsumed under A-consciousness. A zombie may not have P-consciousness, but he still has second order beliefs that he does, and his beliefs are identical to a normal human's. (Belief here is used strictly in a functional sense, i.e. one's disposition to make certain verbal utterances, and does not refer to any experiential aspect of belief-- eg the subjective feelings associated with believing something.)

Why not? From a purely A-consciousness PoV, belief is not just the disposition to make certain verbal utterances, but the disposition to rule in favor of one idea over another ("rulings" meaning simple computational processes of discrimination), based on previous stimulations.

btw, you, too, may not have P-consciousness, and instead have but a second-order belief that you do. After all, you haven't really explained what it is, and that is the same predicament that a zombie would have, is it not?

If we presume that zombies think that something is missing from a physically reductive explanation of consciousness to a greater degree than humans do on average, then we are assuming that

a) there is some overlap between P- and A-consciousness in humans, i.e. that at least some aspect of A-consciousness is causally related to some aspect of P-consciousness (otherwise there would be no discernable difference in the behavior of a human and a zombie), and
b) the part of a zombie's A-consciousness corresponding to this P/A overlap is missing in virtue of its lack of P-consciousness (thereby accounting for the difference in its behavior, i.e. failing to recognize the hard problem).

However, this contradicts our initial definition that a zombie's A-consciousness must be identical to its human counterpart. Therefore, it is not possible that a human acknowledges the hard problem and his zombie counterpart does not acknowledge it to the same degree. Zombie Chalmers believes in the hard problem just as vigorously as human Chalmers, and zombie Dennett is no more set against the hard problem than is human Dennett.

So, I ask you again, what is the difference, after all that you've said here, between a zombie and everyone else?
 
  • #43
Mentat said:
What you are saying is that your own experience is more than can be explained under current assumptions, right? But you can't tell me what it is that remains unexplained?

I can tell you what remains to be explained. But the ability to communicate it to you is dependent on your ability to experience that which cannot be explained as well. If you do not experience it or you choose to deny that you experience it, then the only way I can tell you what it is is to explain it.

Oh, btw, you can't explain the color red any better to a person capable of sight. You can only point out examples, which is what I asked you to do with regard to this thing which has eluded explanation but which definitely exists.
Right. But the difference is that red is a visual subjective experience associated with a physical object, which allows me to point to something. Subjective experience in general is not so easy to point to. I can't point to anything. I can only attempt to reference it to your own personal experience by saying "it feels like something to be Mentat". Why should it feel like anything at all?

Not at all. My method of processing is distinctly different, but it remains a "method of processing", nothing more.

Since it is nothing but another method of processing then you should be able to explain and recreate the whole process.

No, but what is relevant is that we have the term before the definition. What I'm trying to tell you is that that is a terrible way to reason. The phenomenon is supposed to be understood as existing, distinct from other phenomena, before a word is assigned to it (because then, at least we'll know what "it" is, to which the word refers).
It may be a terrible way to reason but no one here is doing that. I DO KNOW what it is. I keep saying this. Why would I attach a word to something that has no meaning to me?

I hate to pick at words (though, as you well know, I think it is necessary that the words be correct, so as to avoid the possibility of confusion), but I too see a difference between "measuring" a particular wavelength of light and experiencing the color. What I don't see is the difference between being stimulated by a particular wavelength of light, which you then process in terms of previous stimulations and remember, and "experiencing" a certain color. I don't see what's left to explain, and those things that I mention are all part of the "easy problem".
How can you be so sure of what the easy problem is when you can't see the hard problem?

I never said that. There's a difference between equating experience with computation, and saying that the experience never happened at all.

I agree. I was just trying to understand which one you were proposing. I think Hypnagogue is asking the same thing.

"Present"? To whom?

You didn't answer the question. One issue at a time. It's much less confusing that way.
 
  • #44
Mentat said:
Has it not occurred to you that I might have been right when I told Fliption that everyone is a zombie? Think about it. I'm clearly a zombie, since I could claim to have P-consciousness, but I can't explain it. This exact statement holds true for all of you, does it not?
As I said to confutatis in another thread, this assumes that humans can objectively know everything that exists in reality.
 
  • #45
Fliption said:
I can tell you what remains to be explained. But the ability to communicate it to you is dependent on your ability to experience that which cannot be explained as well. If you do not experience it or you choose to deny that you experience it, then the only way I can tell you what it is is to explain it.

As per your first statement, I expect that you will give what explanation you have to give.

Right. But the difference is that red is a visual subjective experience associated with a physical object, which allows me to point to something. Subjective experience in general is not so easy to point to. I can't point to anything. I can only attempt to reference it to your own personal experience by saying "it feels like something to be Mentat". Why should it feel like anything at all?

First off, yes, the computation of "red" is a subjective one; I don't see how any computer could compute objectively.

Secondly, I don't expect you to point to something that is subjective, I expect you to give a definition. I can define "red". Can you define "subjective experience"?

Finally, it doesn't feel like anything to me, because I have nothing to compare it to...I've never been anyone else.

Since it is nothing but another method of processing then you should be able to explain and recreate the whole process.

Recreate! Surely you jest. As I explained at the very outset of the thread called "Faulty expectations of a theory of Consciousness", no scientific theory is ever expected to produce the phenomenon that it explains. It is merely expected to explain what it is and is not, when/under what circumstances it occurs, and then to be able to recreate the circumstances and prove that it does indeed occur under those circumstances.

So, since there is a certain computation that occurs whenever you are exposed to a certain wavelength of light, I need only explain which wavelength it is, which form of stimulation (and re-stimulation) is occurring in the pyramidal neurons of your neocortex, under what conditions this occurs (which I have already established: whenever the wavelength is present and stimulates your retina, this computation occurs), and then reproduce the conditions (which I could easily do by, for example, turning my words red).

It may be a terrible way to reason but no one here is doing that. I DO KNOW what it is. I keep saying this. Why would I attach a word to something that has no meaning to me?

I wouldn't expect you to, but you have to relate what that meaning is, and I can't just trust that you have a meaning in mind.

How can you be so sure of what the easy problem is when you can't see the hard problem?

Simple, I read the piece by Chalmers which defined the "easy problem".

You didn't answer the question. One issue at a time. It's much less confusing that way.

What question? I looked back at the antecedent post, and the only question I'm seeing is my own (namely: to whom is this thing being "presented"? That is the term you used, and that is the term that needs explanation).
 
  • #46
Fliption said:
As I said to confutatis in another thread, this assumes that humans can objectively know everything that exists in reality.

Only if you're bound to induction. If it can be deduced that there is nothing more to explain, after all the facets of A-consciousness have been explained - and that the definition of "zombie" is one having A-consciousness, but nothing more - then it can be logically concluded that everyone is a zombie.
 
  • #47
hypnagogue said:
However, this contradicts our initial definition that a zombie's A-consciousness must be identical to its human counterpart. Therefore, it is not possible that a human acknowledges the hard problem and his zombie counterpart does not acknowledge it to the same degree. Zombie Chalmers believes in the hard problem just as vigorously as human Chalmers, and zombie Dennett is no more set against the hard problem than is human Dennett.

But I don't see how this could ever exist. How could a zombie's A consciousness be identical when a human's A consciousness is connected somehow to P consciousness and a zombie's is not? There has to be some difference somewhere, doesn't there?
 
  • #48
Mentat said:
Only if you're bound to induction. If it can be deduced that there is nothing more to explain, after all the facets of A-consciousness have been explained - and that the definition of "zombie" is one having A-consciousness, but nothing more - then it can be logically concluded that everyone is a zombie.

If you could deduce such things, yes. But you cannot.
 
  • #49
Fliption said:
If you could deduce such things, yes. But you cannot.

Sure I can. I can inductively or deductively prove the first proposition. The second stands as "accepted" since hypna posted it, and I find no fault with it. And the conclusion is valid, provided the premises are.
 
  • #50
Mentat said:
I can define "red". Can you define "subjective experience"?
You first. I don't think you can do it.

Finally, it doesn't feel like anything to me, because I have nothing to compare it to...I've never been anyone else.

I'm about to give up. Obviously you feel something. You don't refer to it as "what it feels like to be Mentat" for the reason you provided but that's just the way that I, as an outsider, would refer to whatever feeling you may have.

Recreate! Surely you jest. As I explained at the very outset of the thread called "Faulty expectations of a theory of Consciousness", no scientific theory is ever expected to produce the phenomenon that it explains. It is merely expected to explain what it is and is not, when/under what circumstances it occurs, and then to be able to recreate the circumstances and prove that it does indeed occur under those circumstances.

Obviously producing the phenomenon is not always something that can be done. I understand this. But we're not talking about creating a black hole here. We're talking about computation. Simple instructions like a software program. Just so you don't think I am debating the role of a scientific theory with you, I'm not saying that you have explained it and now have to produce it to qualify as a valid theory and convince me. I'm saying that you have not explained it and this is why you cannot produce it.

So, since there is a certain computation that occurs whenever you are exposed to a certain wavelength of light, I need only explain which wavelength it is, which form of stimulation (and re-stimulation) is occurring in the pyramidal neurons of your neocortex, under what conditions this occurs (which I have already established: whenever the wavelength is present and stimulates your retina, this computation occurs), and then reproduce the conditions (which I could easily do by, for example, turning my words red).

This doesn't satisfy me. It doesn't explain what I am referring to. You should be able to produce what you have described very easily.

I wouldn't expect you to, but you have to relate what that meaning is, and I can't just trust that you have a meaning in mind.

I will have to solve the hard problem before I can make you understand it. Makes sense to me. You get to keep your view either way.

Simple, I read the piece by Chalmers which defined the "easy problem".

Since he did such a good job explaining it, why not read his explanation of the hard problem as well?

What question? I looked back at the antecedent post, and the only question I'm seeing is my own (namely: to whom is this thing being "presented"? That is the term you used, and that is the term that needs explanation).

You were responding to the second option that I was giving you as part of a question. I think Hypnagogue is asking the same question and you may have answered it since then.
 
  • #51
Mentat said:
Sure I can. I can inductively or deductively prove the first proposition. The second stands as "accepted" since hypna posted it, and I find no fault with it. And the conclusion is valid, provided the premises are.
Ok fine, you can deduce it. But I cannot.
 
  • #52
Fliption said:
But I don't see how this could ever exist. How could a zombie's A consciousness be identical when a human's A consciousness is connected somehow to P consciousness and a zombie's is not? There has to be some difference somewhere, doesn't there?

Congratulations! After much wrangling, you are finally beginning to see what's wrong with Chalmers' argument.

Apparently only two things can follow from Chalmers' definition of a zombie: either they can't possibly exist, as you realized, or we are all zombies, as Mentat says.

Where is that guy who said this discussion is merely about semantics? :smile:
 
  • #53
confutatis said:
Congratulations! After much wrangling, you are finally beginning to see what's wrong with Chalmers' argument.

Apparently only two things can follow from Chalmers' definition of a zombie: either they can't possibly exist, as you realized, or we are all zombies, as Mentat says.

Where is that guy who said this discussion is merely about semantics? :smile:

This is all true. But it isn't only a semantic problem, because none of this is relevant. I have been using the zombie concept when I should have been using some other word. I personally don't see the significance of the distinction hypnagogue has pointed out. It doesn't seem to me that the definition has to be this way to make the case that Chalmers is trying to make. My only beef with it is that it means I have to find another word to call Mentat. The issue remains regardless of what I call it, though.
 
  • #54
Mentat said:
I need a better definition of "P-consciousness", as you probably expected. "The redness of an object" is a matter of perceptual discrimination, is it not?

Again, I cannot precisely pick out the concept in words, but I can only point to it. When you look at a stop sign, what does it look like to you? Among its many apparent properties, it has a certain visual phenomenal quality that you call 'redness.'

Discrimination is clearly involved here (eg, discriminating the redness of the sign from the blueness of the sky), but discrimination alone does not exhaustively characterize this phenomenon. For instance, for a human there is something different about discriminating hues of color and pitches of tone. You may say that this difference is purely underpinned by computational differences, and that may be the case, but we are only trying here to point to instances of what we mean by P-consciousness, not explain them.

Let me put it another way. Imagine that one day you encounter a curious cognitive dissociation. Suddenly you can't see anything at all, that is, the world looks to you the same way it looked in the past when you would close your eyes. And yet, you can walk around just as well as you could before, and you can accurately describe the world (e.g. by telling someone "I see a red stop sign" when a red stop sign is placed at a distance before you) just as well as you could before. This would be a case of visual A-consciousness without visual P-consciousness.

I'm not claiming that this is possible in practice; indeed, I suspect it most probably is not. I am simply using this example to illustrate how we can conceptually delineate between A and P consciousness. Even if it turns out that they are one and the same thing, there still would seem to be the distinctive property that there are different aspects or viewpoints of that one thing.

I suppose I could say that, were you to give me a specific instance of what you'd consider P-consciousness, I'd show that it is really just A-consciousness. But, at the same time, to do so does seem to imply that P-consciousness doesn't exist at all.

If P-consciousness does not exist for you, then your personal experience of acting in the world would be the same as your current personal experience of deep sleep: i.e., you would have no personal experience at all. If you respond to this by saying that you would indeed have personal experience just in virtue of your A-consciousness as you acted in the world, then you would be acknowledging the existence of P-consciousness and adding some claims about its properties (eg it exists whenever certain A-conscious activities occur). This is not the same as denying its existence altogether.

So, being a "zombie" becomes having no P-consciousness, with which I have no problem, so long as we don't deny them any of the things that A-consciousness can be shown to entail - i.e. self-consciousness, emotion, intuition, creativity, memory, perceptual discrimination (in all of its forms; i.e. noticing, and responding to, the difference between textures, colors, shapes, and sounds), and reasoning ability.

A-consciousness entails the behavioral characteristics of, say, sadness, but it doesn't entail the personal feeling of sadness. If there is no P-consciousness, then by definition there is no personal feeling of sadness. This is the familiar schism; A-consciousness speaks of 3rd person observable properties, whereas P-consciousness speaks of 1st person observable properties. To the extent that sadness is characterized by objectively observable behaviors and brain activities, it has an A-conscious aspect; and to the extent that it is characterized by particular subjective feelings, it has a P-conscious aspect. Similar remarks can be made about the other members of your list.

Has it not occurred to you that I might have been right when I told Fliption that everyone is a zombie? Think about it. I'm clearly a zombie, since I could claim to have P-consciousness, but I can't explain it. This exact statement holds true for all of you, does it not?

It doesn't follow that your failure to explain P-consciousness entails that you are a zombie. If I can't explain how weather works, that doesn't mean there is no weather.

I maintain that I am not a zombie in virtue of my P-consciousness. To make this claim I am forced to assume that there is indeed some kind of overlap or causal connection between my A-conscious utterances and my P-conscious perceptions (otherwise I would have no basis in saying that I know I am P-conscious). So, ultimately, our viewpoints are probably not as far apart as they might seem on the surface-- we both acknowledge some sort of deep connection between A and P. Where we mainly disagree is on the nature of P.
 
  • #55
Fliption said:
But I don't see how this could ever exist. How could a zombie's A consciousness be identical when a human's A consciousness is connected somehow to P consciousness and a zombie's is not? There has to be some difference somewhere, doesn't there?

There are at least two possibilities for how it could be that some creature has A-consciousness identical to a human but no P-consciousness.

1) It could be that A is not nomologically sufficient to influence a human's P consciousness in the way that it does. (Nomological sufficiency refers to a sufficiency that obtains in our reality as a result of its contingent natural laws, and as such is a stronger constraint than logical sufficiency.) If this were the case, then even though some aspects of my A-consciousness might always be accompanied by P-consciousness, a creature could exist in our reality with an A-consciousness identical to mine, such that it would not have my P-consciousness.

Note that A-consciousness is ultimately a functional concept, so this possibility might allow that a computer with an A-consciousness identical to mine would not have P-consciousness even though it might not allow that a human with A-consciousness identical to mine would not have my P-consciousness.

2) It could be that A is not logically sufficient to influence a human's P consciousness in the way that it does. If this were the case, then even though it might be the case that any creature in our universe which has an A-consciousness identical to mine has at least some sort of P-consciousness, it could still be the case that in some metaphysically possible world with different laws of nature, a creature with my A-consciousness would have no P-consciousness at all. This is the scenario Chalmers likes to use: there could be some metaphysical world physically identical to ours, in which a creature physically identical to me (and thus with identical A-consciousness) still does not have P-consciousness.

I think I can pinpoint the difficulty you are facing. You are assuming that there is some aspect of a human's A-consciousness that depends upon the human's P-consciousness, and that the presence of the human's P-consciousness is necessary for his A-consciousness to act in the way that it does (eg, you are assuming that a human's conceptual acceptance of the hard problem, as borne out by his behavior and verbal reports, is possible only if he has P-consciousness). I think this necessity is too strong a limit. I see why P interacting with A in this way would be sufficient to cause the human to behave as if he accepts the hard problem, but I don't see why it is necessary-- I think it is logically possible that a zombie have the proper brain activation such that he behaves as if he accepts the hard problem even without 'input' from P-consciousness.

Suppose human H enters a brain state B, indicating roughly his belief in the hard problem, as a result of his P-consciousness. It is logically possible that there exists some metaphysical zombie who has entered the same brain state as H by means other than input from P.
 
  • #56
confutatis said:
Apparently only two things can follow from Chalmers' definition of a zombie: either they can't possibly exist, as you realized, or we are all zombies, as Mentat says.

Neither follows, actually. It could be the case that if I build a computer functionally identical to me (eg with identical A-consciousness), it still might not be P-conscious. (Chalmers uses zombies that are physically identical to humans, but he places them in metaphysical worlds with contingent laws that are not identical to all the contingent laws of our world. He does not contend that a physical replica of a person in our world could possibly not be P-conscious.)

As for your second claim, we can note that if P-consciousness interacts with A-consciousness, this interaction may be sufficient, but not necessary, to produce utterances such as "I am seeing the color red."
 
  • #57
Fliption said:
How can you be so sure of what the easy problem is when you can't see the hard problem?

I have no intention of speaking for Mentat, because he already gave his answer, but what I want to ask you, Fliption, is this: if you can't "see" the easy problem, what makes you believe there is even a hard problem, without the fundamentals for its basis?

And I could also ask, "How can you be so sure of what the hard problem is when you can't see or explain the easy problem, without verification of what constitutes the parts of either?" You could shirk this question forever both ways; neither will be explained unless you start off easy.
 
  • #58
hypnagogue said:
Suppose human H enters a brain state B, indicating roughly his belief in the hard problem, as a result of his P-consciousness. It is logically possible that there exists some metaphysical zombie who has entered the same brain state as H by means other than input from P.

Ok, I can accept this but to me it implies a zombie is deterministically a slave of external forces. I agree that a brain state stating a belief in P could happen without an actual P event, but I just don't understand why this would ever happen. I can interject a state that I wish a computer program to be in as well. But if I don't purposefully interject this state and allow the program to run its course, it would have no reason to causally come into such a state on its own. This is why I say that such a creature would have to be a slave to external influences and have no causal logic in its own actions. It doesn't do anything because of its own calculations. It doesn't seem to think at all.

No matter. What word should I use to describe someone who denies the hard problem because they do not have consciousness and therefore can explain their cognitive existence easily with the reductive tools of science?
 
  • #59
hypnagogue said:
Neither follows, actually. It could be the case that if I build a computer functionally identical to me (eg with identical A-consciousness), it still might not be P-conscious.

Chalmers is not talking about computers, as you pointed out yourself. The point Chalmers makes is that P-consciousness is not required to explain A-consciousness. He bases his claim on the notion of a physically identical entity which exhibits identical A-consciousness but lacks P-consciousness. He doesn't base his claims on seemingly-conscious computers.

(Chalmers uses zombies that are physically identical to humans, but he places them in metaphysical worlds with contingent laws that are not identical to all the contingent laws of our world.)

I believe you are wrong about Chalmers, but if you are right then that claim is just ridiculous, as it would imply that the hard problem is only a problem in the zombie universe. I definitely don't think that's what Chalmers is saying.

As for your second claim, we can note that if P-consciousness interacts with A-consciousness, this interaction may be sufficient, but not necessary, to produce utterances such as "I am seeing the color red."

I didn't claim we may be zombies. All I said was that there's nothing in Chalmers' definition of what a zombie is that allows us to feel different from them. We believe we have P-consciousness and so do zombies. Exactly where is the difference? In the "fact" that we are right about our belief and the zombie is wrong? That doesn't make any sense.

(here's a paper by Chalmers in case people think I'm misrepresenting his position: http://jamaica.u.arizona.edu/~chalmers/papers/goldman.html )
 
  • #60
Jeebus said:
I have no intention of speaking for Mentat because he already gave his answer but, what I want to ask you, Fliption is: If you can't "see" the easy problem; what makes you believe there is even a hard problem without the fundamentals for its basis?

I think you have misunderstood. I don't have an issue with understanding the easy problem. I just found it amusing that Mentat (who claims to not understand what the hard problem is all about) used the term "easy problem" as if he understood the distinction. Which he admittedly doesn't. When he labels a set of activities as "the easy problem", he can't be sure he is correct because he doesn't understand the hard problem.
 
  • #61
confutatis said:
Chalmers is not talking about computers, as you pointed out yourself. The point Chalmers makes is that P-consciousness is not required to explain A-consciousness. He bases his claim on the notion of a physically identical entity which exhibits identical A-consciousness but lacks P-consciousness. He doesn't base his claims on seemingly-conscious computers.

I know, I was merely stating a possible case where a zombie as I have defined it (with A-consciousness identical to a human but no P-consciousness) could possibly exist in this reality.

Chalmers' point is not so much that P need not be invoked to explain A, as it is that completely explaining A does not completely explain P.

I believe you are wrong about Chalmers, but if you are right then that claim is just ridiculous, as it would imply that the hard problem is only a problem in the zombie universe. I definitely don't think that's what Chalmers is saying.

I think you need to brush up on your Chalmers. :biggrin:

The Conceivability Argument

According to this argument, it is conceivable that there be a system that is physically identical to a conscious being, but that lacks at least some of that being's conscious states. Such a system might be a zombie: a system that is physically identical to a conscious being but that lacks consciousness entirely. It might also be an invert, with some of the original being's experiences replaced by different experiences, or a partial zombie, with some experiences absent, or a combination thereof. These systems will look identical to a normal conscious being from the third-person perspective: in particular, their brain processes will be molecule-for-molecule identical with the original, and their behavior will be indistinguishable. But things will be different from the first-person point of view. What it is like to be an invert or a partial zombie will differ from what it is like to be the original being. And there is nothing it is like to be a zombie.

There is little reason to believe that zombies exist in the actual world. But many hold that they are at least conceivable: we can coherently imagine zombies, and there is no contradiction in the idea that reveals itself even on reflection. As an extension of the idea, many hold that the same goes for a zombie world: a universe physically identical to ours, but in which there is no consciousness. Something similar applies to inverts and other duplicates.

From the conceivability of zombies, proponents of the argument infer their metaphysical possibility. Zombies are probably not naturally possible: they probably cannot exist in our world, with its laws of nature. But the argument holds that zombies could have existed, perhaps in a very different sort of universe. For example, it is sometimes suggested that God could have created a zombie world, if he had so chosen. From here, it is inferred that consciousness must be nonphysical. If there is a metaphysically possible universe that is physically identical to ours but that lacks consciousness, then consciousness must be a further, nonphysical component of our universe. If God could have created a zombie world, then (as Kripke puts it) after creating the physical processes in our world, he had to do more work to ensure that it contained consciousness.

- David Chalmers, http://jamaica.u.arizona.edu/~chalmers/papers/nature.html

The claim, nonetheless, is not ridiculous. It is not an ontological claim about what exists, but an epistemic claim about what we can know about consciousness. (edit: scratch that; as Chalmers uses it, it is an ontological argument, although it can be modified to be purely an epistemic argument.) If there could be some world physically identical to ours that contains humans without P-consciousness, this underscores our conceptual difficulties with explaining P-consciousness in this world, where we are accustomed to being able to explain almost anything with a physically reductive explanation. Thus the hard problem obtains in our universe, and metaphysical zombies are only used to illustrate this point.

I didn't claim we may be zombies. All I said was that there's nothing in Chalmers' definition of what a zombie is that allows us to feel different from them. We believe we have P-consciousness and so do zombies. Exactly where is the difference? In the "fact" that we are right about our belief and the zombie is wrong? That doesn't make any sense.

If I am looking at a stop sign and I say, "I am seeing redness," I am referring to a certain mental state of mine. If I close my eyes, generate no internal visual imagery, and then say "I am seeing redness," then clearly my mental state is not the same as it was beforehand, even if my utterance is. The referent of the utterance has changed.
 
  • #62
After reading the link that Confutatis provided, I'm trying to figure out why this isn't paradoxical.


"...so that qualia don't seem to play a primary role in the process by which we ascribe qualia to ourselves!"

"...he'll tell you that he thinks that Bob Dylan makes good music. How can this ability for self-ascription be explained? Clearly not by appealing to qualia, for Zombie Dave doesn't have any. The story will presumably have to be told in purely functional terms."

The claim is made here that qualia are simply "along for the ride". If this is true then I can understand why Hypnagogue says the zombie definition is what it is. And I would agree. But the problem I'm having is that I can't explain how someone could ever come to write an article such as this if not for the existence of the qualia themselves. Am I misunderstanding this?

I actually agree with everything that is being said by Chalmers and Hypnagogue about zombies except for one thing. It makes sense to me that they could conceivably be identical in every way except for the belief in the hard problem itself. That's where I'm still not seeing the possibility. I just can't imagine why a planet of nothing but zombies would ever have a "hard problem" to solve.
 
  • #63
Forgive me if I am repeating what others have said (I only got to page 3 of the thread), but isn't the 'subjective experience' that philosophers are referring to merely personal experience, and isn't personal experience what something feels like for you?
And aren't philosophers merely saying that no matter how detailed your description of the systems and sub-systems of the brain, it could never enable (hypothetically) someone who wasn't conscious to understand what it 'feels' like to be conscious?
Kind of like how descriptions of the components of a TV could never articulate what it's like to watch an episode of Buffy the Vampire Slayer. (Not that I've ever watched it *ahem*)
 
  • #64
hypnagogue said:
I know, I was merely stating a possible case where a zombie as I have defined it (with A-consciousness identical to a human but no P-consciousness) could possibly exist in this reality.

Well, I don't think you can possibly have A-consciousness exactly like a human without being a lot like a human. But that is a side issue anyway.

I think you need to brush up on your Chalmers.

I have read most of the papers on Chalmers' web page (my job bores me to death). I still see a contradiction in his ideas, and from my perspective your quote shows it explicitly:

Zombies are probably not naturally possible: they probably cannot exist in our world, with its laws of nature.

So - probably - our laws of nature are enough to explain why we are not zombies. Isn't that correct?

But the argument holds that zombies could have existed, perhaps in a very different sort of universe.

A very different sort of universe with very different natural laws? Could be, but in that case the hard problem should be stated in terms of those very different natural laws, not in terms of our own. That's not what Chalmers does; he conceives of a very different universe, and then uses his knowledge of facts about that universe to make claims about our own. But how can he know anything about a universe where the natural laws are very different from our own? That's what I find ridiculous.

For example, it is sometimes suggested that God could have created a zombie world, if he had so chosen.

I have no issue with this but the problem, as Chalmers himself hints at, is the nagging feeling that that's exactly what God did, that we are all zombies according to the way Chalmers defines zombies. You have pointed out the problem yourself, and now you seem to be overlooking it.

From here, it is inferred that consciousness must be nonphysical.

That is not the only option. It can also be inferred that consciousness must be an illusion. That's what Mentat infers, and I don't see anything wrong with his argument.

If God could have created a zombie world, then (as Kripke puts it) after creating the physical processes in our world, he had to do more work to ensure that it contained consciousness.

Let me show you a similar idea:

If God could have created a world without water, then after creating the physical processes in our world, he had to do more work to ensure that it contained water.

Surely you don't want to claim that water is a metaphysical substance, do you? We all know that our universe didn't have water in the beginning, but the existence of water can be fully explained by the same laws that explain the universe without water.
 
  • #65
Fliption said:
I want to make sure you're clear on my position because it doesn't appear you are.

I understand your position. You think the way Chalmers defines a zombie doesn't make much sense. And I agree.

I have an issue with the way zombie is being defined here for two reasons. 1) I currently see a problem with it and 2) I don't feel it needs to be defined this way to illustrate the point of the hard problem.

You are right about #1, but you are partially wrong about #2. The zombie thing is central to Chalmers' ideas. You may have a different concept of a hard problem which does not require zombies, but then your hard problem is not the same as Chalmers'.

From various discussions I've seen here, people are making too much out of the whole zombie topic and don't seem to understand the real point.

It's probably because people are not really talking about what you think they are talking about. Happens all the time in forums like this.

I said that zombies must not believe in the hard problem.

That's not what you said. Here's the quote from your post:

It makes sense to me that they could conceivably be identical in every way except for the belief in the hard problem itself.

It's not clear from that sentence alone whether you think zombies do not believe in the hard problem, or if belief in the hard problem is what defines one as a zombie.

This is a non-sequitur

It's not a non-sequitur if you were talking about a definition. This is always a problem; it's hard to know if people are defining things and then stating what they think follows from their definitions, or if they are simply stating what they believe to be true. You have now made it clear that you were talking about your beliefs, not about definitions. And there's no point arguing about someone's beliefs without knowing how they came up with them. I don't know why you think zombies must not believe in the hard problem, since my belief is that zombies can't possibly exist.

I have a feeling that when I do get there, I'll be all alone. :frown:

I have found that the more I understand about the world and about myself, the harder it is to make people understand things the way I do. But for the most part I consider understanding to be just a form of entertainment, so it doesn't bother me that people don't see things the way I do, because more likely than not everyone is wrong, including myself.

There are far more important things to do in life than "understanding". Only bored people waste time with philosophy.
 
  • #66
confutatis said:
A very different sort of universe with very different natural laws? Could be, but in that case the hard problem should be stated in terms of those very different natural laws, not in terms of our own. That's not what Chalmers does; he conceives of a very different universe, and then uses his knowledge of facts about that universe to make claims about our own. But how can he know anything about a universe where the natural laws are very different from our own? That's what I find ridiculous.

The metaphysical universe in which these zombies live is physically identical to our own. Chalmers' implication, then, is that we cannot give a physical explanation of consciousness in our own universe-- if we could, it should apply equally well to our physically identical zombie counterparts, but it doesn't.

I have no issue with this, but the problem, as Chalmers himself hints, is the nagging feeling that that's exactly what God did: that we are all zombies according to the way Chalmers defines them. You have pointed out the problem yourself, and now you seem to be overlooking it.

Chalmers' argument doesn't imply that we are zombies. It does highlight the difficult issue of our epistemic access to our P-consciousness, however: by what means can/do we have knowledge of P? It would seem on the face of it that we can't have access to it if we have the same access to everything a zombie does, but there are ways around this. For instance, it could be that P-consciousness is in some way sufficient, but not necessary, to induce the kind of activity in me that leads me to say "I see a blue sky." A zombie might be led to say the same thing, but by means of some different process.

That is not the only option. It can also be inferred that consciousness must be an illusion. That's what Mentat infers, and I don't see anything wrong with his argument.

The problem with this line of thinking is that it still leaves the big questions unanswered. P-consciousness is all about appearances in the first place, so saying that it is an illusion does not get us anywhere. It is still just as mystifying how such an illusion could be illusory in the way that it is.

If God could have created a world without water, then after creating the physical processes in our world, he had to do more work to ensure that it contained water.

Surely you don't want to claim that water is a metaphysical substance, do you? We all know that our universe didn't have water in the beginning, but the existence of water can be fully explained by the same laws that explain the universe without water.

If there is a world physically identical to ours, then it follows from this that there is water. It does not seem to follow in the same way that in a world physically identical to ours, there are conscious beings. This is the key difference in the argument you seem to be overlooking.
 
  • #67
Fliption said:
The claim is made here that qualia are simply "along for the ride". If this is true then I can understand why Hypnagogue says the zombie definition is what it is. And I would agree. But the problem I'm having is that I can't explain how someone could ever come to write an article such as this if not for the existence of the qualia themselves. Am I misunderstanding this?

You're not misunderstanding, I think. This is a deep issue of our epistemic access to P-consciousness. It appears as if we perform certain actions (such as saying "I see that the sky is blue") in virtue of our P-conscious contents/properties/attributes. If this is so, it does raise serious questions about the possible existence of a creature with an identical A-consciousness but a non-existent P.

In view of this quandary, we might be led to claim that any system with an A-consciousness identical to that of some conscious human must have an identical P. This is a purely functionalist notion, and accepting it forces us to make some apparently wild claims. For instance, Ned Block has composed the "China Brain" thought experiment, in which each individual in China communicates with the others by means of a walkie-talkie, such that their communications are functionally isomorphic to neurons sharing information in the brain. If we accept strictly that identical A-consciousness must imply identical P, then we must accept that if each member of this China Brain took on the functional characteristics of one of your neurons, the ensemble would be P-conscious in precisely the same way you are. Given the proper input, it would 'say' "I see that the sky is blue" precisely when you would say the same thing, and presumably the whole ensemble collectively would be seeing the same phenomenal sky as you do when you make this statement.

So, it appears that there are ridiculous claims to be made all across the board when it comes to consciousness. Of course, it could be that the China Brain really is P-conscious as a collective, but it seems to be a serious violation of our intuition-- just as bad, perhaps, as supposing that there could be a creature with an identical A but not an identical P.

For my own part, I currently think a panpsychist ontology is the best way to navigate the issue, but it's difficult to hold onto any one paradigm for this problem for too long before more pressing issues arise.

I actually agree with everything that is being said by Chalmers and Hypnagogue about zombies except for one thing. It makes sense to me that they could conceivably be identical in every way except for the belief in the hard problem itself. That's where I'm still not seeing the possibility. I just can't imagine why a planet of nothing but zombies would ever have a "hard problem" to solve.

Perhaps if a race of creatures evolved with A-consciousness similar to our own but no P, they wouldn't ever make reference to something like a P. But I think this is the wrong way to conceive of the problem. Presumably such a race would not have an A identical, on average, to humans, so they would not be zombies in the true sense.

Perhaps a better way to think of it is to suppose that tomorrow, by some strange occurrence, all humans on this planet retained their A but lost their P. Would we go on merrily talking about the blueness of the sky and so on? Well, that really depends on the nature of the relationship between A and P-- if P is necessary to get us to behave as if we have it, then there would soon be a drastic change in our collective As; if it is not necessary, then it could be the case that by some means, we continue to behave as if we had it even though we don't. Perhaps it would turn out this way naturally, in which case P would have to be epiphenomenal and causally inefficacious; or, perhaps it would take some wild scenario like aliens interfering with our brain patterns in order to induce us to continue acting as if we had it, in which case we could conclude that P is sufficient, but not necessary, to get us to behave as if we have it.
 
  • #68
hypnagogue said:
The metaphysical universe in which these zombies live is physically identical to our own.

I do have trouble reconciling your "physically identical to our own" above with your previous mention of "different laws of nature". My perception is that you are contradicting yourself; how can you have a physically identical universe with different laws of nature?

But I might be missing something, so I'll await further comments.

Chalmers' argument doesn't imply that we are zombies.

Chalmers clearly states (and I need no brushing up for this :smile:) that the only difference between ourselves and a zombie is the truth of our beliefs about P-consciousness. As I said, the only essential difference is that the zombie's belief that it has P-consciousness is false, while our belief that we have P-consciousness is true. Yet Chalmers also states that the zombie has no way to find out that his beliefs about consciousness are false.

Do you have a way to find out that your beliefs about consciousness are false? No? What makes you different from a zombie then?

The problem with this line of thinking is that it still leaves the big questions unanswered. P-consciousness is all about appearances in the first place, so saying that it is an illusion does not get us anywhere. It is still just as mystifying how such an illusion could be illusory in the way that it is.

And yet Chalmers' zombies do not have P-consciousness and still have the illusion of having it. Can you explain how that is possible? To me it makes no sense at all.

If there is a world physically identical to ours, then it follows from this that there is water.

I meant identical in the sense that it is described by the same natural laws. Our knowledge of physics and chemistry explains why there's so much water in Louisiana, and it also explains why there's so little water in Nevada.

It does not seem to follow in the same way that in a world physically identical to ours, there are conscious beings.

That is only because you are assuming, a priori, that the laws that explain the presence or absence of water are incapable of explaining the presence or absence of consciousness. It's a circular argument. But the contrary argument, that they can explain it, is also circular. This is the key point so many people seem to be overlooking.
 
  • #69
confutatis said:
Keep up the good work Fliption, and eventually you'll see why this hard problem is nonsense.

I want to make sure you're clear on my position because it doesn't appear you are.

No, you're not misunderstanding anything. It's a crucial point in Chalmers' argument that nothing that zombies do or say can be used to imply that they lack P-consciousness.

I have an issue with the way zombie is being defined here for two reasons. 1) I currently see a problem with it and 2) I don't feel it needs to be defined this way to illustrate the point of the hard problem. So just to be clear, I do believe there is a hard problem. I just don't think this problematic definition of zombies is needed.

From various discussions I've seen here, people are making too much out of the whole zombie topic and don't seem to understand the real point. I see that Hypnagogue has even felt he had to clear up some confusion about this being a thought exercise in epistemology, not ontology. It's the same message I found myself repeating over and over in other threads, as you may recall.

Since neither Mentat nor I believe in the hard problem itself, we must be zombies. Just look at your words above; that's what they imply.

No, they don't imply that. I said that zombies must not believe in the hard problem. That is not the same thing as saying that people who don't believe in the hard problem are zombies. There's a big difference. This is a non-sequitur (affirming the consequent).

You are getting there.

Perhaps. I have a feeling that when I do get there, I'll be all alone. :frown:
 
  • #70
confutatis said:
I do have trouble reconciling your "physically identical to our own" above with your previous mention of "different laws of nature". My perception is that you are contradicting yourself; how can you have a physically identical universe with different laws of nature?

The claim is that the physical laws of nature do not exhaustively represent all the laws of nature. On Chalmers' view, there are further psychophysical laws connecting physical processes to P-consciousness, and it is those laws, not the physical ones, that would differ in a zombie world.

Do you have a way to find out that your beliefs about consciousness are false? No? What makes you different from a zombie then?

Presumably if a zombie were to be granted P-consciousness, he would notice the difference.

Once again, there are deep issues here about epistemic access. Is there a way to find out that a belief as of a certain subjective experience is false? I don't know. It is a difficult issue with no clear answer.

On some views, P-consciousness is incorrigible-- that is, it is impossible to be wrong about a belief as of a certain subjective experience. Does this imply that a zombie in Chalmers' sense must have P-consciousness since it believes it does? No, I think not, since perhaps the metaphysical difference in the zombie's world that allows him to be functionally identical to a human without having attendant P-consciousness also allows him to be incorrect about his beliefs about P-consciousness. If this were true, then the incorrigibility of P-consciousness would be a result of the contingent laws of our universe involved in granting us P-consciousness. On the other hand, some claim that P-consciousness in this reality itself is not incorrigible.

And yet Chalmers' zombies do not have P-consciousness and still have the illusion of having it. Can you explain how that is possible? To me it makes no sense at all.

Such zombies have second-order beliefs about P-consciousness without first-order P-conscious contents, whereas we have both. Strictly speaking, the zombie is under no illusion at all, since there is no first-person view for the zombie from which it could be under an illusion. The zombie is under an illusion no more than a rock is. It seems that to have an illusion in the first place, one must have some sort of subjective perspective with which to be aware of such an illusion.

I meant identical in the sense that it is described by the same natural laws. Our knowledge of physics and chemistry explains why there's so much water in Louisiana, and it also explains why there's so little water in Nevada.

If a conglomeration of H2O molecules exists in the zombie world, it follows that there is water. If a conglomeration of neurons exists in the zombie world, it does not follow that there is P-consciousness.

That is only because you are assuming, a priori, that the laws that explain the presence or absence of water are incapable of explaining the presence or absence of consciousness. It's a circular argument. But the contrary argument, that they can explain it, is also circular. This is the key point so many people seem to be overlooking.

It is not an a priori assumption; it is an a posteriori deduction from our apparent systematic failure to satisfactorily explain P-consciousness in physical terms. The deduction may or may not be false, but it is not something assumed at the outset-- it is a conclusion arrived at.
 
