Templeton wrote:So the belief is that cognition and consciousness are the same or that one derives from the other?
Is there a connection between brains that isn't physical?
I read this a few years ago:
http://www.defense.gov/news/newsarticle.aspx?id=51091
The US military is funding research on communication via brain waves.
What was that article about microtubules?
GrahamH wrote:SpeedOfSound wrote:And it is that model state that I want to draw your attention to in this query:
http://www.rationalskepticism.org/psych ... l#p1996572
I don't see the relevance. You are asking about the identity of qualia, when few, if any, discussing here think qualia are more than 'illusions'. Is that pareidolia face the exact same pareidolia face as this one? Who cares?
You might as well ask if the brain activity of a repeated action, such as catching a ball, is identical every time. The important question is whether / how the ball is caught, or whether the brain activity is interpreted as this or that class of experience.
SpeedOfSound wrote:GrahamH wrote:SpeedOfSound wrote:And it is that model state that I want to draw your attention to in this query:
http://www.rationalskepticism.org/psych ... l#p1996572
I don't see the relevance. You are asking about the identity of qualia, when few, if any, discussing here think qualia are more than 'illusions'. Is that pareidolia face the exact same pareidolia face as this one? Who cares?
You might as well ask if the brain activity of a repeated action, such as catching a ball, is identical every time. The important question is whether / how the ball is caught, or whether the brain activity is interpreted as this or that class of experience.
Well, it's key to your understanding of both thermostats and brains, and this makes it obvious that you have the patience for neither.
If there is no thing that you experience in my scenarios a and b, then how is it that you think consciousness is worth discussing?
GrahamH wrote:SpeedOfSound wrote:GrahamH wrote:SpeedOfSound wrote:And it is that model state that I want to draw your attention to in this query:
http://www.rationalskepticism.org/psych ... l#p1996572
I don't see the relevance. You are asking about the identity of qualia, when few, if any, discussing here think qualia are more than 'illusions'. Is that pareidolia face the exact same pareidolia face as this one? Who cares?
You might as well ask if the brain activity of a repeated action, such as catching a ball, is identical every time. The important question is whether / how the ball is caught, or whether the brain activity is interpreted as this or that class of experience.
Well, it's key to your understanding of both thermostats and brains, and this makes it obvious that you have the patience for neither.
If there is no thing that you experience in my scenarios a and b, then how is it that you think consciousness is worth discussing?
I have no particular objection to saying that 'every conscious moment is little different to every other', if you like.
I would reject a claim that no two experiences have anything at all in common, but otherwise put the slider where you like.
You haven't answered as to why your question is relevant. You merely did a kenny-style 'you don't understand' waffle.
What are the criteria for worthiness of internet discussion topics?
SpeedOfSound wrote:But think for a minute about an extension of the set of elements to exhaustively map a cognitive set. Think about a very short list of such elements to only partially map it.
SpeedOfSound wrote:Teuton wrote:GrahamH wrote:Do you think you have to taste a real lemon to experience a lemon taste? The minimum required would seem to be an appropriate neural stimulation. Who can say what limits must apply to generating that class of stimulation?
This question is irrelevant to my point that you cannot (come to) know the taste of lemon without undergoing a corresponding gustatory experience (however caused).
I like how you are both wrong at the same time.
GrahamH wrote:I'm not sure you are saying more than 'it's complicated', which is obvious.
Something connects one lemon taste to another, and to lemony tastes that are not due to lemons. If you want to wrap up the huge complexity of subtle variances in brain activity, I'm thinking of this as a host of fuzzy classifiers. Any stimulus will light up more than one class. Not every active class need feature in experience. We can 'become aware of' stimuli that were already present and lose awareness of a stimulus previously attended to.
I think you want to wrap up all the complexity and claim it all as some integrated whole moment of consciousness. I'm far from convinced that makes any sense, or serves any need in explaining C.
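The "host of fuzzy classifiers" picture above can be made concrete with a toy sketch. This is purely illustrative, not anyone's actual model: the taste prototypes, the two-dimensional feature space, the distance-based membership function, and the attention threshold are all assumptions chosen to show how one stimulus can light up several classes at once while only some cross into "awareness".

```python
# Toy sketch (illustrative assumptions throughout): each fuzzy
# classifier assigns a graded membership score to a stimulus,
# several classes fire at once, and only those above an attention
# threshold feature in "experience".

def membership(stimulus, prototype):
    """Graded similarity in (0, 1]: 1 at the prototype, falling off with distance."""
    distance = sum((s - p) ** 2 for s, p in zip(stimulus, prototype)) ** 0.5
    return 1.0 / (1.0 + distance)

# Hypothetical taste "prototypes" as points in a 2-D feature space
# (say, sourness and sweetness); the names are illustrative only.
PROTOTYPES = {
    "lemon":      (0.9, 0.2),
    "lime":       (0.8, 0.1),
    "grapefruit": (0.7, 0.4),
    "sugar":      (0.1, 0.9),
}

def classify(stimulus, threshold=0.5):
    """Score every class the stimulus lights up; return scores and the attended subset."""
    scores = {name: membership(stimulus, proto) for name, proto in PROTOTYPES.items()}
    attended = {name for name, score in scores.items() if score >= threshold}
    return scores, attended

# A lemony-but-not-lemon stimulus activates every class to some degree,
# yet only the strongest classes cross the threshold into "awareness".
scores, attended = classify((0.85, 0.15))
```

Note that raising or lowering the threshold changes which already-active classes "feature in experience" without changing the underlying activations, which is one way to read the point about becoming aware of stimuli that were already present.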
GrahamH wrote:zoon wrote:I don’t agree with GrahamH when he says that humans are conscious because we model ourselves. I think that we attribute consciousness primarily to other people, and also to ourselves and to animals such as squirrels, and we may then go further and attribute it to such things as thermostats or glove puppets, or even to everything in sight (panpsychism). Consciousness isn’t something we have, it’s something we’ve evolved to think we have.
I'm not sure if we disagree here or not. I agree with your last line. Consciousness isn’t something we have, it’s something we’ve evolved to think we have. I think that to think we have it is to model a subjective self as locus of experience. The cognitive organ understands that there is a subject having experiences because that is the semantics it generates to understand its own function in relation to the world.
GrahamH wrote:We can trip over the language here, because thinking is a conscious thing, something experienced, so we need to distinguish the normal meaning of 'thinking' from the 'neural computation' that seemingly underlies the interpretation of the world and self.
GrahamH wrote:ToM can similarly be confusing in that it can be taken as a high level conceptual thing inextricably tied to consciousness, or a low level perceptual thing, like 'knowing how to walk' or 'knowing a familiar face', that take no conscious effort.
Ian Apperley wrote:In a recent paper with Stephen Butterfill I argue for a two-systems approach to everyday perspective-taking and social reasoning. We think it highly unlikely that a single psychological faculty for "theory of mind" could support both moment-by-moment co-operation and competition, and the sort of complex psychological reasoning necessary for evaluating the guilt of the accused in a court of law. We suggest that these competing demands can only be satisfied by having distinct cognitive systems that make different trade-offs between flexibility and cognitive efficiency. Infants and some non-human animals have one or more cognitively efficient processes for "theory of mind", which also support adults' moment-by-moment social cognition. However, older children and adults have, in addition, a more flexible capacity for psychological reasoning that supports sophisticated judgements but makes heavy demands on memory and executive function. For us this sheds light on otherwise confusing patterns of success and failure in the theory of mind abilities of non-human animals, infants, children and adults. This line of thinking has motivated much of my recent empirical work.
kennyc wrote:The problem with Theory of Mind is that it presupposes consciousness. It is NOT consciousness, nor does it have anything fundamental to do with consciousness; it is something extra that must be gotten rid of if consciousness is to be understood.
Gallagher and Frith wrote:Our ability to explain and predict other people's behaviour by attributing to them independent mental states, such as beliefs and desires, is known as having a ‘theory of mind’. Interest in this very human ability has engendered a growing body of evidence concerning its evolution and development and the biological basis of the mechanisms underpinning it. Functional imaging has played a key role in seeking to isolate brain regions specific to this ability. Three areas are consistently activated in association with theory of mind. These are the anterior paracingulate cortex, the superior temporal sulci and the temporal poles bilaterally. This review discusses the functional significance of each of these areas within a social cognitive network.
Koster-Hale and Saxe wrote:In the decade since the last edition of Understanding Other Minds, the number of papers that use human neuroimaging tools to investigate the neural basis of theory of mind (ToM) has exploded from four (described in Frith & Frith’s 2000 chapter) to, as of 2013, well over 400. Studying ToM with neuroimaging works. Unlike many aspects of higher-level cognition, which tend to produce small and highly variable patterns of responses across individuals and tasks, ToM tasks generally elicit activity in an astonishingly robust and reliable group of brain regions.
zoon wrote:GrahamH wrote:zoon wrote:I don’t agree with GrahamH when he says that humans are conscious because we model ourselves. I think that we attribute consciousness primarily to other people, and also to ourselves and to animals such as squirrels, and we may then go further and attribute it to such things as thermostats or glove puppets, or even to everything in sight (panpsychism). Consciousness isn’t something we have, it’s something we’ve evolved to think we have.
I'm not sure if we disagree here or not. I agree with your last line. Consciousness isn’t something we have, it’s something we’ve evolved to think we have. I think that to think we have it is to model a subjective self as locus of experience. The cognitive organ understands that there is a subject having experiences because that is the semantics it generates to understand its own function in relation to the world.
I agree with you that I think I have consciousness when one part of my brain sets up a model of a self having experiences; where I’m less in agreement is that I think the modelling of selves with experiences is aimed at other people at least as much as at one’s own self. I see consciousness as something which we evolved to attribute primarily to other people (because the Theory of Mind processes are useful for predicting other people), so that while one’s own conscious experience can feel far more direct and undeniable than another person’s (as Descartes said), it’s not in fact the starting point for attributing consciousness. (In post #552 I quoted an article which mentioned children learning about their own false beliefs at the same time as learning about others’ false beliefs). We attribute consciousness (or awareness, or mental states) where something triggers the Theory of Mind processes in our brains, especially when the ToM processes give a better prediction than treating it as a mechanism, so we see other people and animals as having consciousness even though many of them (such as very young children) do not, so far as we know, attribute consciousness to themselves.
zoon wrote:GrahamH wrote:ToM can similarly be confusing in that it can be taken as a high level conceptual thing inextricably tied to consciousness, or a low level perceptual thing, like 'knowing how to walk' or 'knowing a familiar face', that take no conscious effort.
ToM includes many different processes: some, like mirror neurones or perspective-taking, are low-level and automatic, while others, such as careful consideration of what someone else may know on some subject, are as high-level as any part of our thinking. Prof Ian Apperley discusses a “two systems” approach to Theory of Mind in a 2009 paper which he links to here: Ian Apperley wrote:In a recent paper with Stephen Butterfill I argue for a two-systems approach to everyday perspective-taking and social reasoning. We think it highly unlikely that a single psychological faculty for "theory of mind" could support both moment-by-moment co-operation and competition, and the sort of complex psychological reasoning necessary for evaluating the guilt of the accused in a court of law. We suggest that these competing demands can only be satisfied by having distinct cognitive systems that make different trade-offs between flexibility and cognitive efficiency. Infants and some non-human animals have one or more cognitively efficient processes for "theory of mind", which also support adults' moment-by-moment social cognition. However, older children and adults have, in addition, a more flexible capacity for psychological reasoning that supports sophisticated judgements but makes heavy demands on memory and executive function. For us this sheds light on otherwise confusing patterns of success and failure in the theory of mind abilities of non-human animals, infants, children and adults. This line of thinking has motivated much of my recent empirical work.
GrahamH wrote:...
To contrast that view of information with the self-model: the latter holds not that the information in the model is conscious, but that the processing of the model constitutes the formation of semantic links, and that it's the dynamic responsiveness of this semantic system about self that 'is conscious'.
DavidMcC wrote:GrahamH wrote:...
To contrast that view of information with the self-model: the latter holds not that the information in the model is conscious, but that the processing of the model constitutes the formation of semantic links, and that it's the dynamic responsiveness of this semantic system about self that 'is conscious'.
Don't you mean "...that is self-conscious"?