Quantified Consciousness - Michio Kaku

Studies of mental functions, behaviors and the nervous system.

Moderators: kiore, Blip, The_Metatron

Re: Quantified Consciousness - Michio Kaku

#761  Postby Templeton » May 06, 2014 4:34 pm

So the belief is that cognition and consciousness are the same or that one derives from the other?

Is there a connection between brains that isn't physical?

:popcorn:

Read this a few years ago

http://www.defense.gov/news/newsarticle.aspx?id=51091

US Military funding research on communication with brain waves.

What was that article about microtubules? :ask:

Re: Quantified Consciousness - Michio Kaku

#762  Postby GrahamH » May 06, 2014 4:55 pm

Templeton wrote:So the belief is that cognition and consciousness are the same or that one derives from the other?

Is there a connection between brains that isn't physical?

:popcorn:

Read this a few years ago

http://www.defense.gov/news/newsarticle.aspx?id=51091

US Military funding research on communication with brain waves.

What was that article about microtubules? :ask:


The idea I've been exploring (for a long time!) is that cognition is the basis of all 'understanding' - if I understand something it is because my brain has built some sort of classifier that responds to and interconnects those patterns of neural events. Loosely put, the cognitive brain understands the world (no appeal to consciousness), and part of its understanding is of its own operation, in low-level semantics of an experiencing self. Sensation is mapped to a body model, dispositions to act in certain ways may be mapped to a 'model of mind' as intent / desire etc. Graziano uses the term 'model of attention', which refers to experiential content coming and going (even if the stimulus remains constant).

So, cognition is the mechanism of 'information processing' in brains and some of that information is about a self having experiences. Nascent thoughts are generated outside consciousness and some 'come to mind' by unconscious cognition.
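
Purely as an illustration of what I mean by a classifier (a toy sketch in Python; the labels, features and numbers are all invented, not a model of any actual brain):

# A toy sketch: a 'classifier' is a stored set of features plus a label in a
# body/mind model, and 'understanding' is graded overlap between the current
# pattern of neural events and each stored pattern.

patterns = {
    "pain-in-left-foot": {"nociceptor-burst", "left-foot-map", "withdraw-reflex"},
    "hunger": {"low-blood-sugar", "gut-signal", "food-seeking"},
    "intent-to-reach": {"premotor-activity", "arm-map", "target-lock"},
}

def classify(activity):
    """Graded match of the current activity against every stored pattern."""
    return {label: len(activity & feats) / len(feats) for label, feats in patterns.items()}

current = {"nociceptor-burst", "left-foot-map", "gut-signal"}
print(classify(current))
# {'pain-in-left-foot': 0.67, 'hunger': 0.33, 'intent-to-reach': 0.0} (approx.)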

Is there a non-physical link between brains? There doesn't seem to be much evidence for it. Obviously a brain radio link is physical.
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#763  Postby SpeedOfSound » May 06, 2014 6:21 pm

GrahamH wrote:
SpeedOfSound wrote:And it is that model state that I want to draw your attention to in this query:
http://www.rationalskepticism.org/psych ... l#p1996572


I don't see the relevance. You are asking about the identity of qualia, when few, if any, discussing here think qualia are more than 'illusions'. Is that pareidolia face the exact same pareidolia face as this one? Who cares?
You might as well ask if the brain activity of a repeated action, such as catching a ball, is identical every time. The important question is whether / how the ball is caught, or whether the brain activity is interpreted as this or that class of experience.

Well it's key to your understanding both thermostats and brains and this makes it obvious that you have the patience for neither.

:scratch: If there is no thing that you experience in my scenario a and b then how is it that you think consciousness is worthy of discussing?

Re: Quantified Consciousness - Michio Kaku

#764  Postby GrahamH » May 06, 2014 6:27 pm

SpeedOfSound wrote:
GrahamH wrote:
SpeedOfSound wrote:And it is that model state that I want to draw your attention to in this query:
http://www.rationalskepticism.org/psych ... l#p1996572


I don't see the relevance. You are asking about the identity of qualia, when few, if any, discussing here think qualia are more than 'illusions'. Is that pareidolia face the exact same pareidolia face as this one? Who cares?
You might as well ask if the brain activity of a repeated action, such as catching a ball, is identical every time. The important question is whether / how the ball is caught, or whether the brain activity is interpreted as this or that class of experience.

Well it's key to your understanding both thermostats and brains and this makes it obvious that you have the patience for neither.

:scratch: If there is no thing that you experience in my scenario a and b then how is it that you think consciousness is worthy of discussing?


I have no particular objection to saying that 'every conscious moment is little different to every other', if you like.
I would reject a claim that no two experiences have anything at all in common, but otherwise put the slider where you like.
You haven't answered as to why your question is relevant. You merely did a kenny-style 'you don't understand' waffle.

What are the criteria for worthiness of internet discussion topics? :roll:
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#765  Postby SpeedOfSound » May 06, 2014 6:39 pm

GrahamH wrote:
SpeedOfSound wrote:
GrahamH wrote:
SpeedOfSound wrote:And it is that model state that I want to draw your attention to in this query:
http://www.rationalskepticism.org/psych ... l#p1996572


I don't see the relevance. You are asking about the identity of qualia, when few, if any, discussing here think qualia are more than 'illusions'. Is that pareidolia face the exact same pareidolia face as this one? Who cares?
You might as well ask if the brain activity of a repeated action, such as catching a ball, is identical every time. The important question is whether / how the ball is caught, or whether the brain activity is interpreted as this or that class of experience.

Well it's key to your understanding both thermostats and brains and this makes it obvious that you have the patience for neither.

:scratch: If there is no thing that you experience in my scenario a and b then how is it that you think consciousness is worthy of discussing?


I have no particular objection to saying that 'every conscious moment is little different to every other', if you like.
I would reject a claim that no two experiences have anything at all in common, but otherwise put the slider where you like.
You haven't answered as to why your question is relevant. You merely did a kenny-style 'you don't understand' waffle.

What are the criteria for worthiness of internet discussion topics? :roll:


The point is that even though the HSB is the same in the two instances, what you experience is 99.999999....% different. So in what does the difference consist? Do you think it possible to create a finite list of reasonable length that would describe the difference sufficiently to re-imagine the experience?

You probably do. And this is where we differ. I believe it's all cognition, but that cognition is a vast sea of furry little elements that cannot be specified. One is tempted to call it a quale or a 'what it is like'. This is what having a complex brain with all these systems does to us. It feels just like we are biological organisms made of trillions of cells. There is no unconscious part because there is actually no conscious twin of such a thing. At least not in the way it would have to be if we could separate the two.

Re: Quantified Consciousness - Michio Kaku

#766  Postby SpeedOfSound » May 06, 2014 6:40 pm

But think for a minute about an extension of the set of elements to exhaustively map a cognitive set. Think about a very short list of such elements to only partially map it.

Re: Quantified Consciousness - Michio Kaku

#767  Postby GrahamH » May 06, 2014 7:15 pm

I'm not sure you are saying more than 'it's complicated', which is obvious.
Something connects one lemon taste to another, and to lemony tastes that are not due to lemons. As for wrapping up the huge complexity of subtle variances in brain activity, I'm thinking of this as a host of fuzzy classifiers. Any stimulus will light up more than one class. Not every activated class need feature in experience. We can 'become aware of' stimuli that were already present and lose awareness of a stimulus previously attended to.
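
To make the 'host of fuzzy classifiers' picture concrete, here's a toy sketch (my invented labels and numbers, nothing more): one stimulus activates several classes to different degrees, and an attention gate decides which of the active classes feature in experience at a given moment.

activations = {          # graded responses to a single sip of lemonade (invented numbers)
    "lemon-taste": 0.9,
    "sweet-taste": 0.6,
    "cold-liquid": 0.5,
    "glass-in-hand": 0.3,
}

def attended(activations, gate):
    """Classes whose activation clears the current attention gate."""
    return {c: a for c, a in activations.items() if a >= gate}

print(attended(activations, gate=0.55))  # lemon and sweet feature in experience
print(attended(activations, gate=0.25))  # attention widens: glass-in-hand 'comes to awareness'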

I think you want to wrap up all the complexity and claim it all as some integrated whole moment of consciousness. I'm far from convinced that makes any sense, or serves any need in explaining C.
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#768  Postby GrahamH » May 06, 2014 7:16 pm

SpeedOfSound wrote:But think for a minute about an extension of the set of elements to exhaustively map a cognitive set. Think about a very short list of such elements to only partially map it.


I'm not keen on the term 'cognitive set'. I think of it more as a swarm of activity, not a literally integrated whole.
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#769  Postby kennyc » May 06, 2014 7:16 pm

SpeedOfSound wrote:
Teuton wrote:
GrahamH wrote:Do you think you have to taste a real lemon to experience a lemon taste? The minimum required would seem to be an appropriate neural stimulation. Who can say what limits must apply to generating that class of stimulation?


This question is irrelevant to my point that you cannot (come to) know the taste of lemon without undergoing a corresponding gustatory experience (however caused).

I like how you are both wrong at the same time.


:cheers:
Kenny A. Chaffin
Art Gallery - Photo Gallery - Writing&Poetry
"Strive on with Awareness" - Siddhartha Gautama

Re: Quantified Consciousness - Michio Kaku

#770  Postby SpeedOfSound » May 06, 2014 9:10 pm

GrahamH wrote:I'm not sure you are saying more than 'it's complicated', which is obvious.
Something connects one lemon taste to another, and to lemony tastes that are not due to lemons. As for wrapping up the huge complexity of subtle variances in brain activity, I'm thinking of this as a host of fuzzy classifiers. Any stimulus will light up more than one class. Not every activated class need feature in experience. We can 'become aware of' stimuli that were already present and lose awareness of a stimulus previously attended to.

I think you want to wrap up all the complexity and claim it all as some integrated whole moment of consciousness. I'm far from convinced that makes any sense, or serves any need in explaining C.


Unfortunately it is complex. That's why people wax eloquent about what it is like. Once you accept that complexity and the many-systemed, fully embodied approach, the rest becomes simply 'all of the details'.

Now you DO NOT KNOW that all of the things in your brain may act to mediate each moment of conscious experience. So why do you assume it and declare it as fact?

Re: Quantified Consciousness - Michio Kaku

#771  Postby GrahamH » May 06, 2014 10:02 pm

What do you think I am 'declaring as fact'?
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#772  Postby GrahamH » May 09, 2014 7:58 am



Searle saying some relevant and some silly things about consciousness.

Error 1: Descartes was indisputably right with 'I think therefore I am'.
Error 2: Computation is purely syntax devoid of semantics (semantics are the product of consciousness).
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#773  Postby GrahamH » May 09, 2014 8:02 am



Kurzweil "How To Create A Mind"
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#774  Postby zoon » May 09, 2014 5:36 pm

GrahamH wrote:
zoon wrote:I don’t agree with GrahamH when he says that humans are conscious because we model ourselves. I think that we attribute consciousness primarily to other people, and also to ourselves and to animals such as squirrels, and we may then go further and attribute it to such things as thermostats or glove puppets, or even to everything in sight (panpsychism). Consciousness isn’t something we have, it’s something we’ve evolved to think we have.


I'm not sure if we disagree here or not. I agree with your last line. Consciousness isn’t something we have, it’s something we’ve evolved to think we have. I think that to think we have it is to model a subjective self as locus of experience. The cognitive organ understands that there is a subject having experiences because that is the semantics it generates to understand its own function in relation to the world.

I agree with you that I think I have consciousness when one part of my brain sets up a model of a self having experiences; where I’m less in agreement is that I think the modelling of selves with experiences is aimed at other people at least as much as at one’s own self. I see consciousness as something which we evolved to attribute primarily to other people (because the Theory of Mind processes are useful for predicting other people), so that while one’s own conscious experience can feel far more direct and undeniable than another person’s (as Descartes said), it’s not in fact the starting point for attributing consciousness. (In post #552 I quoted an article which mentioned children learning about their own false beliefs at the same time as learning about others’ false beliefs). We attribute consciousness (or awareness, or mental states) where something triggers the Theory of Mind processes in our brains, especially when the ToM processes give a better prediction than treating it as a mechanism, so we see other people and animals as having consciousness even though many of them (such as very young children) do not, so far as we know, attribute consciousness to themselves.

GrahamH wrote:We can trip over the language here, because thinking is a conscious thing, something experienced, so we need to distinguish the normal meaning of 'thinking' from the 'neural computation' that seemingly underlies the interpretation of the world and self.

I can only agree that tripping over language is an occupational hazard here.

GrahamH wrote:ToM can similarly be confusing in that it can be taken as a high level conceptual thing inextricably tied to consciousness, or a low level perceptual thing, like 'knowing how to walk' or 'knowing a familiar face', that take no conscious effort.

ToM includes many different processes: some, like mirror neurones or perspective-taking, are low-level and automatic, while others, such as careful consideration about what someone else may know on some subject, are as high level as any part of our thinking. Prof Ian Apperley discusses a “two systems” approach to Theory of Mind in a 2009 paper which he links to here:

Ian Apperley wrote:In a recent paper with Stephen Butterfill I argue for a two-systems approach to everyday perspective-taking and social reasoning. We think it highly unlikely that a single psychological faculty for "theory of mind" could support both moment-by moment co-operation and competition, and the sort of complex psychological reasoning necessary for evaluating the guilt of the accused in a court of law. We suggest that these competing demands can only be satisfied by having distinct cognitive systems that make different trade-offs between flexibility and cognitive efficiency. Infants and some non-human animals have one or more cognitively efficient processes for "theory of mind", which also support adults' moment-by-moment social cognition. However, older children and adults have, in addition, a more flexible capacity for psychological reasoning that supports sophisticated judgements but makes heavy demands on memory and executive function. For us this sheds light on otherwise confusing patterns of success and failure in the theory of mind abilities of non-human animals, infants, children and adults. This line of thinking has motivated much of my recent empirical work.

Re: Quantified Consciousness - Michio Kaku

#775  Postby zoon » May 09, 2014 5:37 pm

kennyc wrote:The problem with Theory of Mind is that it presupposes consciousness. It is NOT consciousness, nor does it have anything fundamental to do with consciousness; it is something extra that must be gotten rid of if consciousness is to be understood.

Would you agree that Theory of Mind is needed for a human brain to attribute consciousness? Not in order to have consciousness or awareness, but in order to attribute it to another or to itself?

A 2002 quote from ”Trends in Cognitive Sciences”:
Gallagher and Frith wrote:Our ability to explain and predict other people's behaviour by attributing to them independent mental states, such as beliefs and desires, is known as having a ‘theory of mind’. Interest in this very human ability has engendered a growing body of evidence concerning its evolution and development and the biological basis of the mechanisms underpinning it. Functional imaging has played a key role in seeking to isolate brain regions specific to this ability. Three areas are consistently activated in association with theory of mind. These are the anterior paracingulate cortex, the superior temporal sulci and the temporal poles bilaterally. This review discusses the functional significance of each of these areas within a social cognitive network.


A 2013 quote from the Massachusetts Institute of Technology:
Koster-Hale and Saxe wrote:In the decade since the last edition of Understanding Other Minds, the number of papers that use human neuroimaging tools to investigate the neural basis of theory of mind (ToM) has exploded from four (described in Frith & Frith’s 2000 chapter) to, as of 2013, well over 400. Studying ToM with neuroimaging works. Unlike many aspects of higher-level cognition, which tend to produce small and highly variable patterns of responses across individuals and tasks, ToM tasks generally elicit activity in an astonishingly robust and reliable group of brain regions.

Re: Quantified Consciousness - Michio Kaku

#776  Postby GrahamH » May 09, 2014 7:06 pm

Thanks Zoon.

Searching "mPFC consciousness" turned up this: -

http://books.google.co.uk/books?id=jHI3 ... ss&f=false
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#777  Postby GrahamH » May 11, 2014 5:05 pm

zoon wrote:
GrahamH wrote:
zoon wrote:I don’t agree with GrahamH when he says that humans are conscious because we model ourselves. I think that we attribute consciousness primarily to other people, and also to ourselves and to animals such as squirrels, and we may then go further and attribute it to such things as thermostats or glove puppets, or even to everything in sight (panpsychism). Consciousness isn’t something we have, it’s something we’ve evolved to think we have.


I'm not sure if we disagree here or not. I agree with your last line. Consciousness isn’t something we have, it’s something we’ve evolved to think we have. I think that to think we have it is to model a subjective self as locus of experience. The cognitive organ understands that there is a subject having experiences because that is the semantics it generates to understand its own function in relation to the world.

I agree with you that I think I have consciousness when one part of my brain sets up a model of a self having experiences; where I’m less in agreement is that I think the modelling of selves with experiences is aimed at other people at least as much as at one’s own self. I see consciousness as something which we evolved to attribute primarily to other people (because the Theory of Mind processes are useful for predicting other people), so that while one’s own conscious experience can feel far more direct and undeniable than another person’s (as Descartes said), it’s not in fact the starting point for attributing consciousness. (In post #552 I quoted an article which mentioned children learning about their own false beliefs at the same time as learning about others’ false beliefs). We attribute consciousness (or awareness, or mental states) where something triggers the Theory of Mind processes in our brains, especially when the ToM processes give a better prediction than treating it as a mechanism, so we see other people and animals as having consciousness even though many of them (such as very young children) do not, so far as we know, attribute consciousness to themselves.


The ToM that psychologists refer to as developing in infants is rather high-level conceptual intelligence. It seems highly likely that newborns understand that they are in pain, hungry, etc. In Graziano's terms their brains must be attributing experiences to themselves as they model attention.

It is vital to remember that this 'attribution of consciousness' is not an experienced thought. It's not that we (subjects) think we are conscious.

There may be a lower level pre-conceptual ToM where a baby will cry if it sees/hears the responses of another to painful stimuli. It may not be able to anticipate or conceptualise what the other knows or experiences, as in a developed ToM, but it can recognise mental/emotional states in others.

You don't feel another's pain, but you are able to recognise that they are in pain. The difference is the essence of the Hard Problem. I think the explanation of the difference lies in the body-map frame of reference for a self model of mind.

Once you get to thinking about the mental states of others you are dealing with experience, presupposing consciousness. That level cannot answer the Hard Problem. We have to look to the brain functions that evaluate the state of others and compare them to those functions that might be evaluating the states of self. It's all related, but the focus is 'lower down', earlier in development, much less human-centric.

zoon wrote:
GrahamH wrote:ToM can similarly be confusing in that it can be taken as a high level conceptual thing inextricably tied to consciousness, or a low level perceptual thing, like 'knowing how to walk' or 'knowing a familiar face', that take no conscious effort.

ToM includes many different processes: some, like mirror neurones or perspective-taking, are low-level and automatic, while others, such as careful consideration about what someone else may know on some subject, are as high level as any part of our thinking. Prof Ian Apperley discusses a “two systems” approach to Theory of Mind in a 2009 paper which he links to here:

Ian Apperley wrote:In a recent paper with Stephen Butterfill I argue for a two-systems approach to everyday perspective-taking and social reasoning. We think it highly unlikely that a single psychological faculty for "theory of mind" could support both moment-by moment co-operation and competition, and the sort of complex psychological reasoning necessary for evaluating the guilt of the accused in a court of law. We suggest that these competing demands can only be satisfied by having distinct cognitive systems that make different trade-offs between flexibility and cognitive efficiency. Infants and some non-human animals have one or more cognitively efficient processes for "theory of mind", which also support adults' moment-by-moment social cognition. However, older children and adults have, in addition, a more flexible capacity for psychological reasoning that supports sophisticated judgements but makes heavy demands on memory and executive function. For us this sheds light on otherwise confusing patterns of success and failure in the theory of mind abilities of non-human animals, infants, children and adults. This line of thinking has motivated much of my recent empirical work.


Exactly so. 'Reasoning' is not a viable basis for conscious experience. Reasoning is related to consciousness.

We could think of it as navigating a virtual conceptual landscape. The neural mechanisms could be much the same as those that allow an animal to navigate a physical landscape - body and environment mapping, object tracking, trajectory modelling (predicting viable manoeuvres). It doesn't seem necessary for there to be subjective experience just to get around.

Reasoning could add 'mental objects' and 'mental trajectories' as extensions of a motion control system. There is a theory of mind (I can't recall the source atm :oops: ) that proposes it arose from the capacity to navigate the physical environment and developed into navigation of an abstract 'mental landscape'.

We could make a distinction here between a capacity to detect and map 'mental objects' (experiences / intentions / dispositions etc) and a capacity to navigate around them. We could class the first capacity as consciousness where the map is in the domain of the self body map. Where the domain of the map is other entities we could call it "system 1 ToM". Adding the 'navigational capacity' gets us to "basic reasoning" and "System 2 ToM" - reasoning about consciousness, leading to self-awareness and introspection.
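
Sketched very crudely in code (the names, structure and outputs below are just my illustration, not a worked-out model), the two capacities might look like this:

from dataclasses import dataclass

@dataclass
class MentalObject:
    kind: str    # e.g. "pain", "intent-to-grab"
    owner: str   # "self" or some other agent

def map_mental_objects(observations):
    """Capacity 1: detect and map mental objects.
    Self-domain ~ 'consciousness'; other-domain ~ 'System 1 ToM' in the terms above."""
    return [MentalObject(kind=o["kind"], owner=o["owner"]) for o in observations]

def navigate(mental_map):
    """Capacity 2: 'navigate' the map - basic reasoning / 'System 2 ToM'."""
    return [f"{m.owner} will likely act to reduce {m.kind}" for m in mental_map]

objs = map_mental_objects([{"kind": "pain", "owner": "self"},
                           {"kind": "hunger", "owner": "other"}])
print(navigate(objs))
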
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#778  Postby GrahamH » May 11, 2014 5:29 pm

Here is Kaku's full talk, from which the video in the OP was an excerpt.



There's a lot to like in there, but it gets pretty nonsensical in places, with conscious thermostats and consciousness captured as maps of genome & connectome and projected into space on laser beams.

He surely can't think that a description of genome & connectome serialised on a carrier beam could be conscious. That really is absurd. As nutty as supposing that a biography is a life (is alive).

To contrast that view of information with the self-model: the latter holds not that the information in the model is conscious, but that the processing of the model constitutes the formation of semantic links, and that it's the dynamic responsiveness of this semantic system about self that 'is conscious'.
Why do you think that?

Re: Quantified Consciousness - Michio Kaku

#779  Postby DavidMcC » May 12, 2014 3:41 pm

GrahamH wrote:...
To contrast that view of information with the self-model: the latter holds not that the information in the model is conscious, but that the processing of the model constitutes the formation of semantic links, and that it's the dynamic responsiveness of this semantic system about self that 'is conscious'.

Don't you mean "...that is self-conscious"?
May The Voice be with you!

Re: Quantified Consciousness - Michio Kaku

#780  Postby GrahamH » May 12, 2014 4:47 pm

DavidMcC wrote:
GrahamH wrote:...
To contrast that view of information with the self-model: the latter holds not that the information in the model is conscious, but that the processing of the model constitutes the formation of semantic links, and that it's the dynamic responsiveness of this semantic system about self that 'is conscious'.

Don't you mean "...that is self-conscious"?


No, 'self-conscious' has a distinct meaning of being conscious of being a self. The general definition of conscious is 'to know experiences', and thoughts about being a self, like any conscious thoughts, are experienced.

By 'semantic system about self' I mean that the semantics are, for example, {foot-pain} or {hunger-belly} and so on. Relations of some object/state w.r.t. a self.
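
To make that notation concrete, a toy sketch (the states, loci and strengths are invented, purely illustrative):

# Each entry relates a state to a locus in the self body map; the numbers are
# invented strengths. On this view 'being conscious' is not these entries
# existing but the ongoing process that updates and responds to them.

self_semantics = {
    ("pain", "foot"): 0.8,
    ("hunger", "belly"): 0.4,
    ("pressure", "right-hand"): 0.2,
}

for (state, locus), strength in self_semantics.items():
    print(f"self: {state} at {locus} (strength {strength})")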

'Self-consciousness' is related, of course, but being conscious does not entail being conscious of being conscious. Animals might feel pains or hunger or lust without necessarily being aware of being a conscious entity.
Why do you think that?
