Information Theory, Complexity, & Dawkins' 747 (help?)


Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#201  Postby Shrunk » Jul 23, 2010 10:24 am

CookieJon wrote:No, but I tell you what... I think a lovely snowflake would be better than that picture of poo in the post above, and serve the same purpose.


I've made the same suggestion to him before, but he really likes the dogshit example. In hindsight, I see his point. It successfully tackles the creationist presupposition that "information" is some rarefied, exalted thing. But I still cringe every time I know that image is coming up.
"A community is infinitely more brutalised by the habitual employment of punishment than it is by the occasional occurrence of crime." -Oscar Wilde
Shrunk
Posts: 26170
Age: 58
Male
Country: Canada

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#202  Postby AMR » Jul 23, 2010 3:51 pm

xrayzed: So you've refuted Stenger's argument? I seem to have missed this. Perhaps you could point me towards your refutation.

then xrayzed says: As for your "critique" of Stenger - I'm amazed that you . . . either fail to understand his points, overlook key points, or simply make unsubstantiated assertions. That was remarkably feeble.

Look, xrayzed, you're the one who threw down Stenger as your refutation of my line of reasoning. I give you a paragraph on him, and you then proceed to bitch that I didn't give sufficient attention to Stenger's "argument" (the paper contains several arguments; which specific "argument" are you referring to? Did YOU even bother reading the paper you linked to? I doubt it). So I wade through this guy's paper, most of which I agree with substantially, point out some of the more obvious flaws in the man's main arguments, and you're again bitching at me.

So tell me what points did I overlook specifically?
AMR
Name: Aaron Rizzio
Posts: 44
Country: USA

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#203  Postby tytalus » Jul 23, 2010 5:00 pm

AMR wrote:
I have analyzed 100 universes in which the values of the four parameters were generated randomly from a range five orders of magnitude above to five orders of magnitude below their values in our universe, that is, over a total range of ten orders of magnitude (Stenger 1995, 2000). I have also examined the distribution of stellar lifetimes for these same 100 universes (Stenger 1995, 2000). While a few are low, most are probably high enough to allow time for stellar evolution and heavy element nucleosynthesis. Over half the universes have stars that live at least a billion years.

Curiously he doesn't name the parameters in this paper. If one of them is the ratio of gravity to electromagnetism, ~1:10^36, even five orders of magnitude may not be that significant; and yet still he gets stars to shine just a billion years, which would be insufficient for multi-cellular life to emerge, let alone a civilization.

In another thread where Stenger's paper came up, I wondered if there was some humanocentric bias showing in the way creationists hold forth on the fine-tuning argument. Nice to have some evidential support for my hypothesis. :) That is some quality goalpost-moving there.
Futurama wrote: Bender: Dying sucks butt. How do you living beings cope with mortality?
Leela: Violent outbursts.
Amy: General slutiness.
Fry: Thanks to denial, I'm immortal.
tytalus
Posts: 1228
Age: 51
Male
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#204  Postby Calilasseia » Jul 23, 2010 5:04 pm

I see our temporarily suspended supernaturalist has erected the fatuous "information is not a physical entity, therefore information is magic, therefore it needs a magic man" apologetic excrement. Oh dear. Time to reprise this.

Information is nothing more than the observational data available with respect to the current state of a system of interest. That is IT. Two rigorous mathematical treatments of information, namely Shannon's treatment and the treatment by Kolmogorov & Chaitin, are predicated on this fundamental notion. Indeed, when Claude Shannon wrote his seminal 1948 paper on information transmission, he explicitly removed ascribed meaning from that treatment, because ascribed meaning was wholly irrelevant to the analysis of the behaviour of information in a real world system. Allow me to present a nice example from the world of computer programming.

Here is a string of data (written as hexadecimal bytes):

81 16 00 2A FF 00

Now, to an 8086 processor, this string of bytes codes for a single 8086 machine language instruction, namely:

ADC [2A00H], 00FFH

which adds, along with the current carry flag (ADC is "add with carry"), the immediate value of 00FFH (255 decimal) to whatever value is currently stored at the 16-bit memory location addressed by DS:2A00H (note that 8086 processors use segmented memory addressing, with DS implied as the default segment register unless the base address is of the form [BP+disp], in which case the default segment register is SS).
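
For readers unfamiliar with 8086 segmentation, a real-mode physical address is simply (segment × 16) + offset. A two-line Python illustration (the DS value of 1000H is hypothetical, purely for the arithmetic):

# 8086 real-mode physical address: segment * 16 + offset
ds = 0x1000                   # hypothetical segment register value
offset = 0x2A00               # displacement from the instruction above
print(hex(ds * 16 + offset))  # prints 0x12a00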

However, on an older, 8-bit 6502 processor, the above sequence codes for multiple instructions, namely the following sequence:

STA ($16,X)
BRK
ROL A
ISC $??00,X

The first of these instructions takes the operand $16, adds to it the contents of the X register (8-bit addition only), and uses that computed address (call this N) as an index into the first 256 bytes of memory (page zero). The contents of address N and address N+1 together then constitute a 16-bit pointer into another location in memory, and the contents of the accumulator are stored at that location (STA stands for STore Accumulator). The second instruction, BRK, is a breakpoint instruction, and performs a complex sequence of operations. First, it takes the current value of the program counter (PC), which is now pointing at the BRK instruction, adds 2 to that value, and pushes it onto the stack (2 bytes are therefore pushed). It then pushes the contents of the processor status register P. Then, it loads the contents of the memory locations $FFFE and $FFFF (the top 2 locations in the 6502 address space) into the program counter and continues execution from there. The top end of memory in a 6502 system typically consists of ROM, and the hard-coded value stored in locations $FFFE/$FFFF is typically a vector to a breakpoint debugging routine in ROM, but that's an implementation dependent feature, and the exact contents of $FFFE/$FFFF vary accordingly from system to system. The third instruction, ROL A, rotates the bits of the accumulator one position left through the carry flag. The final instruction is incomplete, hence the ?? operand: $FF is one of the NMOS 6502's undocumented opcodes (usually written ISC or ISB, here with absolute,X addressing), and it needs one more byte to complete its address operand, so whatever byte follows our 6-byte stream will become the high byte of that final instruction's address.

To make matters even more interesting, the bytes also have meaning to a Motorola 6809 processor, viz:

CMPA #$16
NEG <$2A
STU $00??

The first instruction is "compare accumulator A with the value $16 (22 decimal)". This performs an implicit subtraction of the operand $16 from the current contents of accumulator A, sets the condition codes (CC) according to whether the result is positive, zero or negative (and also sets other bits allowing more intricate comparisons to be made), but discards the actual result of the subtraction. The next instruction, NEG <$2A, uses direct page addressing: the full 16-bit address is formed from the direct page register DP (high byte) and the operand $2A (low byte), and the contents of that memory location are negated (so that a value of +8 becomes -8 and vice versa, assuming 2's complement storage). The next instruction is incomplete, hence the ?? operand: the $FF opcode is STU extended, which stores the 16-bit contents of the U stack pointer to memory, and needs two following bytes to specify a memory address. Only one byte ($00) remains, so whatever byte follows our 6-byte stream will become the low byte of the address operand for this STU instruction.

Now, that's ONE stream of bytes, which has THREE different meanings for three different processors. Therefore ascribing meaning to the byte stream as part of the process of analysing transmission of the data is erroneous. Meaning only becomes important once the data has been transmitted and received, and the receiver decides to put that data to use. If we have three different computers receiving this 6-byte stream from appropriate sources, then the Shannon information content of each bit stream is identical, but our three different computers will ascribe totally different meanings to the byte stream, if they are regarded as part of a program instruction sequence. An 8086-based computer will regard the byte stream as an ADC instruction, the 6502-based computer will regard it as a STA, BRK, ROL sequence followed by the start of an undocumented instruction, and the 6809-based computer will regard it as a CMPA, NEG, STU sequence (and both of the latter two will demand at least one more byte to be transmitted in order to complete their final instruction).
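
To see the "meaning plays no part in transmission" point in miniature, here is a small illustrative Python sketch (not part of Shannon's treatment, purely a demonstration) computing the zeroth-order Shannon information content of that 6-byte stream. The number it produces depends only on the symbol statistics; nothing in the calculation knows or cares whether the receiving machine is an 8086, a 6502 or a 6809:

from collections import Counter
from math import log2

def shannon_entropy_bits(data: bytes) -> float:
    # Zeroth-order Shannon entropy, in bits per symbol
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

stream = bytes([0x81, 0x16, 0x00, 0x2A, 0xFF, 0x00])
h = shannon_entropy_bits(stream)
print(f"{h:.3f} bits/byte, {h * len(stream):.3f} bits total")
# Identical however the bytes are later decoded: ascribed meaning plays no part.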

Consequently, ascribed meaning is wholly irrelevant to the rigorous treatment of information. Creationists routinely introduce the error of assuming a priori that "information" and "ascribed meaning" are synonymous, which the above example refutes wholesale (along with thousands of others that could be posted if I had the time). Of course, creationists conflate information with ascribed meaning deliberately, because they seek to expound the view that information is a magic entity, and therefore requires an invisible magic man in order to come into existence. This is complete rot, as the Shannon and Kolmogorov/Chaitin analyses of information demonstrate readily, not to mention Turing's large body of work with respect to information. All that matters, at bottom, is that the entities and interactions applicable to a given system of interest produce different results when applied to different states of that system. Information sensu stricto, namely the observational data available with respect to the current state of a system, only becomes "meaningful" when different states lead to different outcomes during appropriate interactions applicable to the system, and the only "meaning" that matters, at bottom, is what outcomes result from those different system states, which in the case of the computer data above, differs from system to system.

As a corollary of the above, all that matters, in a rigorous treatment, is the following:

[1] What states a physical system of interest can occupy;

[2] What interactions take place as a result of these physical states.

Indeed, this is the entire basis upon which Turing machines are founded. Turing machines consist, at their core, of a symbol state table, each element of which is indexed as a two-dimensional array by the symbol being read, and the current state of the machine performing the reading. Each of those elements contains the following:

[1] What symbol to replace the current symbol with at the current reading position;

[2] What movement to perform in order to change the reading position and acquire the next symbol;

[3] What state the machine should adopt after performing the above operations.
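
A minimal sketch of such a machine in Python may make this concrete. The rule table below (a toy machine that adds 1 to a binary number) is purely illustrative, not anything from Turing's paper; each entry is indexed by (state, symbol) and supplies exactly the three items listed above:

# Toy Turing machine: increment a binary number by 1.
# Each table entry: (symbol to write, head movement, next state).
TABLE = {
    ("right", "0"): ("0", +1, "right"),  # scan right to the end of the number
    ("right", "1"): ("1", +1, "right"),
    ("right", "_"): ("_", -1, "carry"),  # hit the blank: start adding 1
    ("carry", "1"): ("0", -1, "carry"),  # 1 + 1 = 0, carry moves left
    ("carry", "0"): ("1", 0, "halt"),    # 0 + 1 = 1, finished
    ("carry", "_"): ("1", 0, "halt"),    # carried off the left edge
}

def run(tape_str):
    tape = dict(enumerate(tape_str))
    pos, state = 0, "right"
    while state != "halt":
        write, move, state = TABLE[(state, tape.get(pos, "_"))]
        tape[pos] = write
        pos += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

print(run("1011"))  # prints 1100 (binary 11 + 1 = 12)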

This blind, mechanical, deterministic system is the foundation upon which modern computing is built. ALL digital processors are, in effect, electronic embodiments of Turing machines, and they are manifestly physical entities - physical entities, moreover, that are capable of processing information at high speed. At bottom, all that they are doing is manipulating the flow of electrons within their semiconductor substrates, yet this manipulation of electrons allows us to perform such tasks as computing fast Fourier transforms, running simulations of everything from weather systems to military aircraft, and, for those familiar with the Isabelle suite of software, checking mathematical proofs for consistency and rigour. No magic was required to put Isabelle together; all that was required was for sufficiently informed humans to work out what processes would result in the relevant end product, and instantiate that as a huge collection of bits, once again stored in an appropriate medium. The point here being that if you change the physical arrangement of that medium, you change its information content. The physical arrangement of any storage medium determines the information content of that medium. What another system does with that content, of course, is again a function of its physical constitution, and the interactions permitted by that physical constitution, which in turn leads to the conclusion that ascribed meaning is nothing more than the particular set of interactions arising from the physical structure of the entity ascribing the meaning. Therefore creationist attempts to conflate ascribed meaning with information fail not only for lack of rigour, but fail because at bottom, ascribed meaning itself can be traced to a physical basis with respect, at least, to Turing machines and any entity that can be modelled by them.

Now, Perry Marshall (a creationist who is fond of specious "information" pseudo-arguments) erects the bogus argument that DNA is a "code". This IS bogus. DNA is simply an organic molecule that is capable of existing in a large number of states, each of which results in a different outcome with respect to the chemical interactions that the molecule takes part in. Because it can exist in a large number of states, because those states are all associated with specific, systematic interactions (such as the production of a particular protein after transcription), and because those states are coupled to those systematic and well-defined interactions in a largely one-to-one manner (for the time being, I'll leave to one side complications such as selenocysteine, which were afterthoughts grafted onto the original system), they can be treated in an information-theoretic manner as if they constituted a "code", because doing so simplifies our understanding of those systematic interactions, and facilitates further detailed analysis of that system. That, again, is IT. The idea that DNA constitutes a "code" intrinsically is merely a baseless creationist assertion resulting from deliberate apologetic misrepresentation of the code analogy. A misrepresentation that itself is then subject to rampant discursive misuse, because the argument erected consists of:

[1] DNA is a code (unsupported baseless assertion);

[2] All codes are produced by "intelligence" (deliberately omitting the fact that the only "intelligence" we have evidence of that produces codes is human intelligence);

[3] Therefore an "intelligence" produced DNA (the inference being that this "intelligence" is supernatural, which doesn't even arise as a corollary from [2] when one factors in the omitted detail, that the only "intelligence" we have evidence for as a code producer is human, and therefore natural, intelligence).

This argument is fatuous as it stands, even without factoring in extra scientific knowledge that has been acquired in relatively recent times, but when we do factor in this knowledge, it becomes absurd to the Nth degree. That scientific knowledge consists of at least eleven scientific papers demonstrating that the "genetic code" is itself an EVOLVABLE ENTITY. The eleven papers in my collection are:

A Co-Evolution Theory Of The Genetic Code by J. Tze-Fei Wong, Proceedings of the National Academy of Sciences of the USA, 72(5): 1909-1912 (May 1975)

A Mechanism For The Association Of Amino Acids With Their Codons And The Origin Of The Genetic Code by Shelley D. Copley, Eric Smith and Harold J. Morowitz, Proceedings of the National Academy of Sciences of the USA, 102(12): 4442-4447 (22nd March 2005)

Collective Evolution And The Genetic Code by Kalin Vetsigian, Carl Woese and Nigel Goldenfeld, Proceedings of the National Academy of Sciences of the USA, 103(28): 10696-10701 (11th July 2006)

Emergence Of A Code In The Polymerisation Of Amino Acids Along RNA Templates by Jean Lehmann, Michel Cibils and Albert Libchaber, Public Library of Science One, 4(6): e5773 (DOI: 10.1371/journal.pone.0005773, 3rd June, 2009)

Evolution Of Amino Acid Frequencies In Proteins Over Deep Time: Inferred Order Of Introduction Of Amino Acids Into The Genetic Code by Dawn J. Brooks, Jacques R. Fresco, Arthur M. Lesk and Mona Singh, Molecular Biology and Evolution, 19(10): 1645-1655 (2002)

Evolution Of The Genetic Code: Partial Optimisation Of A Random Code For Robustness To Translation Error In A Rugged Fitness Landscape by Artem S. Novozhilov, Yuri I. Wolf and Eugene V. Koonin, Biology Direct, 2: 24 (23rd October 2007)

Importance Of Compartment Formation For A Self-Encoding System by Tomoaki Matsuura, Muneyoshi Yamaguchi, Elizabeth P. Ko-Mitamura, Yasufumi Shima, Itaru Urabe and Tetsuya Yomo, Proceedings of the National Academy of Sciences of the USA, 99(11): 7514-7517 (28th May 2002)

On The Origin Of The Genetic Code: Signatures Of Its Primordial Complementarity In tRNAs And Aminoacyl-tRNA Synthetases by S. N. Rodin and A. S. Rodin, Heredity, 100: 341-355 (5th March 2008)

Recent Evidence For Evolution Of The Genetic Code by Syozo Osawa, Thomas H. Jukes, Kimitsuna Watanabe and Akiro Muto, Microbiological Reviews, 56(1): 229-264 (March 1992)

Rewiring The Keyboard: Evolvability Of The Genetic Code by Robin D. Knight, Stephen J. Freeland and Laura F. Landweber, Nature Reviews Genetics, 2: 49-58 (January 2001)

A Simple Model Based On Mutation And Selection Explains Trends In Codon And Amino-Acid Usage And GC Composition Within And Across Genomes by Robin D. Knight, Stephen J. Freeland and Laura F. Landweber, Genome Biology, 2(4): research0010.1–0010.13 (22nd March 2001)

This collection of papers is incomplete, as more have been published in the relevant journals since I compiled this list.

So, since we have peer reviewed scientific papers demonstrating that the "genetic code" is itself an evolvable entity, and indeed, since scientists have published experimental work investigating the behaviour of alternative genetic codes arising from this research, the idea that an invisible magic man was needed for this is recrudescently nonsensical.

I think this covers relevant important bases.
Signature temporarily on hold until I can find a reliable image host ...
Calilasseia
RS Donator
Posts: 22352
Age: 61
Male
Country: England

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#205  Postby AMR » Jul 23, 2010 6:13 pm

tytalus wrote:

In another thread where Stenger's paper came up, I wondered if there was some humanocentric bias showing in the way creationists hold forth on the fine-tuning argument. Nice to have some evidential support for my hypothesis. :) That is some quality goalpost-moving there.

Frankly, I don't know what Stenger was really talking about in that paragraph because he didn't show his math (or maths for you Brits). If you can find a link to his 100 scenarios published anywhere on the internets please let me know. :coffee:
AMR
Name: Aaron Rizzio
Posts: 44
Country: USA

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#206  Postby tytalus » Jul 23, 2010 6:45 pm

AMR wrote:
tytalus wrote:
In another thread where Stenger's paper came up, I wondered if there was some humanocentric bias showing in the way creationists hold forth on the fine-tuning argument. Nice to have some evidential support for my hypothesis. :) That is some quality goalpost-moving there.

Frankly, I don't know what Stenger was really talking about in that paragraph because he didn't show his math (or maths for you Brits). If you can find a link to his 100 scenarios published anywhere on the internets please let me know. :coffee:

Thanks for the tacit surrender on moving the goalposts. :) In the meantime, out of curiosity I tracked down Stenger's page on Anthropics. He does seem to have a paper on Natural Explanations for Anthropic Coincidences, which details the variables that he used and plots the results; a paper on his MonkeyGod program with samples of some of the variables and results; and a link to the program itself on the web. But I can't find all 100 scenarios that he tested.
Futurama wrote: Bender: Dying sucks butt. How do you living beings cope with mortality?
Leela: Violent outbursts.
Amy: General slutiness.
Fry: Thanks to denial, I'm immortal.
tytalus
Posts: 1228
Age: 51
Male
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#207  Postby hotshoe » Jul 23, 2010 7:50 pm

tytalus wrote:
AMR wrote:
tytalus wrote:
In another thread where Stenger's paper came up, I wondered if there was some humanocentric bias showing in the way creationists hold forth on the fine-tuning argument. Nice to have some evidential support for my hypothesis. :) That is some quality goalpost-moving there.

Frankly, I don't know what Stenger was really talking about in that paragraph because he didn't show his math (or maths for you Brits). If you can find a link to his 100 scenarios published anywhere on the internets please let me know. :coffee:

Thanks for the tacit surrender on moving the goalposts. :) In the meantime, out of curiosity I tracked down Stenger's page on Anthropics.

I don't see where AMR provided the link, but that's the paper AMR is responding to, in this post: http://www.rationalskepticism.org/general-faith/information-theory-complexity-dawkins-747-help-t8143-180.html#p364537

Thanks for the link to the Monkey God program. Cool stuff !
Now, when I talked to God I knew he'd understand
He said, "Stick by my side and I'll be your guiding hand
But don't ask me what I think of you
I might not give the answer that you want me to"
hotshoe
Posts: 3177
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#208  Postby AMR » Jul 23, 2010 7:59 pm

Thanks for the links, tytalus. Stenger includes a plot of his universes in one of his papers (interestingly, plotted along the x-axis in logarithms). He manages to get 2 of his 100 universe scenarios to last as long as 1.5 billion years. Enough to synthesize a periodic table, sure, but not to nurture an intelligent civilization. Recall the Earth was entirely molten and bombarded by Mars size planetoids for the first ~billion years of its existence. I don't know why he would think this program helps his thesis.

Other problems: 1. Only four parameters are tested; he picked 'em, so I assume he picked the four out of the 20-31 odd that would best make his case. 2. He himself concedes the cosmological constant is an intractable problem that renders all universes within a margin of close to 10^120 into hard vacuum. 3. As has been mentioned already in this thread, GIGO: computer programs can be written that incorporate any number of biases.

hotshoe: the link was in xrayzed's post http://www.colorado.edu/philosophy/vste ... neTune.pdf
AMR
Name: Aaron Rizzio
Posts: 44
Country: USA

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#209  Postby hotshoe » Jul 23, 2010 8:04 pm

AMR wrote:Thanks for the links, tytalus. Stenger includes a plot of his universes in one of his papers (interestingly, plotted along the x-axis in logarithms). He manages to get 2 of his 100 universe scenarios to last as long as 1.5 billion years. Enough to synthesize a periodic table, sure, but not to nurture an intelligent civilization. Recall the Earth was entirely molten and bombarded by Mars size planetoids for the first ~billion years of its existence. I don't know why he would think this program helps his thesis.

Other problems: 1. Only four parameters are tested; he picked 'em, so I assume he picked the four out of the 20-31 odd that would best make his case. 2. He himself concedes the cosmological constant is an intractable problem that renders all universes within a margin of close to 10^120 into hard vacuum. 3. As has been mentioned already in this thread, GIGO: computer programs can be written that incorporate any number of biases.

hotshoe: the link was in xrayzed's post http://www.colorado.edu/philosophy/vste ... neTune.pdf


Right, thanks, we all agree which Stenger article we're discussing, we just disagree with your conclusion supposedly based on the article.
Now, when I talked to God I knew he'd understand
He said, "Stick by my side and I'll be your guiding hand
But don't ask me what I think of you
I might not give the answer that you want me to"
hotshoe
Posts: 3177
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#210  Postby tytalus » Jul 23, 2010 8:28 pm

Actually, I don't 'recall' this state of the Earth for a billion years after its formation. What data I can find on the subject indicates otherwise. But it is interesting to see this smooth shift from fine-tuning for 'life' to fine-tuning for 'intelligent civilization'.
Futurama wrote: Bender: Dying sucks butt. How do you living beings cope with mortality?
Leela: Violent outbursts.
Amy: General slutiness.
Fry: Thanks to denial, I'm immortal.
tytalus
Posts: 1228
Age: 51
Male
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#211  Postby xrayzed » Jul 27, 2010 3:48 am

AMR wrote:
xrayzed: So you've refuted Stenger's argument? I seem to have missed this. Perhaps you could point me towards your refutation.

then xrayzed says: As for your "critique" of Stenger - I'm amazed that you . . . either fail to understand his points, overlook key points, or simply make unsubstantiated assertions. That was remarkably feeble.

Look, xrayzed, you're the one who threw down Stenger as your refutation of my line of reasoning. I give you a paragraph on him, and you then proceed to bitch that I didn't give sufficient attention to Stenger's "argument" (the paper contains several arguments; which specific "argument" are you referring to? Did YOU even bother reading the paper you linked to? I doubt it). So I wade through this guy's paper, most of which I agree with substantially, point out some of the more obvious flaws in the man's main arguments, and you're again bitching at me.

So tell me what points did I overlook specifically?

How about "pretty much all of it".

Your response was basically head-nodding to a few points, a bit of standard handwaving, considerable misreading, a touch of bitching about "straw men", plus a fair dollop of gibberish and non-sequiturs. About the only point where you do try to deal with the actual content, you fuck it up.

AMR wrote:
(Stenger)
I have analyzed 100 universes in which the values of the four parameters were generated randomly from a range five orders of magnitude above to five orders of magnitude below their values in our universe, that is, over a total range of ten orders of magnitude (Stenger 1995, 2000). I have also examined the distribution of stellar lifetimes for these same 100 universes (Stenger 1995, 2000). While a few are low, most are probably high enough to allow time for stellar evolution and heavy element nucleosynthesis. Over half the universes have stars that live at least a billion years.

Curiously he doesn't name the parameters in this paper...


Wrong. He clearly mentions the parameters on the previous page.
(Stenger)
I have made a modest attempt to obtain some feeling for what a universe with different constants would be like. Press and Lightman (1983) have shown that the physical properties of matter, from the dimensions of atoms to the order of magnitude of the lengths of the day and year, can be estimated from the values of just four fundamental constants (this analysis is slightly different from Carr and Rees [1979 ]). Two of these constants are the strengths of the electromagnetic and strong nuclear interactions. The other two are the masses of the electron and proton.


And you accuse me of not reading the paper. Nice one. :lol:
A thinking creationist is an oxymoron. A non-thinking creationist is just a moron.
(Source: johannessiig, here)
xrayzed
Posts: 1053
Age: 64
Male
Jolly Roger (arr)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#212  Postby AMR » Jul 27, 2010 5:46 am

xrayzed wrote:
AMR wrote:
So tell me what points did I overlook specifically?

How about "pretty much all of it".

Your response was basically head-nodding to a few points, a bit of standard handwaving, considerable misreading, a touch of bitching about "straw men", plus a fair dollop of gibberish and non-sequiturs. About the only point where you do try to deal with the actual content, you fuck it up.

Wrong. He clearly mentions the parameters on the previous page.
(Stenger)
I have made a modest attempt to obtain some feeling for what a universe with different constants would be like. Press and Lightman (1983) have shown that the physical properties of matter, from the dimensions of atoms to the order of magnitude of the lengths of the day and year, can be estimated from the values of just four fundamental constants (this analysis is slightly different from Carr and Rees [1979 ]). Two of these constants are the strengths of the electromagnetic and strong nuclear interactions. The other two are the masses of the electron and proton.


And you accuse me of not reading the paper. Nice one. :lol:


And you, of course, miss the central point of my refutation of Stenger's MonkeyGod simulation effort. Stenger's own program (detailed in his paper NATURAL EXPLANATIONS FOR THE ANTHROPIC COINCIDENCES -- what you originally linked to is apparently a rough draft of that paper) only explores one aspect of a putative "randomly generated" universe: stellar lifetimes. Yet all stellar lifetimes estimated by his creaky program are inferior to those we happen to enjoy in our universe. Our sun is projected to remain on the main sequence for approximately 10 billion years. Stenger's stars, generated with parameters constrained, he says, within 10 orders of magnitude, only manage to last 1.5 billion years, and those were his best results, obtained in only 2 of his 100 simulations. If his contention, that a wide range of physical parameters would also result in a liveable universe, were correct, one would expect a range of stellar lifetimes, some similar to "our" universe's and some perhaps even longer, but ALL WERE INFERIOR, and by a wide margin. :smoke:

tytalus: In 2008 a team from McGill University found 4.28-billion-year-old rock in northern Quebec; the Earth's age is usually estimated to be 4.6 billion years, and the first life forms, primitive microfossils, are dated at about 3.5 billion years. The Cambrian explosion of biodiversity took place only about 550 million years ago, suggesting that the evolution of multicellular cephalization -- intelligence -- is even more difficult and requires more time than the emergence of life itself. As for "goalposts", how about any universe that wouldn't be rendered into hard vacuum within a few parts in 10^120? The whole discussion of stellar lifetimes is about a fifth-order concern once you get past the cosmological constant problem.
AMR
Name: Aaron Rizzio
Posts: 44
Country: USA

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#213  Postby tytalus » Jul 27, 2010 7:56 pm

AMR wrote:
tytalus: In 2008 a team from McGill University found 4.28-billion-year-old rock in northern Quebec; the Earth's age is usually estimated to be 4.6 billion years, and the first life forms, primitive microfossils, are dated at about 3.5 billion years. The Cambrian explosion of biodiversity took place only about 550 million years ago, suggesting that the evolution of multicellular cephalization -- intelligence -- is even more difficult and requires more time than the emergence of life itself. As for "goalposts", how about any universe that wouldn't be rendered into hard vacuum within a few parts in 10^120? The whole discussion of stellar lifetimes is about a fifth-order concern once you get past the cosmological constant problem.

That sounds like a nice find, that rock. Unfortunately it does not help your debunked claims about the state of the earth at that time. Recall:

Recall the Earth was entirely molten and bombarded by Mars size planetoids for the first ~billion years of its existence. I don't know why he would think this program helps his thesis.

So, the best you can do to justify your goalpost-moving is an attempted tu quoque? Well played. :)
Futurama wrote: Bender: Dying sucks butt. How do you living beings cope with mortality?
Leela: Violent outbursts.
Amy: General slutiness.
Fry: Thanks to denial, I'm immortal.
tytalus
Posts: 1228
Age: 51
Male
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#214  Postby AMR » Jul 27, 2010 10:17 pm

tytalus wrote:
That sounds like a nice find, that rock. Unfortunately it does not help your debunked claims about the state of the earth at that time. . . .
So, the best you can do to justify your goalpost-moving is an attempted tu quoque? Well played. :)

The Hadean lasted from 4.6 to 3.8 billion years ago. I round that to a billion (a ~1-billion-year time span), so sue me. :roll:
AMR
Name: Aaron Rizzio
Posts: 44
Country: USA

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#215  Postby tytalus » Jul 28, 2010 12:10 am

AMR wrote:Recall the Earth was entirely molten and bombarded by Mars size planetoids for the first ~billion years of its existence. I don't know why he would think this program helps his thesis.

:awesome:
Futurama wrote: Bender: Dying sucks butt. How do you living beings cope with mortality?
Leela: Violent outbursts.
Amy: General slutiness.
Fry: Thanks to denial, I'm immortal.
tytalus
Posts: 1228
Age: 51
Male
United States (us)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#216  Postby Oldskeptic » Jul 28, 2010 1:39 am

AMR wrote:

I find that long-lived stars that could make life more likely will occur over a wide range of these parameters. For example, if we take the electron and proton masses to be equal to their values in our universe, an electromagnetic force strength having any value greater than its value in our universe will give a stellar lifetime of more than 680 million years. If we had an electron mass 100,000 times lower, the proton mass could be as much as 1,000 times lower to achieve the same minimum stellar lifetime. This is hardly fine-tuning. Many more constants are needed to fill in the details of our universe.


AMR wrote:
680 million years is hardly sufficient for single-celled life to develop.


I have analyzed 100 universes in which the values of the four parameters were generated randomly from a range five orders of magnitude above to five orders of magnitude below their values in our universe, that is, over a total range of ten orders of magnitude (Stenger 1995, 2000). I have also examined the distribution of stellar lifetimes for these same 100 universes (Stenger 1995, 2000). While a few are low, most are probably high enough to allow time for stellar evolution and heavy element nucleosynthesis. Over half the universes have stars that live at least a billion years.


AMR wrote:
Curiously he doesn't name the parameters in this paper. If one of them is the ratio of gravity to electromagnetism, ~1:10^36, even five orders of magnitude may not be that significant; and yet still he gets stars to shine just a billion years, which would be insufficient for multi-cellular life to emerge, let alone a civilization.


One gross and fatal assumption is that only one kind of life, ours, is conceivable in every conceivable configuration of universes. However, a wide variation of constants of physics leads to universes that are long-lived enough for life to evolve, although human life need not exist in such universes. Although not required to negate the fine-tuning argument, which falls of its own weight, other universes besides our own are not ruled out by fundamental physics and cosmology. The theory of a multiverse composed of many universes with different laws and physical properties is actually more parsimonious, more consistent with Occam's razor, than a single universe.


AMR wrote:
I kid you not, these were Stenger's best arguments from probably the most cited paper on the internet critical of fine tuning arguments -- and I agree with probably half of it. 1. No form of life could emerge out of hard vacuum, which is what the universe would be within a margin of a few parts in 10^120. 2. In 100 tries he could only get half his universes' stars to last a billion years, which would be insufficient for the emergence and evolution of life and the development of civilization.


This concerns the 10^120 part above:

There is a problem with the cosmological constant: on one hand it is predicted to be very large, but on the other it is measured to be very small. And guess what this problem is called? It is called a "fine tuning problem", but not for the reasons that you might think. In science, there is a problem when any factor in a hypothesis/theory needs to be finely tuned to make the hypothesis/theory work. It is especially a problem when different hypotheses/theories need to tune that factor in different ways, as with the cosmological constant, where quantum theory predicts its value to be high while cosmological measurements of the accelerating expansion of the universe find it to be very low.

What this scientific fine tuning problem indicates is that the hypothesis/theory is either wrong or that there is something that someone hasn't figured out yet. And since the standard model of quantum mechanics as a theory has the proven track record of making astonishingly accurate predictions, and dark energy is an educated hypothesis, I will go with QM for now.

Stenger has pointed out, as have others, that there are alternatives to the cosmological constant concerning expansion. There is not really an alternative to the standard model of QM.

So your constant repetition of the number 10^120 is not the magic bullet that you think it is. This number is smack dab in the middle of a raging scientific debate and the verdict is not in.
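
For readers wondering where that 10^120 figure comes from: it is roughly the ratio between the naive quantum-field-theoretic estimate of the vacuum energy density, cut off at the Planck scale, and the density inferred from the observed acceleration. A back-of-the-envelope Python sketch, using round order-of-magnitude textbook values rather than anything from Stenger:

from math import log10

planck_cutoff_density = 1e113   # J/m^3, naive QFT vacuum energy (Planck cutoff)
observed_density = 1e-9         # J/m^3, dark-energy density from cosmology
print(f"mismatch ~ 10^{log10(planck_cutoff_density / observed_density):.0f}")
# prints: mismatch ~ 10^122, the infamous "10^120 problem"
# (give or take a couple of orders of magnitude, depending on the cutoff)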

Now for the rest:
You seem to think that the evolution of intelligent life depends on stars that live at least as long as ours has, and you are probably correct in one aspect, given that the only intelligent life that we know of revolves around a sun like ours. But all life similar to ours, no matter where it appears, would be at least as dependent, or more so, on extremely large stars such as Population III stars and super-giants that tend to go supernova and spread heavy elements far and wide within 30,000,000 years of formation.

The thing is that all of the elements heavier than hydrogen, helium and lithium, and lighter than iron, were created within extremely massive and short-lived stars that went boom, and those elements heavier than iron were created in the concussion wave of the supernova.

These massive early Population III stars contained no elements as heavy as iron, in fact no heavy elements at all until they manufactured them. They are called metal-free stars, and because of the lack of heavy elements at the beginning they were very large and very hot, and went through their stellar evolution very quickly. Then they exploded and spread their components far and wide. Second generation stars formed from the debris, and the process continued with stars less large but still super-giants that formed more heavy elements and went boom within 30,000,000 years. These are the metal-poor, Population II stars.

The fact is that the more metal contained in a star, the smaller it will be and the lower its temperature. This leads to smaller stars like our sun lasting much longer, and never going supernova.

A star lasting 680,000,000 years is short-lived by comparison to the one that we depend on, but it is long-lived for the types of stars that produce stars like ours, and planets.

I would also like to point out that Stenger said “I have also examined the distribution of stellar lifetimes for these same 100 universes (Stenger 1995, 2000). While a few are low, most are probably high enough to allow time for stellar evolution and heavy element nucleosynthesis. Over half the universes have stars that live at least a billion years.”


A billion years seems to be his lower bound, not a limit. And I am not clear on which type of stars he is talking about. If it is the early Population III stars and the later super-giants, then a billion years is more than sufficient for any universe to form longer-lived stars, and with them planets where life could evolve.
There is nothing so absurd that some philosopher will not say it - Cicero.

Traditionally these are questions for philosophy, but philosophy is dead - Stephen Hawking
Oldskeptic
Posts: 7395
Age: 66
Male

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#217  Postby hackenslash » Jul 28, 2010 2:04 am

AMR wrote:Interesting talk by Krauss, years ago I enjoyed his Physics of Star Trek, thanks for the link.

But of course a cosmos in which "nothing is unstable" presupposes something doesn't it? Namely a cosmos in which nothing is unstable. . . which is, in fact, something. Touché.


Errr, no. That nothing is unstable (actually, it's worse than that, it's impossible) is a proven fact, and stems from one of our most successful and accurate scientific principles. The uncertainty principle isn't a thought experiment, it's a categorical feature of the universe, and beyond any serious questioning.

As for the OP, and the intervening arguments, I agree completely with JustATheory, but I also have other issues with it, and with Dawkins' argument as well, and it stems from use of the word 'complex'. Dawkins is actually misusing the word in his argument. Certainly, Kolmogorov complexity is a useful concept in information theory, but it mustn't be confused with complexity which, in this instance, is actually being treated as synonymous with complicatedness, when synonymous it isn't. Complexity describes emergence. Complicatedness describes the interdependence of many parts. As an example, a car is complicated. The behaviour exhibited by a car and a human combined is complex. How this applies to a deity (should such a ludicrous entity actually exist) is unclear, and it can't be stated which term would apply until such time as a deity is observed to make the distinction.
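
One way to see the distinction: compressed size is a crude, computable upper bound on Kolmogorov complexity, and by that measure pure random noise scores as maximally "complex" while exhibiting no organisation or emergence whatsoever. A small illustrative Python sketch (assuming nothing beyond the standard zlib and os modules):

import os
import zlib

def approx_k(data):
    # Compressed size in bytes: a crude upper bound on Kolmogorov complexity
    return len(zlib.compress(data, 9))

ordered = b"AB" * 5000      # 10,000 bytes of rigid repetition
noise = os.urandom(10000)   # 10,000 bytes of random noise
print(approx_k(ordered))    # a few dozen bytes: highly compressible
print(approx_k(noise))      # ~10,000 bytes: essentially incompressible
# The noise is maximally "complex" in the Kolmogorov sense, yet exhibits
# no emergence at all, which is why the two senses must not be conflated.
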
hackenslash
Name: The Other Sweary One
Posts: 22910
Age: 53
Male
Country: Republic of Mancunia

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#218  Postby Oldskeptic » Jul 28, 2010 2:23 am

AMR wrote:
Recall the Earth was entirely molten and bombarded by Mars size planetoids for the first ~billion years of its existence.


What are you talking about? Earth's crust had solidified within about 10,000,000 years, and if even one Mars-size asteroid had hit Earth it would have been all over right then. Mars is over half the size of Earth, and given this mass and the kinetic energy involved, both Earth and this Mars-size asteroid would have been scattered into tiny pieces.

The largest asteroid known is Ceres, with a radius of 487 kilometers. Mars has a radius of 3,396 km. If Earth was bombarded by Mars-size asteroids then Earth must have used them all up, and all of those in between, in the process.

Where do you get this stuff? Certainly not from reading anything that is actually based on valid science that you actually understand.

What Earth was most probably bombarded with were comets, which are composed of something called ice, which when melted becomes liquid H2O, commonly called water. That is why we have things called oceans and lakes and rivers and clouds.
There is nothing so absurd that some philosopher will not say it - Cicero.

Traditionally these are questions for philosophy, but philosophy is dead - Stephen Hawking
Oldskeptic
Posts: 7395
Age: 66
Male

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#219  Postby xrayzed » Jul 28, 2010 2:45 am

AMR wrote:And you, of course, miss the central point of my refutation of Stenger's MonkeyGod simulation effort. Stenger's own program (detailed in his paper NATURAL EXPLANATIONS FOR THE ANTHROPIC COINCIDENCES -- what you originally linked to is apparently a rough draft of that paper) only explores one aspect of a putative "randomly generated" universe: stellar lifetimes...

Nice failure to admit you screwed up. Whatever. :cheers:

Stenger's randomly generated universes didn't include one with stars that last as long as the ones in our universe. So? Even accepting that those are the best possible cases (out of a sample of 100, I'd suggest it isn't likely), this only matters if you can show that the time it took life to evolve on Earth is the absolute minimum time required for life to evolve in all possible universes. Good luck with that.

Of course this all presumes that the parameters in question could have been different. Let us know when you've confirmed that as well.
A thinking creationist is an oxymoron. A non-thinking creationist is just a moron.
(Source: johannessiig, here)
xrayzed
Posts: 1053
Age: 64
Male
Jolly Roger (arr)

Re: Information Theory, Complexity, & Dawkins' 747 (help?)

#220  Postby AMR » Jul 30, 2010 12:05 am

Oldskeptic
. . . In science, there is a problem when any factor in a hypothesis/theory needs to be finely tuned to make the hypothesis/theory work. It is especially a problem when different hypotheses/theories need to tune that factor in different ways, as with the cosmological constant, where quantum theory predicts its value to be high while cosmological measurements of the accelerating expansion of the universe find it to be very low.

What this scientific fine tuning problem indicates is that the hypothesis/theory is either wrong or that there is something that someone hasn't figured out yet. . . So your constant repetition of the number 10^120 is not the magic bullet that you think it is. This number is smack dab in the middle of a raging scientific debate and the verdict is not in.


Certainly science is an ongoing dialectical process; there are no settled canons in science, so in principle anything is subject to reappraisal should new evidence emerge or new theories be developed. Your view, that there must be something accounting for the appearance that our universe was poised on a knife-edge between void and singularity, is shared by Steven Weinberg, but he also rejects a possible solution Stenger points to, the theory of "Quintessence":
The Cosmological Constant Problems (PDF)

A billion years seems to be his lower bound, not a limit. And I am not clear on which type of stars he is talking about.
Neither am I; Stenger should make it clear which population of stars he is talking about. Actually the formulae he is plugging into his program make no account of metallicity; it's meant to be just a rough first approximation. But if the universe were as flexible as Stenger claims, why do none of the simulations approach or exceed what we observe? I think this is why he buries his results behind the base-10 logs.

It should also be pointed out that there is a very broad range of stellar lifetimes in our universe: blue giants can burn through their cores in a few hundred million years and red dwarfs can burn for 100 billion years.

t_s = (α^2/α_G) (m_p/m_e)^2 ħ (m_p c^2)^-1
The four variable parameters of his program simulation are:
α, the electromagnetic interaction strength e^2/ħc
α_s, the strong nuclear interaction strength at low energy
m_e, the mass of the electron
m_p, the mass of the proton
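
A quick sanity check of the formula as reconstructed above, taking α_G = Gm_p^2/ħc as the usual gravitational coupling for protons: plugging in the accepted values for our universe gives roughly 6.8 x 10^8 years, matching the 680-million-year figure Stenger quotes, and a crude log-uniform Monte Carlo over ±5 orders of magnitude reproduces the flavour of his experiment. The Python sketch below is purely illustrative, not Stenger's MonkeyGod code (note that α_s does not enter this particular lifetime estimate):

import random

hbar = 1.0546e-34     # J s
c = 2.998e8           # m/s
G = 6.674e-11         # m^3 kg^-1 s^-2
m_e0 = 9.109e-31      # kg, electron mass in our universe
m_p0 = 1.6726e-27     # kg, proton mass in our universe
alpha0 = 1 / 137.036  # electromagnetic coupling in our universe
YEAR = 3.156e7        # seconds

def stellar_lifetime(alpha, m_e, m_p):
    # t_s = (alpha^2/alpha_G)(m_p/m_e)^2 hbar (m_p c^2)^-1
    alpha_G = G * m_p**2 / (hbar * c)
    return (alpha**2 / alpha_G) * (m_p / m_e)**2 * hbar / (m_p * c**2)

print(f"our universe: {stellar_lifetime(alpha0, m_e0, m_p0) / YEAR:.2e} yr")
# ~6.8e8 yr, i.e. Stenger's 680-million-year minimum

# Vary each parameter log-uniformly over +/- 5 orders of magnitude:
random.seed(1)
lifetimes = [stellar_lifetime(alpha0 * 10**random.uniform(-5, 5),
                              m_e0 * 10**random.uniform(-5, 5),
                              m_p0 * 10**random.uniform(-5, 5)) / YEAR
             for _ in range(100)]
print(sum(t > 1e9 for t in lifetimes), "of 100 universes have stars beyond 1 Gyr")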

Oldskeptic: Where do you get this stuff? Certainly not from reading anything that is actually based on valid science that you actually understand.
I'm not making this stuff up, really . . . .
http://en.wikipedia.org/wiki/History_of_Earth#The_giant_impact_hypothesis
(New evidence suggests the Moon formed even later, 4.48±0.02 Ga, or 70–110 Ma after the start of the Solar System) The Moon has a bulk composition closely resembling the Earth's mantle and crust together, without the Earth's core. This has led to the giant impact hypothesis, the idea that the Moon was formed during a giant impact of the proto-Earth with another protoplanet by accretion of the material blown off the mantles of the proto-Earth and impactor.
The impactor, sometimes named Theia, is thought to have been a little smaller than the current planet Mars.


hackenslash
Errr, no. That nothing is unstable (actually, it's worse than that, it's impossible) is a proven fact, and stems from one of our most successful and accurate scientific principles. The uncertainty principle isn't a thought experiment, it's a categorical feature of the universe, and beyond any serious questioning.

As I understand classical QM, the uncertainty principle requires observers; what observers were there in the beginning of the universe? (I also understand that interpretation is controversial, but then again so is your explanation above ["beyond any serious questioning"?]) But why should such a thing as Heisenberg uncertainty exist in the first place? What is more likely: a roiling probabilistic mass/energy relationship that gives birth to a finely-tuned organic existence, or just simply nothing? And why should it generate a thing as improbable as our universe?
AMR
Name: Aaron Rizzio
Posts: 44
Country: USA
