#204
by Calilasseia » Jul 23, 2010 5:04 pm
I see our temporarily suspended supernaturalist has erected the fatuous "information is not a physical entity, therefore information is magic, therefore it needs a magic man" apologetic excrement. Oh dear. Time to reprise this.
Information is nothing more than the observational data available with respect to the current state of a system of interest. That is IT. Two rigorous mathematical treatments of information, namely Shannon's treatment and the treatment by Kolmogorov & Chaitin, are predicated on this fundamental notion. Indeed, when Claude Shannon wrote his seminal 1948 paper on information transmission, he explicitly removed ascribed meaning from that treatment, because ascribed meaning was wholly irrelevant to the analysis of the behaviour of information in a real world system. Allow me to present a nice example from the world of computer programming.
Here is a string of data (written as hexadecimal bytes):
81 16 00 2A FF 00
Now, to an 8086 processor, this string of bytes codes for a single 8086 machine language instruction, namely:
ADC [2A00H], 00FFH
which adds the immediate value 00FFH (255 decimal), plus the current contents of the carry flag (ADC is "ADd with Carry"), to whatever value is currently stored at the 16-bit memory location addressed by DS:2A00H (note that 8086 processors use segmented memory addressing, with DS implied as the default segment register unless the base address is of the form [BP+disp], in which case the default segment register is SS).
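What the 8086 does here is a purely mechanical arithmetic operation, which can be sketched in a few lines of Python (the initial memory contents below are a hypothetical illustration, not anything implied by the byte stream itself):

```python
def adc16(memory, address, immediate, carry_in):
    """8086-style ADC on a 16-bit memory operand: dest = dest + imm + carry."""
    total = memory[address] + immediate + carry_in
    memory[address] = total & 0xFFFF          # result wraps at 16 bits
    return 1 if total > 0xFFFF else 0         # new value of the carry flag

memory = {0x2A00: 0x1234}                     # hypothetical contents of DS:2A00H
carry = adc16(memory, 0x2A00, 0x00FF, 0)      # ADC [2A00H], 00FFH with CF = 0
print(hex(memory[0x2A00]), carry)             # 0x1333 0
```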
However, on an older, 8-bit 6502 processor, the above sequence codes for multiple instructions, namely the following sequence:
STA ($16,X)
BRK
ROL A
.BYTE $FF, $00
The first of these instructions takes the operand $16, adds to it the contents of the X register (8-bit addition only), and uses that computed address (call this N) as an index into the first 256 bytes of memory (page zero). The contents of address N and address N+1 together then constitute a 16-bit pointer into another location in memory, and the contents of accumulator A are stored at that location (STA stands for STore Accumulator). The second instruction, BRK, is a breakpoint instruction, and performs a complex sequence of operations. First, it takes the current value of the program counter (PC), which is now pointing at the BRK instruction, adds 2 to that value, and pushes it onto the stack (2 bytes are therefore pushed). It then pushes the contents of the processor status register P. Then, it loads the contents of the memory locations $FFFE and $FFFF (the top 2 locations in the 6502 address space) into the program counter and continues execution from there. The top end of memory in a 6502 system typically consists of ROM, and the hard-coded value stored in locations $FFFE/$FFFF is typically a vector to a breakpoint debugging routine in ROM, but that's an implementation dependent feature, and the exact contents of $FFFE/$FFFF vary accordingly from system to system. The third instruction, ROL A, rotates the contents of accumulator A one bit position left through the carry flag. The final two bytes, $FF and $00, begin yet another instruction, but $FF is not a documented 6502 opcode, so a disassembler simply lists those bytes as data; what an actual NMOS 6502 chip does with them is an undocumented quirk of the silicon.
To make matters even more interesting, the bytes also have meaning to a Motorola 6809 processor, viz:
CMPA #$16
NEG <$2A
STU ??
The first instruction is "compare accumulator A with the value $16 (22 decimal)". This performs an implicit subtraction of the operand $16 from the current contents of accumulator A, sets the condition codes (CC) according to whether the result is positive, zero or negative (and also sets other bits allowing more intricate comparisons to be made), but discards the actual result of the subtraction. The next instruction, NEG <$2A, uses direct page addressing: the 16-bit effective address is formed from the contents of the direct page register (DP) as the high byte and $2A as the low byte, and the contents of that location are negated (so that a value of +8 becomes -8 and vice versa, assuming 2's complement storage). The final instruction is incomplete, hence the ?? operand: the opcode $FF is STU with extended addressing, which needs two following bytes to specify the 16-bit memory address at which to store the contents of the 16-bit U register. Only the first of those bytes ($00) is present, so whatever byte follows our 6-byte stream will complete the address operand for this STU instruction.
Now, that's ONE stream of bytes, which has THREE different meanings for three different processors. Therefore ascribing meaning to the byte stream as part of the process of analysing transmission of the data is erroneous. Meaning only becomes important once the data has been transmitted and received, and the receiver decides to put that data to use. If we have three different computers receiving this 6-byte stream from appropriate sources, then the Shannon information content of each bit stream is identical, but our three different computers will ascribe totally different meanings to the byte stream, if they are regarded as part of a program instruction sequence. An 8086-based computer will regard the byte stream as a single ADC instruction, the 6502-based computer will regard it as a STA, BRK, ROL sequence with two left-over bytes, and the 6809-based computer will regard it as a CMPA, NEG, STU sequence (and the latter will demand one more byte to be transmitted in order to complete the last instruction).
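The BRK sequence described above is mechanical enough to sketch in a few lines of Python (the vector contents, starting PC and status value below are hypothetical illustrations, and the detail that BRK also sets the B and I flags is omitted here):

```python
# Sketch of the 6502 BRK sequence: push PC+2, push P, jump via $FFFE/$FFFF.
memory = {0xFFFE: 0x00, 0xFFFF: 0xE0}   # vector $E000, stored low byte first
stack = []                              # stands in for the page-one stack

pc = 0x0203                             # hypothetical address of the BRK opcode
p = 0b00110000                          # hypothetical processor status register P

ret = pc + 2                            # the 6502 pushes the PC of BRK plus two
stack.append((ret >> 8) & 0xFF)         # high byte pushed first
stack.append(ret & 0xFF)                # then the low byte
stack.append(p)                         # then the status register

pc = memory[0xFFFE] | (memory[0xFFFF] << 8)   # load the BRK/IRQ vector
print(hex(pc))                          # 0xe000
```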
Consequently, ascribed meaning is wholly irrelevant to the rigorous treatment of information. Creationists routinely introduce the error of assuming a priori that "information" and "ascribed meaning" are synonymous, which the above example refutes wholesale (along with thousands of others that could be posted if I had the time). Of course, creationists conflate information with ascribed meaning deliberately, because they seek to expound the view that information is a magic entity, and therefore requires an invisible magic man in order to come into existence. This is complete rot, as the Shannon and Kolmogorov/Chaitin analyses of information demonstrate readily, not to mention Turing's large body of work with respect to information. All that matters, at bottom, is that the entities and interactions applicable to a given system of interest produce different results when applied to different states of that system. Information sensu stricto, namely the observational data available with respect to the current state of a system, only becomes "meaningful" when different states lead to different outcomes during appropriate interactions applicable to the system, and the only "meaning" that matters, at bottom, is what outcomes result from those different system states, which in the case of the computer data above, differs from system to system.
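Both measures can be made concrete in a few lines of Python. Shannon's measure depends only on the symbol statistics of the stream, never on what any receiver does with it; for the Kolmogorov/Chaitin notion (which is uncomputable in general), a real compressor serves as a crude stand-in, since compressed length bounds how briefly the data can be described:

```python
import hashlib
import zlib
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Shannon entropy in bits per symbol, computed from symbol frequencies alone."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def compressed_size(data):
    """zlib-compressed length: a crude, computable stand-in for the
    (uncomputable) Kolmogorov/Chaitin descriptive complexity."""
    return len(zlib.compress(data, 9))

stream = bytes([0x81, 0x16, 0x00, 0x2A, 0xFF, 0x00])
print(shannon_entropy(stream))   # same value whichever processor receives the bytes

# An ordered stream compresses far better than a patternless one of equal length:
repetitive = b'AB' * 512
scrambled = b''.join(hashlib.sha256(bytes([i])).digest() for i in range(32))
print(compressed_size(repetitive) < compressed_size(scrambled))   # True
```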
As a corollary of the above, all that matters, in a rigorous treatment, is the following:
[1] What states a physical system of interest can occupy;
[2] What interactions take place as a result of these physical states.
Indeed, this is the entire basis upon which Turing machines are founded. Turing machines consist, at their core, of a symbol state table, each element of which is indexed as a two-dimensional array by the symbol being read, and the current state of the machine performing the reading. Each of those elements contains the following:
[1] What symbol to replace the current symbol with at the current reading position;
[2] What movement to perform in order to change the reading position and acquire the next symbol;
[3] What state the machine should adopt after performing the above operations.
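The symbol-state table just described can be embodied in a few lines of Python. The machine below is a hypothetical example of my own devising, not one drawn from Turing's papers: it flips every bit on its tape and halts at the first blank, purely by blind table lookup.

```python
# A minimal Turing machine: a state table indexed by (state, symbol read).
# Each entry gives (symbol to write, head movement, next state).
TABLE = {
    ('flip', '0'): ('1', +1, 'flip'),
    ('flip', '1'): ('0', +1, 'flip'),
    ('flip', ' '): (' ',  0, 'halt'),   # blank cell: stop
}

def run(tape, state='flip', head=0):
    tape = list(tape) + [' ']           # a blank cell terminates the tape
    while state != 'halt':
        write, move, state = TABLE[(state, tape[head])]
        tape[head] = write              # [1] replace the current symbol
        head += move                    # [2] move the reading position
    return ''.join(tape).rstrip()       # [3] new state was set in the lookup

print(run('10110'))   # 01001
```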
This blind, mechanical, deterministic system is the foundation upon which modern computing is built. ALL digital processors are, in effect, electronic embodiments of Turing machines, and they are manifestly physical entities - physical entities, moreover, that are capable of processing information at high speed. At bottom, all that they are doing is manipulating the flow of electrons within their semiconductor substrates, yet this manipulation of electrons allows us to perform such tasks as computing fast Fourier transforms, running simulations of everything from weather systems to military aircraft, and, for those familiar with the Isabelle suite of software, checking mathematical proofs for consistency and rigour. No magic was required to put Isabelle together: all that was required was for sufficiently informed humans to work out what processes would result in the relevant end product, and to instantiate that as a huge collection of bits, once again stored in an appropriate medium.

The point here being that if you change the physical arrangement of that medium, you change its information content. The physical arrangement of any storage medium determines the information content of that medium. What another system does with that content, of course, is again a function of its physical constitution, and the interactions permitted by that physical constitution, which in turn leads to the conclusion that ascribed meaning is nothing more than the particular set of interactions arising from the physical structure of the entity ascribing the meaning. Therefore creationist attempts to conflate ascribed meaning with information fail not only for lack of rigour, but also because, at bottom, ascribed meaning itself can be traced to a physical basis with respect, at least, to Turing machines and any entity that can be modelled by them.
Now, Perry Marshall (a creationist who is fond of specious "information" pseudo-arguments) erects the bogus argument that DNA is a "code". This IS bogus. DNA is simply an organic molecule that is capable of existing in a large number of states, each of which results in a different outcome with respect to the chemical interactions that the molecule takes part in. Because it can exist in a large number of states, because those states are all associated with specific, systematic interactions (such as the production of a particular protein after transcription), and because those states are coupled to those systematic and well-defined interactions in a largely one-to-one manner (for the time being, I'll leave to one side complications such as selenocysteine, which were afterthoughts grafted onto the original system), they can be treated in an information-theoretic manner as if they constituted a "code", because doing so simplifies our understanding of those systematic interactions, and facilitates further detailed analysis of that system. That, again, is IT. The idea that DNA constitutes a "code" intrinsically is merely a baseless creationist assertion resulting from deliberate apologetic misrepresentation of the code analogy, a misrepresentation that is itself then subject to rampant discursive misuse, because the argument erected consists of:
[1] DNA is a code (unsupported baseless assertion);
[2] All codes are produced by "intelligence" (deliberately omitting the fact that the only "intelligence" we have evidence of that produces codes is human intelligence);
[3] Therefore an "intelligence" produced DNA (the inference being that this "intelligence" is supernatural, which doesn't even arise as a corollary from [2] when one factors in the omitted detail, that the only "intelligence" we have evidence for as a code producer is human, and therefore natural, intelligence).
This argument is fatuous as it stands, even without factoring in extra scientific knowledge that has been acquired in relatively recent times, but when we do factor in this knowledge, it becomes absurd to the Nth degree. That scientific knowledge consists of at least eleven scientific papers demonstrating that the "genetic code" is itself an EVOLVABLE ENTITY. The eleven papers in my collection are:
A Co-Evolution Theory Of The Genetic Code by J. Tze-Fei Wong, Proceedings of the National Academy of Sciences of the USA, 72(5): 1909-1912 (May 1975)
A Mechanism For The Association Of Amino Acids With Their Codons And The Origin Of The Genetic Code by Shelley D. Copley, Eric Smith and Harold J. Morowitz, Proceedings of the National Academy of Sciences of the USA, 102(12): 4442-4447 (22nd March 2005)
Collective Evolution And The Genetic Code by Kalin Vetsigian, Carl Woese and Nigel Goldenfeld, Proceedings of the National Academy of Sciences of the USA, 103(28): 10696-10701 (11th July 2006)
Emergence Of A Code In The Polymerisation Of Amino Acids Along RNA Templates by Jean Lehmann, Michel Cibils and Albert Libchaber, Public Library of Science One, 4(6): e5773 (DOI: 10.1371/journal.pone.0005773, 3rd June 2009)
Evolution Of Amino Acid Frequencies In Proteins Over Deep Time: Inferred Order Of Introduction Of Amino Acids Into The Genetic Code by Dawn J. Brooks, Jacques R. Fresco, Arthur M. Lesk and Mona Singh, Molecular Biology and Evolution, 19(10): 1645-1655 (2002)
Evolution Of The Genetic Code: Partial Optimisation Of A Random Code For Robustness To Translation Error In A Rugged Fitness Landscape by Artem S. Novozhilov, Yuri I. Wolf and Eugene V. Koonin, Biology Direct, 2: 24 (23rd October 2007)
Importance Of Compartment Formation For A Self-Encoding System by Tomoaki Matsuura, Muneyoshi Yamaguchi, Elizabeth P. Ko-Mitamura, Yasufumi Shima, Itaru Urabe and Tetsuya Yomo, Proceedings of the National Academy of Sciences of the USA, 99(11): 7514-7517 (28th May 2002)
On The Origin Of The Genetic Code: Signatures Of Its Primordial Complementarity In tRNAs And Aminoacyl-tRNA Synthetases by S. N. Rodin and A. S. Rodin, Heredity, 100: 341-355 (5th March 2008)
Recent Evidence For Evolution Of The Genetic Code by Syozo Osawa, Thomas H. Jukes, Kimitsuna Watanabe and Akira Muto, Microbiological Reviews, 56(1): 229-264 (March 1992)
Rewiring The Keyboard: Evolvability Of The Genetic Code by Robin D. Knight, Stephen J. Freeland and Laura F. Landweber, Nature Reviews Genetics, 2: 49-58 (January 2001)
A Simple Model Based On Mutation And Selection Explains Trends In Codon And Amino-Acid Usage And GC Composition Within And Across Genomes by Robin D. Knight, Stephen J. Freeland and Laura F. Landweber, Genome Biology, 2(4): research0010.1–0010.13 (22nd March 2001)
This collection of papers is incomplete, as more have been published in the relevant journals since I compiled this list.
So, since we have peer reviewed scientific papers demonstrating that the "genetic code" is itself an evolvable entity, and indeed, since scientists have published experimental work investigating the behaviour of alternative genetic codes arising from this research, the idea that an invisible magic man was needed for this is recrudescently nonsensical.
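To be clear about the limited sense in which the code analogy earns its keep, as described earlier: treating the codon assignments as a lookup table simplifies the modelling of translation, nothing more. A Python sketch, using a handful of entries from the standard codon table (the assignments shown are standard biochemistry, but the function itself is purely an illustrative toy):

```python
# A small subset of the standard codon-to-amino-acid assignments.
CODON_TABLE = {
    'AUG': 'Met',   # also the usual start codon
    'UUU': 'Phe', 'UUC': 'Phe',
    'GGU': 'Gly', 'GGC': 'Gly', 'GGA': 'Gly', 'GGG': 'Gly',
    'UAA': 'STOP', 'UAG': 'STOP',
    'UGA': 'STOP',  # the codon recruited for selenocysteine in special contexts
}

def translate(mrna):
    """Model translation as pure table lookup, codon by codon, until a stop."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == 'STOP':
            break
        peptide.append(residue)
    return peptide

print(translate('AUGUUUGGAUAA'))   # ['Met', 'Phe', 'Gly']
```

The "code" here is nothing but the dictionary: change the physical couplings the dictionary models, as the papers above show evolution can, and the "code" changes with them.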
I think this covers relevant important bases.
Signature temporarily on hold until I can find a reliable image host ...