Discussion from "Calilasseia - CREATIONISTS-READ THIS"

Incl. intelligent design, belief in divine creation


Re: Calilasseia - CREATIONISTS-READ THIS

#201  Postby Царь Славян » Dec 16, 2010 11:39 am

I can see you won gold at the olympics of bullshit. Speaking of basics... we'll have a look at your human+horse reproduction below. Until that, please provide empirical evidence of a mechanism that would prevent RM+NS from producing the results attained in the laboratory. I'll be waiting.
Depends on what results you want to get.

The level of moronity in this response is staggering.
Please elaborate on this objection. Are you honestly telling me that you think demonstrating common ancestry by molecular phylogeny is contingent on inter-species reproduction? That's the most absurd strawman of evolutionary postulates I have EVER heard. I'd like to see you explain that, in detail.
I told you already. If two different species have never been known to interbreed, saying that they could because they are similar means nothing.

Provide citations to demonstrate that this should be assumed to be the case per definition.
Why shouldn't it be? That's the best current explanation. Do you have a better one?

It doesn't follow from this that all ncDNA is or should be assumed to have function, it just means it can. You seem to have a tendency to draw vast and premature conclusions.
No, obviously not all of them do. Some do, some don't. I simply used this example to show you that this is one possible explanation.

Well basically yes, it is. The gene changed and selection had the opportunity to act upon the change. Get over it.
But the mutation was caused by a mechanism. It wasn't simply a random copying mistake.

Another blind assertion. Can you even do anything else at all? Going to start citing real science at some point? (Dumbski's made up blind assertions don't constitute science either).
No, it's a fact. Do you know what genetic redundancy is?

For which you would need empirical evidence. Got any? No?
Evidence for what? We aren't talking about a specific case. I'm talking in general. If evolution is guided by an intelligent agent, then that's called intelligent design.

It is a rotary motor. So?
Yes, and all rotary motors we know of are designed. So is the molecular one, right?

Demonstrate by citation in the peer reviewed literature, empirical evidence that nature would not be capable of doing the same selection. Or just shut up with your bullshit.
Stop evading what I said. I said that what scientists did in the lab, is called guided evolution and is a form of intelligent design.

I don't care about what you think is reasonable or not. Your opinions and spurious probability calculations count for literally nothing. NOTHING. What matters is only what you can demonstrate by empirical observation and experiment. Get to work or shut up.
So, for the 50th time, provide empirical evidence of a mechanism that would prevent accumulation of mutations over time or shut it.
If you don't show me the relevant probabilistic resources you can't claim that evolution can do anything.

Cars or computers don't reproduce themselves imperfectly. That's why. Until they do, your analogy will continue to be invalid no matter how you try to obfuscate it with fancy rhetoric.

Organisms reproduce through imperfect replication of genetic material. Until you find entities that do the same, shut up with your shitty and unfitting analogies.
But the difference above does not imply that organisms were not designed.

Only when you start by demonstrating where in chemistry a phenomenon defies the laws of physics. If you can do that, we can start having a conversation. Go ahead... a Nobel Prize awaits you.
I haven't seen one.

Except that the following is a direct claim that evolution can not explain the origin of large amounts of FSC

"But evolution does not account for the origin of large amounts of FSC in living organisms. It can only account for small amounts." - Царь Славян

So I ask you again, what mechanism is preventing evolution from accounting for this? Provide citations in the peer reviewed literature of empirical research or retract your claim.
Oh that! Hell yeah, a lack of probabilistic resources would do that. Under the model that Abel has produced evolution is severely limited. But you don't have to use his model, please provide a better one.

Imperfect replication of genetic material is causing accumulation of changes in allele frequencies in populations over generations. This is the basis for molecular phylogenetics. Provide evidence of a mechanism that would prevent such changes from accumulating over deep time, or retract your claim.
But this only works for species that have been known to reproduce. If two different species have never been known to reproduce, then their similarity means nothing.

The limit is time in relation to mutation rates. But that limit is extremely high. Frameshifts and shufflings have incredible potential for the production of novel functional sequence. Any intelligent person can see this. Since I don't think you are stupid, it logically follows that you are willfully dishonest.
OK, how high? I want a number. Show me the model. Show me the relevant probabilistic resources.
You said time. OK, how much time did evolution have? You also said mutations. OK, so how many mutations could have occurred in all the time evolution had?

Protein function is contingent on (almost) impossible to predict environmental circumstances. What might be a nonfunctional sequence now may be highly desirable later. This elementary fact renders your specious calculation irrelevant and useless.
I mentioned the word protein NOWHERE. I talked about time, quantum transactions and number of particles in the universe. Where did you read the word proteins?


Re: Calilasseia - CREATIONISTS-READ THIS

#202  Postby GenesForLife » Dec 16, 2010 11:41 am

Царь Славян wrote:
It came from the quadruple duplication followed by the conversion of a ncDNA sequence to a coding DNA sequence, the mutation added a new protein to the genome, new functional protein, your own citation makes this clear.
Yes, and the ncDNA regions regulated that process.


Errant nonsense. Do you know what it takes to convert ncDNA to a protein-coding sequence? A point mutation that converts a codon to a start codon. Again you make the fundamental error of asserting that all ncDNA is regulatory, aka promoters. I can recall miRNA, shRNA and general transcripts as ncDNA that isn't involved in regulating exons, as you assert it does. Also provide evidence that ncDNA regulated the acquisition of protein-producing capability in this case.

The gaining of the exon is the new function, fucking learn this lesson.

Just an assertion, no evidence.
It's an assertion that computers measure the memory space of an HDD in bits? Wow...


Bits MUST be Shannon information?

And the best current knowledge, as determined by the scientific consensus on the issue, indicates nothing in favour of design.
Exactly, no archaeological artefact we ever found can be said to be designed. Everything can be explained by chance, right? Science stopper if I ever saw one.


Stochastic combinations of known natural processes, shove this strawman back up your rear.

When mutations that alter coat protein functions are followed by an increase in fitness, and this can be used to build up a step-by-step tree, the association is clear.
Yes it can happen, but it does not mean that it HAS GOT TO happen.


But they have traced this happening, game fucking over. Besides, there is no extant evidence for genetic redundancy in the protein they evolved in the bacteriophage they used, if you have evidence for such genetic redundancy, present it.

And the scientific consensus that reflects the state of extant knowledge is evolutionary, fucking deal with it.
This is irrelevant. Please provide me a model of the universe we are going to be working on from now on.


I don't need to. The burden of proof is on you.

Declaring it is without backing it up with apposite citations from the peer-reviewed literature, you do realize that making unsupported uncited assertions when indulging in scientific discourse will result in your assertions being rejected, I hope.
As I said, I use the definitions that are used by Abel, and he published his findings in PR journals.


And I showed Adami et al provided another approach, and it was published in a better journal and backed up by simulations as opposed to defining wibble into thin air.

It does not, it only includes replicating a specific subset of natural phenomena to use as a marker. It isn't artificial selection's use of intelligence in itself that I have a problem with, but the implication that this therefore means natural equivalents don't exist or that NS cannot accomplish the same results as AS.
Learn the basics. We cannot continue until you do. Artificial selection = intelligent design.


Fuck off then.

Is there an intelligent input at any point during the experiment that is goal directed? YES.
When does it occur? During the selection.
This is by definition intelligent design.


I will define things as I want to blah blah blah.


Fucking read them, and the second link is to a post that I wrote explaining some of the literature on the validity of phylogeny, the Theobald paper also mentions specific instances of consilience between multiple phylogenetic methods, which would be impossible without common ancestry, which by the way is demonstrated fact in that new species have been observed forming in speciation events.
Consilience means similarity. I told you already that similarity does not imply common descent.


[1] Scientists set up evolution to happen in the lab.
[2] They develop phylogenetic methods based on inheritance.
[3] Phylogenetic model produces a predicted result.
[4] Results are exactly as predicted, stage by stage.

The method used to develop phylogeny and what happened during evolution converged.
Phylogeny replicates descent from a common ancestor as supported evidentially by direct experimental confirmation.

To put it simply, if common descent was false, the model had no reason to match what was observed empirically and directly.

Game fucking over.
This is true for species that we know are related. We know that some species can reproduce. People and horses can't, so their similarity does not point to common descent. If they were never able to reproduce then their similarity obviously does not imply common descent.


Still persisting with a fucking strawman, I see.

ncDNA can produce new proteins and gain information if the frame is opened by a point mutation.
variations in ncDNA in regulatory regions through mutation can affect transcription factor binding.
there are ncDNA classes not involved in gene regulation either.

To go from regulation to coding is a different thing altogether. Do you understand the basics of genome organization, and what differentiates ncDNA that produces proteins from regulatory ncDNA? Heard of transcribed sequences that are not translated? That kind of ncDNA is different from promoters and enhancers, which are not transcribed.
But the information to do so has always been there. Yes, you simply need to open the frame. That's ONE mutation.


The information to produce proteins hasn't been there, it gained the information to produce proteins, functionality gained, game over.
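
As a minimal sketch of the start-codon mechanism described above (the sequences and the helper function below are hypothetical, chosen purely for illustration, and are not taken from any of the papers under discussion), a single base change can open a reading frame in otherwise non-coding sequence:

START = "ATG"
STOPS = {"TAA", "TAG", "TGA"}

def open_reading_frames(seq):
    """Return (start, end) spans that begin with ATG and end at an in-frame stop codon."""
    frames = []
    for i in range(len(seq) - 2):
        if seq[i:i + 3] == START:
            for j in range(i + 3, len(seq) - 2, 3):
                if seq[j:j + 3] in STOPS:
                    frames.append((i, j + 3))
                    break
    return frames

before = "ATTGCCGATTTAGCAGGATCCTAA"   # hypothetical non-coding stretch: no ATG anywhere
after  = "ATGGCCGATTTAGCAGGATCCTAA"   # a single T->G change creates ATG at the 5' end

print(open_reading_frames(before))    # []         -> nothing translatable
print(open_reading_frames(after))     # [(0, 24)]  -> a reading frame has been opened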

And other than a fucking empty assertion you have nothing of substance, quelle surprise.
I told you already. Transposons are not a part of RM + NS. They are a separate mechanism.


And your peer reviewed citation supporting this flatulent assertion is?

Random = statistically indeterminate.
Mutation = heritable change in DNA
That is fucking it.

I did not identify that way, there is no evidence to suggest it is a case of redundancy, put up or shut the fuck up.
Blind assertions do not constitute evidence, they constitute blatant fuckwittery for doctrine. Of course, do you also have evidence for any genetic redundancy in the wild type phage? Actually, do you have any understanding of what the molecular tests for genetic redundancy are, and how to do them? Hint - they would necessitate that loss of function mutations in one of the two redundant copies still not affect the phenotype drastically, and you presented absolutely no evidence for this being the case instead choosing to assert something ex recto.
I inferred that, because we know that this can happen. Do you have a better explanation?


You cannot "infer" that, put the evidence up or shut the fuck up.

Models work or they do not, and you have presented sweet fuck all evidence that your models work.
I'm still waiting for your model of the universe.


You have a model that you proposed, now fucking justify it.

Heard of speciation? Which is a macroevolutionary process, and has been observed?
If you define it that way sure it is. But what does it mean? Nothing, except that two species simply can't interbreed anymore. They didn't gain anything by that.


It separates gene pools into descendent species, it means that related species which are related by descent no longer need to be able to reproduce together, as your strawman claimed they need to for them to be related, fucking own goal there.

And your interpretation is evidentially supported by anything more than a flatulent bit of fuckwitted nonsense?
We know that it can happen, so we infer that this is the case here. Why? Because it's the best current explanation.


Still no evidence, put up or shut the fuck up.

And you qualify this statement with what evidence? none, and since the direct evolutionary path was observed as functionality was gained in terms of infectivity, this refutes your nonsense wholesale. Secondly, what are your credentials to even make a blind assertion like this?

Thirdly, the different mutants as they evolved would not show the path that they did because redundancy means this variant would have NO difference in fitness from the wild type, or that substitutions in the protein rendered defective would result in evolutionary alteration, this rules redundancy out, fucking deal with it and next time don't make shit up.
Unless the functionality was regained step by step from the redundant regions. Then the pathway would be visible just like it is.


And your evidence to support this claim is?

Yup, and one that undergoes mutations and is selected based on the functional impact of those mutations.
Okay, so if you saw such a computer, would you conclude that it was designed?


Only if it was observed being designed.

I asked if your stance was that of guided evolution , I did not ask what you think of guided evolution, learn to tell the difference.
You should have been more precise.


Your lack of comprehension is not my problem.

An intelligent designer tweaking genomes isn't, stop being fucking obfuscatory.
I'm being general. By your logic we couldn't infer that the Rosetta stone is designed, because we have no evidence of people making Rosetta stones. Which is pretty much a logical fallacy. Yeah, you could then turn around and say that we do have experience of people making inscriptions in stone. But that would just be a more general description. And that would be exactly what I did. I don't have to have an experience of someone tweaking the genome to infer that some intelligence in general could have done it. Because I know what intelligence can do in general.


There is evidence of existing entities, such as humans, making such things as Rosetta stones, there is no evidence for designers macromanaging biodiversity. Fucking learn this.

And you showed no evidence that your model adequately represents actual living organisms or has represented living organisms for the bulk of evolutionary history. You know you cannot calculate averages without having a full sampling, do you not? You haven't even presented evidence for a limited sample size.
And that is why I told you that we are going to take into account all the matter on Earth. If that is not enough, let's take into account the whole universe. I presented you with my model. Do you agree with it or not?


And how does taking stats for the sum total of matter address the issue of living organisms rigorously?

There is no false trilemma between chance, law and design, design at the moment is at best a hypothesis, currently unfalsifiable and untestable.
If we falsify chance and law, and if we show that the event in question is one that design is capable of producing, then yes, design is the best current explanation. And I already said that to falsify design you either have to show that there is a natural law that accounts for the event, or that it's possible to account for it by chance.
Look at snowflakes. We know how they form. They are not designed. The design inference is falsified for the snowflake.


We know mutations occur, we know they can produce new DNA from pre-existing templates, starting with template based replication in spontaneously forming RNA chains. The quasar example shoots the idea of privileging design down; fucking repeating this when the argument erroneously attributes natural things such as quasars to design by default is downright inane.

In other words you are asserting a conclusion where evidence doesn't matter any more, hu-fucking-rrah
No, I'm simply saying that if something was guided to evolve, then it did evolve, and it was designed. Evolution does not rule out design. Neither does design rule out evolution.


In other words you assert design regardless of whether the evidence for natural evolution is present or not, as I said, asserting a conclusion where evidence doesn't matter.

Not really much rationale in separating law from pure chance, chance can act on laws and they comprise an integrated class of phenomena.
We need to separate them because snowflakes do not form by chance. Objects do not fall to the ground by chance either.


Snowflakes form due to the interaction of multiple physical and chemical properties in stochastic proportions, just as mutagenic processes act on genomes in stochastic proportions, your assertion fails.

This means that you can produce any sequence starting with primordial sequences through known mutagenic processes alone, all stochastically acting and natural, no magical dicking required.
And where are the limits of this in the natural world? What are the probabilistic resources you are claiming can produce this?


I told you what the limits of the mutagenic processes themselves are per instance of mutation, with no change in genome length with point mutations, and exponential increases in cases of whole genome duplications per instance.

1) They are molecular.
2) One of the definitions of machine is an intricate organization that accomplishes its goals efficiently;

Here the goal is what the biomolecular assembly does, since it offers a selective benefit, and efficiency is relative to precursors, as it improves it becomes more efficient, and even a suboptimal molecular machine is more optimal than one that isn't.
So the analogy is correct even though they do not run on gas like an engine we find in cars?


A description is not an analogy, they say they are molecular machines , they don't say they're just like a molecular machine.

And you haven't backed this up with precise quantitative data, just another blind assertion.
I provided you with my model of the universe.


And I have no reason to accept it until you've applied it to evolution and published the findings; I stick with the scientific consensus.

To the degree of genetic code alteration that the artificial selection experiments achieved. Natural mutations however can do more since the range of mutations available artificially is less than what can be generated naturally.
You need to provide me with the relevant probabilistic resources.


The probability is 1, now bugger off.

Because the mechanisms of development are the ones being discussed here, the development of organisms as opposed to computers being externally put together.
How does that have any bearing on them being designed or not?


Because the nature of development introduces natural variables due to imperfection that computer manufacture usually does not.

And I counter it by using this
As you can clearly see, they used Shannon information. Yes, obviously it can increase. A simple duplication of a gene increases SI in the genome. But it does not account for functional complexity. Oh, and look at that, they say that Shannon information measures how information can be STORED. They say nothing about noise or transmission...


It increases sequences in the genomes
It frees things up for further mutation
mutation generates functional complexity.


In Shannon's information theory (22), the quantity entropy (H) represents the expected number of bits required to specify the state of a physical object given a distribution of probabilities; that is, it measures how much information can potentially be stored in it.
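
As a rough illustration of the quantity the quoted passage defines (the probabilities and the toy sequence below are made up for the example, not taken from the paper), Shannon entropy is simply the expected number of bits needed to specify a state:

import math
from collections import Counter

def shannon_entropy(probabilities):
    """H = -sum(p * log2 p): expected number of bits needed to specify a state."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 -> a fair coin stores 1 bit
print(shannon_entropy([0.25] * 4))    # 2.0 -> a uniformly random DNA site stores 2 bits

# entropy of the base frequencies in a toy sequence (hypothetical data)
seq = "ATGGCCGATTTAGCAGG"
probs = [count / len(seq) for count in Counter(seq).values()]
print(shannon_entropy(probs))         # about 1.95 -> less than 2 bits, since the distribution is uneven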


PNAS is better than some obscure journal of theoretical biology.
PNAS wins.
That's nice...


Yup, PNAS has higher standards of rigour and professional standing, besides, I haven't seen Abel's paper being further endorsed by the scientific community, peer review is just the minimum standard. If you want examples of why it is just a minimum standard, have a look at Lynch et cetera who argue for stochasticity in developmental networks and Coyne argues for selection shaping the same, different viewpoints that have both passed peer review but obviously two that cannot be both simultaneously correct.

Also demonstrate that Shannon entropy and Shannon information are the same.

And in each case the conclusion was that something non-human or something of a type not shown to exist had done the murder? You are not leaving biodiversity as unsolved, you are actively prescribing design. Unsolved deaths basically = no verdict.
The verdict is that they were killed.


The verdict is they died, and of course if they were murdered there would be evidence of causal entities murdering those people.

William Goebel, American politician who was shot and mortally wounded on the morning of 30 January 1900 by an unknown assailant in Frankfort, Kentucky one day before being sworn in as Governor of Kentucky. The next day the dying Goebel was sworn in and despite 18 physicians attending him, died the afternoon of 3 February 1900. Goebel remains the only state governor in the United States to be assassinated while in office
As you can see, they concluded that he was murdered, even though they do not know by whom. But the point is that they inferred design.


They know that there was an unknown assailant because the assailant used a known method that is characteristic of known entities (humans) and had been observed before with other humans using similar contraptions to commit murder. Now your designer will have to be shown to exist and using mechanisms of design before design can be inferred.

It is a set of physical processes individually acting under physical laws but combining stochastically in ways that have not been described yet.
Yeah I know, you don't have to repeat yourself. The point remains that you can't derive biological functions from evolution since it's not a natural law.


You can derive the fact that new genomic content is produced using known evolutionary processes.

They take real world examples, for instance, and demonstrate that Dembski's model gave results of design even when natural processes were involved, again, do not expect me to do your reading for you.
Where? Quote me the relevant part.


Read the paper, why should I quote it again, I led you to the link.



Not a peer reviewed paper, if it passes it will be minimally credible.

It could be, but it doesn't have to be so. We do not know.


Then you cannot assert design.

Have you actually seen evolution produce a full genome? No, you haven't. You assert it. Until you show me the probabilistic resources under which evolution operates, you can't claim it produced even a single genome.


Every instance of ploidy counts. Try cryptic speciation in Hyla versicolor. New genome produced from an old one, and yes, variation in the diploid number constitutes novelty in genomes.

Does observing speciation mean that humans were ever related to horses?


It means that if you go far back enough in time every species has shared a common ancestor, more related species have more recent common ancestors than more distantly related ones.

Did they use Dembski's method to infer that? How was it used? You will have to give me a bit more info than that.


Read the paper.

But not every sequence is functional. Simply duplicating the whole genome doesn't produce new biological functions. You need to show me the relevant probabilistic resources.


Every sequence can produce something functional, you need to show me that the extant functional space in extant organisms couldn't have arrived through mutations, to do this you need to calculate the total number of instances mutagenic events have taken place and the total proportion of functional elements produced using that.

I answered their main objections, or did you miss that? And yes I called it garbage because it is. Other articles you posted were not, but this one is.


You answered nothing with the support of peer reviewed papers or citations.

I don't care what he said, or if he is an authority or not. Refute his claim or be quiet about it.


Substantiate his claim as being empirically valid or shut the fuck up.
I can make claims too "Show me your designer or be quiet about it"
See?

Either show that an event can be accounted for by natural law, or show that an event is probable, and the design inference for an event is falsified.


Show me the designer and you can directly attribute design as a privileged explanation.

Comes first in what? I said that law comes first, chance second, and then if those two are falsified, we infer design if the pattern matches the one we can infer design for.


Shallit and Ellsberry go on to state that whenever chance or law cannot account for all variables yet, design is to be preferred according to Dembski. This is a flawed approach.

If it was produced by an intelligence then it is intelligent. We are not using the word intelligent to mean smart or good or perfect or optimal. It simply means that there was an intelligent cause. Regardless of optimality.


Intelligent causes have to exist to contribute to the processes they guide. Mutations exist, your designer has no evidence for existence.

The same thing was published in his book The Design Inference, which was peer reviewed. This is simply a newer version.


Books are not peer reviewed, they do not pass through the stringent editorial policies of scientific journals.

As for your paper, unfortunately, I took a look at it and found it to be garbage again. Please stop posting such nonsense. This is what the author said.

Hidden in this last step is a subtle error with enormous consequences.
What Lloyd[2] actually hypothesizes is that “The universe can have
performed no more than 10^120 ops on 10^90 bits.” The number of bits
10^90 is “an amount of information equal to the logarithm of its number
of accessible states”, so the number of accessible states is 2^10^90 . It
is this number which bounds the replicational resources, rather than
10^120. Dembski has, in effect, taken the logarithm twice when he should
have taken it only once.


This is all wrong. Since, as the original Seth Lloyd paper states, the universe can be modeled as consisting of 10^90 bits, which in about 15 billion years produced 10^120 operations. And to get the amount of information that could reasonably have been produced, you take the log function of the number of operations; you don't use the exponent. Why? Well, obviously.

How many states does a coin have? It has 2 states. That means that it represents 1 bit. How did we get that? Well, we took the log2 function of 2. Log2(2) = 1.

How many states do 2 coins have? Well, they have 4 states. How many bits of information is that? That's 2 bits. How did we get that? We use the log2 function of 4. Log2(4) = 2.

Let's continue...

Log2(8) = 3
Log2(16) = 4
Log2(32) = 5
Etc...

And now we come to the number of all operations that were performed on all particles in the universe since the Big Bang. These are all the states that the universe has been in. And that's 10^120. How many bits is that? Well, let's take the log function...

Log2(10^120) ≈ 398.63, or roughly 400.

Case closed. Please stop posting garbage.
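
For what it's worth, the log2 arithmetic above can be checked directly; a minimal sketch (it only reproduces the numbers, it does not settle whose reading of Lloyd's bound is correct):

import math

print(math.log2(2))          # 1.0 -> one coin, two states, 1 bit
print(math.log2(4))          # 2.0 -> two coins, four states, 2 bits
print(math.log2(8))          # 3.0
print(math.log2(10**120))    # ~398.63 -> log2 of Lloyd's 10^120 operations
print(120 * math.log2(10))   # the same value, computed without forming the huge integer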


This is the number of times a quantum interaction could have occurred, not the combinations of operations that are possible, and it is the combinations of operations that produce outcomes, not the number of operations or the number of states that have taken place, which determine what can take place; it is the combinations of bits that have taken place during those 10^120 operations that are deterministic, and the authors continue along this line.

That is where the error apparently is, did you not read the fecking paper?

“The universe can have performed no more than 10^120 ops on 10^90 bits.” The number of bits 10^90 is “an amount of information equal to the logarithm of its number of accessible states”, so the number of accessible states is 2^10^90. It is this number which bounds the replicational resources, rather than 10^120. Dembski has, in effect, taken the logarithm twice when he should have taken it only once.

It is worth noting that this is consistent with Dembski’s own definition
of resources. Compare, for example, how he computes the specificational
resources for the bacterial flagellum. We are merely computing
the replicational resources the same way.


they make the necessary substitution and remark what happens to the shift,

, a shift which renders it useless. It basically says we should infer non-randomness if
we see any event or pattern that is more unlikely than the entire state
of the universe. But clearly, no subset or partial view of the universe
can be more unlikely than the universe as a whole. (This conclusion is
trivial, and can be proven quite easily without going through Dembski’s
elaborate constructions. Simply note that the probability of the rest of
the universe, independent of the part under consideration, cannot be
greater than 1.) Thus it will never be possible to infer non-randomness
by this method.


Since they back their calculations up with independent lines of analysis, and all you do is just assert, your approach fails, as both instances expose the aforementioned error.

Re: Calilasseia - CREATIONISTS-READ THIS

#203  Postby GenesForLife » Dec 16, 2010 12:24 pm

They also mention this

Dembski is attempting to calculate a kind of uncertainty using something like Shannon's formula. To do this, he needs to take the logarithm of the number of possibilities. But Lloyd's 10^120 bit-operations and 10^90 bits are already the logarithms of the number of possible computational sequences and the number of possible register states respectively. To insert either of them inside the logarithm in his formula is a mistake; what needs to go there is the thing (sequences or states) of which they are the logarithm.
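
Restating the quoted objection with the numbers it uses (a restatement only, not an adjudication of whose reading of Lloyd is correct): if 10^90 bits is already the logarithm of the number of accessible states, then

    N_{\mathrm{states}} = 2^{10^{90}}, \qquad \log_2 N_{\mathrm{states}} = 10^{90},

whereas taking the logarithm of a figure that is itself already a logarithm gives

    \log_2\!\left(10^{120}\right) \approx 398.6 \quad \text{rather than} \quad 10^{120},

which is the "taking the logarithm twice" that the authors describe.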


Do you have a link to Lloyd's original paper?

Re: Calilasseia - CREATIONISTS-READ THIS

#204  Postby Rumraket » Dec 16, 2010 12:26 pm

Царь Славян wrote:
I can see you won gold at the olympics of bullshit. Speaking of basics... we'll have a look at your human+horse reproduction below. Until that, please provide empirical evidence of a mechanism that would prevent RM+NS from producing the results attained in the laboratory. I'll be waiting.
Depends on what results you want to get.

Your answer doesn't make sense, please elaborate. What do you mean, results? How about evidence of that limiting mechanism you keep claiming exists, got any?

The level of moronity in this response is staggering.
Please elaborate on this objection. Are you honestly telling me that you think demonstrating common ancestry by molecular phylogeny is contingent on inter-species reproduction? That's the most absurd strawman of evolutionary postulates I have EVER heard. I'd like to see you explain that, in detail.
I told you already. If two different species have never been known to interbreed, saying that they could because they are similar means nothing.

Quote directly where anyone claimed humans and horses could interbreed or shut up. This is getting increasingly hilarious...
It is becoming painfully obvious you haven't got a clue about molecular phylogenetics.

Provide citations to demonstrate that this should be assumed to be the case per definition.
Why shouldn't it be? That's the best current explanation. Do you have a better one?

No it's not. There is no evidence to suggest that all ncDNA should be assumed to have function. Provide evidence to back up your assertion or retract your claim.
And yes I have a better explanation : http://www.talkorigins.org/faqs/molgen/
Read it and educate yourself.

It doesn't follow from this that all ncDNA is or should be assumed to have function, it just means it can. You seem to have a tendency to draw vast and premature conclusions.
No, obviously not all of them do. Some do, some don't. I simply used this example to show you that this is one possible explanation.

Wait a minute, above here you just asked why it shouldn't be assumed... but now you admit that it's not? What exactly is your position? Do you even know?

Well basically yes, it is. The gene changed and selection had the opportunity to act upon the change. Get over it.
But the mutation was caused by a mechanism. It wasn't simply a random copying mistake.

Neither was it a preplanned event with conscious foresight. None of what you said changed the fact that there was genetic change, resulting in phenotypic change and nature had the opportunity to select upon it. The level of goal-post shifting and obfuscation you employ is pathetic.

Another blind assertion. Can you even do anything else at all? Going to start citing real science at some point? (Dumbski's made up blind assertions don't constitute science either).
No, it's a fact. Do you know what genetic redundancy is?

Simply claiming it as fact doesn't make it so. Evidence please.

For which you would need empirical evidence. Got any? No?
Evidence for what? We aren't talking about a specific case. I'm talking in general. If evolution is guided by an intelligent agent, then that's called intelligent design.

And I'm asking if you have direct empirical evidence that evolution is guided by an intelligent agent. Do you?

It is a rotary motor. So?
Yes, and all rotary motors we know of are designed. So is the molecular one, right?

Obviously not since we know the flagellum evolved. Do you have any evidence of design taking place in the construction of the flagellum? Do you have any evidence that RM+NS could not produce it?

Demonstrate by citation in the peer reviewed literature, empirical evidence that nature would not be capable of doing the same selection. Or just shut up with your bullshit.
Stop evading what I said. I said that what scientists did in the lab, is called guided evolution and is a form of intelligent design.

I'm not evading anything. You are the one ignoring my request for a mechanism that would prevent nature from producing the results achieved in the laboratory.

I don't care about what you think is reasonable or not. Your opinions and spurious probability calculations count for literally nothing. NOTHING. What matters is only what you can demonstrate by empirical observation and experiment. Get to work or shut up.
So, for the 50th time, provide empirical evidence of a mechanism that would prevent accumulation of mutations over time or shut it.
If you don't show me the relevant probabilistic resources you can't claim that evolution can do anything.

Of course I can. All I have to do is cite empirical evidence by experiment. We can simply sit our asses down and see evolution happening. You are under the burden of proof here and the burden is to provide a mechanism that would prevent observed evolution from accumulating over deep time. Put up or shut up.

Cars or computers don't reproduce themselves imperfectly. That's why. Until they do, your analogy will continue to be invalid no matter how you try to obfuscate it with fancy rhetoric.

Organisms reproduce through imperfect replication of genetic material. Until you find entities that do the same, shut up with your shitty and unfitting analogies.
But the difference above does not imply that organisms were not designed.

The evidence shows they evolved. The only way you are getting around that one is by showing evidence that refutes this. Got any?

Only when you start by demonstrating where in chemistry a phenomenon defies the laws of physics. If you can do that, we can start having a conversation. Go ahead... a Nobel Prize awaits you.
I haven't seen one.

I'm not surprised.

Except that the following is a direct claim that evolution can not explain the origin of large amounts of FSC

"But evolution does not account for the origin of large amounts of FSC in living organisms. It can only account for small amounts." - Царь Славян

So I ask you again, what mechanism is preventing evolution from accounting for this? Provide citations in the peer reviewed literature of empirical research or retract your claim.
Oh that! Hell yeah, a lack of probabilistic resources would do that. Under the model that Abel has produced evolution is severely limited. But you don't have to use his model, please provide a better one.

Arbitrarily calculated limitations are irrelevant when empirical research easily demonstrates the evolution of functional sequences by quite small and mediocre populations of bacteria in laboratory experiments. Please provide a mechanism that would prevent such observed evolution from accumulating over deep time or shut up.

Imperfect replication of genetic material is causing accumulation of changes in allele frequencies in populations over generations. This is the basis for molecular phylogenetics. Provide evidence of a mechanism that would prevent such changes from accumulating over deep time, or retract your claim.
But this only works for species that have been known to reproduce. If two different species have never been known to reproduce, then their similarity means nothing.

There you go with that idiotic claim again. Evolutionary postulates relating to molecular phylogenetics are not contingent on a claim of possible interspecies reproduction. Why do you keep bringing this hilarious straw-man up?
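
As a toy illustration of the point about imperfect replication (all the parameters below are assumed round numbers and the "sequences" are random strings, not real genes): two lineages copied imperfectly from the same ancestor accumulate differences generation after generation, and it is that accumulation, not any ability to interbreed, that molecular phylogenetics reads.

import random

random.seed(0)
BASES = "ACGT"

def replicate(seq, mu=1e-4):
    # each site is rewritten to a random base with probability mu (an assumed rate)
    return "".join(random.choice(BASES) if random.random() < mu else b for b in seq)

def differences(a, b):
    return sum(x != y for x, y in zip(a, b))

ancestor = "".join(random.choice(BASES) for _ in range(2000))
lineage_a, lineage_b = ancestor, ancestor

for generation in range(1, 2001):
    lineage_a = replicate(lineage_a)
    lineage_b = replicate(lineage_b)
    if generation % 500 == 0:
        print(generation, differences(lineage_a, lineage_b))

# The count of differing sites keeps growing with elapsed generations; nothing in
# this accumulation depends on the two lineages being able to interbreed.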

The limit is time in relation to mutation rates. But that limit is extremely high. Frameshifts and shufflings have incredible potential for the production of novel functional sequence. Any intelligent person can see this. Since I don't think you are stupid, it logically follows that you are willfully dishonest.
OK, how high? I want a number. Show me the model. Show me the relevant probabilistic resources.
It is theoretically possible for a shuffling, duplication and frameshift event to take place multiple times at every generation of replication; it's unlikely but possible. You can work from there.

You said time. OK, how much time did evolution have?

3.5 billion years, minimum.

You also said mutations. OK, so how many mutations could have occurred in all the time evolution had?

The maximum number of mutations in theory is staggering. In your own immune system, B-cells are maturing antibodies at a mutation rate of around one million times the one taking place at reproductive replication.
Additionally your calculation would be meaningless since the number of mutation events says nothing about the type or functionality of the mutation. A frameshift or shuffling event would have many orders of magnitude higher capacity for the generation of novel function than a substitution event.
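
A purely illustrative back-of-envelope count (every parameter below is an assumed round number, most of them deliberately conservative; none of these figures come from the thread) shows why the raw number of mutation events is staggering even before the type of mutation is considered:

# Back-of-envelope only; every parameter is an assumed, deliberately low round number.
mutations_per_genome_per_generation = 1e-3   # assumed round figure for illustration
population_size = 1e9                        # assumed: tiny compared with real microbial populations
generations_per_year = 100                   # assumed: slow for bacteria
years = 3.5e9                                # the minimum age of life quoted above

total_mutation_events = (mutations_per_genome_per_generation
                         * population_size
                         * generations_per_year
                         * years)
print(f"{total_mutation_events:.1e}")        # ~3.5e+17 even with these low inputs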

Protein function is contingent on (almost) impossible to predict environmental circumstances. What might be a nonfunctional sequence now may be highly desirable later. This elementary fact renders your specious calculation irrelevant and useless.
I mentioned the word protein NOWHERE. I talked about time, quantum transactions and number of particles in the universe. Where did you read the word proteins?

Because I'm not an idiot, I know that your fallacious calculation is structured to argue against the probability of producing a functional protein from scratch with your X number of maximum possible events. If not, then I don't even see how your maximum number of quantum transactions is even relevant. Please elaborate.

Re: Calilasseia - CREATIONISTS-READ THIS

#205  Postby GenesForLife » Dec 16, 2010 12:39 pm

Let us quote Seth Lloyd at the moment.


The above sections calculated how many elementary logical operations that can have
been performed on how many bits during various phases of the history of the universe.
As noted above, there are three distinct interpretations of the numbers calculated. The
first interpretation simply states that the number of ops and number of bits given here
are upper bounds on the amount of computation that can have been performed since
the universe began. This interpretation should be uncontroversial: existing computers
have clearly performed far fewer ops on far fewer bits. As Moore’s law progresses and as
computers use fewer degrees of freedom per bit and less energy per logic operation, the
number of bits and ops available will increase. Existing quantum computers already use
the minimum number of degrees of freedom per bit and the minimum energy per operation.
The maximum amount of computing power that will eventually be available to mankind
is a question of considerable technological and social interest.41 Of course, that maximum
computing power is likely to remain far below the amounts calculated for the universe as
a whole.


The third interpretation — that the numbers of bits and ops calculated here represent the actual memory capacity and number of elementary quantum logic operations performed by the universe — is more controversial. That the universe registers an amount of information equal to the logarithm of its number of accessible states seems reasonable. But to say that 2∫E dt/πℏ is the number of elementary operations performed is equivalent to saying that the universe performs an ‘op’ every time some piece of it evolves by an average angle (or acquires an average phase) π/2 in Hilbert space. In fact, for a quantum computer, this is a reasonable definition of an elementary quantum logic operation: to perform each quantum logic operation requires energy E to be made available for a time πℏ/2E. And whenever energy E is available to some degrees of freedom for time πℏ/2E, the state of those degrees of freedom evolves by an angle π/2 in Hilbert space. In addition, almost any interaction between degrees of freedom, including fundamental interactions such as those in the standard model, supports quantum logic and quantum computation.12−16,43 Accordingly, it seems plausible to identify an elementary quantum logic operation with the local evolution of information-carrying degrees of freedom by an angle of π/2. Whether or not this is a reasonable definition of an elementary operation for the universe as a whole is a question whose answer will have to await further results on the relationship between information processing and fundamental physics.


Well, so the authors of the paper you rubbished were right in saying that Lloyd did use logarithms already for information equivalent. Care to explain why Lloyd is tentative wrt his conclusions, firstly, and secondly, establish that the above models apply to Newtonian scale physics instead of just models that are postulated to be quantum computers, and also account for the highlighted bit?

Re: Calilasseia - CREATIONISTS-READ THIS

#206  Postby Царь Славян » Dec 16, 2010 12:43 pm

Errant nonsense. Do you know what it takes to convert ncDNA to a protein-coding sequence? A point mutation that converts a codon to a start codon. Again you make the fundamental error of asserting that all ncDNA is regulatory, aka promoters. I can recall miRNA, shRNA and general transcripts as ncDNA that isn't involved in regulating exons, as you assert it does.

The gaining of the exon is the new function, fucking learn this lesson.
You keep putting words in my mouth. I never said that all of them regulate gene expression. But anyway, even if this was a product of RM + NS, how much FSC was produced?

Bits MUST be Shannon information?
No, they don't have to be. How is HDD storage calculated? Do you know? Show me the equation.

Stochastic combinations of known natural processes, shove this strawman back up your rear.
So the Rosetta stone is a product of stochastic processes?

But they have traced this happening, game fucking over. Besides, there is no extant evidence for genetic redundancy in the protein they evolved in the bacteriophage they used, if you have evidence for such genetic redundancy, present it.
Traced it how? They saw a step by step process of gain in infectivity, and you conclude that it's caused by RM + NS. Where's the evidence?

I don't need to. The burden of proof is on you.
I already did. If you do not agree with it, provide your own. You can't say that evolution in nature can produce even one single bit of information if you do not show that it has the relevant probabilistic resources.

And I showed Adami et al provided another approach, and it was published in a better journal and backed up by simulations as opposed to defining wibble into thin air.
They simply used Shannon information! LOL!

I will define things as I want to blah blah blah.
So let me get this straight, input of information by an intelligent agent is not by definition intelligent design?

Still persisting with a fucking strawman, I see.
I don't see a strawman anywhere.

The information to produce proteins hasn't been there, it gained the information to produce proteins, functionality gained, game over.
OK, let's say that it has gained new functional information. By what mechanism has it done so? And how much FSC was gained?

And your peer reviewed citation supporting this flatulent assertion is?

Random = statistically indeterminate.
Mutation = heritable change in DNA
That is fucking it.
It's common sense. Random mutations are copying errors. Transposons are mechanisms that cause mutations.

You cannot "infer" that, put the evidence up or shut the fuck up.
How can you infer that it was RM + NS then?

You have a model that you proposed, now fucking justify it.
How exactly do you want me to justify it?

It separates gene pools into descendent species, it means that related species which are related by descent no longer need to be able to reproduce together, as your strawman claimed they need to for them to be related, fucking own goal there.
No, I said that at some point in time, while humans were not humans and horses were not horses but something else, and were related, they were supposed to be able to reproduce.

Still no evidence, put up or shut the fuck up.
Are you saying genetic redundancy is not real?

And your evidence to support this claim is?
Observation! If something is regained step by step then it should be visible in each and every step! It didn't become invisible all of a sudden.

Only if it was observed being designed.
Which means that your logic is invalid and stops science. By my logic we could infer design.

Your lack of comprehension is not my problem.
You mistook me for yourself.

There is evidence of existing entities, such as humans, making such things as Rosetta stones, there is no evidence for designers macromanaging biodiversity. Fucking learn this.
LOL, didn't you even read my quote till the end? I already answered that objection. Here, read it again, slowly...

I'm being general. By your logic we couldn't infer that the Rosetta stone is designed, because we have no evidence of people making Rosetta stones. Which is pretty much a logical fallacy. Yeah, you could then turn around and say that we do have experience of people making inscriptions in stone. But that would just be a more general description. And that would be exactly what I did. I don't have to have an experience of someone tweaking the genome to infer that some intelligence in general could have done it. Because I know what intelligence can do in general.

And how does taking stats for the sum total of matter address the issue of living organisms rigorously?
Because living organisms are a subset of all the matter on Earth. And if all this matter can't produce more than X bits of information, then living organisms can't either.

We know mutations occur, we know they can produce new DNA from pre-existing templates, starting with template based replication in spontaneously forming RNA chains. The quasar example shoots this assertion down, fucking repeating this when the argument erroneously attributes natural things such as quasars to design by default is downright inane.
Yeah, but we have to show that such mechanisms have the probabilistic resources to produce a given event. If they can't, then we should infer design.

In other words you assert design regardless of whether the evidence for natural evolution is present or not, as I said, asserting a conclusion where evidence doesn't matter.
No. I'm simply saying that one does not rule out the other. They are not mutually exclusive.

Snowflakes form due to the interaction of multiple physical and chemical properties in stochastic proportions, just as mutagenic processes act on genomes in stochastic proportions, your assertion fails.
Wrong. We have the equation for crystallization. We know how they form. It's not a stochastic process, it's a natural law. We have no equation for biological information.

I told you what the limits of the mutagenic processes themselves are per instance of mutation, with no change in genome length with point mutations, and exponential increases in cases of whole genome duplications per instance.
I asked you about probabilistic resources.

A description is not an analogy, they say they are molecular machines , they don't say they're just like a molecular machine.
Oh, so they are machines. Well, all machines we do know of have been designed. So molecular machines have also been designed, right?

And I have no reason to accept it until you've applied it to evolution and published the findings; I stick with the scientific consensus.
Yeah, that's not gonna work. This is a forum, and here we go with what we have. If you don't like it, tough luck. You lost.

The probability is 1, now bugger off.
Show me the calculation.

Because the nature of development introduces natural variables due to imperfection that computer manufacture usually does not.
Does that mean that such a mechanism was not designed?

It increases sequences in the genomes
It frees things up for further mutation
mutation generates functional complexity.
No, not functional complexity, but Shannon complexity.

Yup, PNAS has higher standards of rigour and professional standing, besides, I haven't seen Abel's paper being further endorsed by the scientific community, peer review is just the minimum standard. If you want examples of why it is just a minimum standard, have a look at Lynch et cetera who argue for stochasticity in developmental networks and Coyne argues for selection shaping the same, different viewpoints that have both passed peer review but obviously two that cannot be both simultaneously correct.
I don't care where it's published. It's either correct or not.

The verdict is they died, and of course if they were murdered there would be evidence of causal entities murdering those people.
It says MURDERED in the quote I posted.

They know that there was an unknown assailant because the assailant used a known method that is characteristic of known entities (humans) and had been observed before with other humans using similar contraptions to commit murder. Now your designer will have to be shown to exist and using mechanisms of design before design can be inferred.
Intelligence is a known cause. Genetic engineering is a known method.

You can derive the fact that new genomic content is produced using known evolutionary processes.
Show me the equation for that.

Not a peer reviewed paper, if it passes it will be minimally credible.
Design Inference was peer reviewed. This is a shorter and an updated version.

Then you cannot assert design.
I don't assert it, I infer it.

Every instance of ploidy counts. Try cryptic speciation in Hyla versicolor. New genome produced from an old one, and yes, variation in the diploid number constitutes novelty in genomes.
That's not new functional information. That's just doubled Shannon information.

It means that if you go far back enough in time every species has shared a common ancestor, more related species have more recent common ancestors than more distantly related ones.
No, it doesn't mean that. You simply assume it. It could be so, but we do not know. Just because species can lose the ability to reproduce doesn't mean that some species that were never known to interbreed have ever been able to do so.

Every sequence can produce something functional, you need to show me that the extant functional space in extant organisms couldn't have arrived through mutations, to do this you need to calculate the total number of instances mutagenic events have taken place and the total proportion of functional elements produced using that.
Yes it can produce it. But that's not the point. The point is that simply by duplicating the whole genome you did NOT add new functional information.

You answered nothing with the support of peer reviewed papers or citations.
So what? That's an argument from authority anyway.

Show me the designer and you can directly attribute design as a privileged explanation.
That would defeat the purpose of design detection.

Shallit and Ellsberry go on to state that whenever chance or law cannot account for all variables yet, design is to be preferred according to Dembski. This is a flawed approach.
Why is it flawed?

Intelligent causes have to exist to contribute to the processes they guide. Mutations exist, your designer has no evidence for existence.
I'm not invoking any specific designer. I'm invoking intelligence. That exists.

Books are not peer reviewed, they do not pass through the stringent editorial policies of scientific journals.
Design Inference was peer reviewed.

That is where the error apparently is, did you not read the fecking paper?
I did read the paper! And I explained why it's wrong. There is no reason to take the exponent of the 10^120. You take its log function. Why would we take the exponent? Why? I'm listening.

, a shift which renders it useless. It basically says we should infer non-randomness if
we see any event or pattern that is more unlikely than the entire state
of the universe. But clearly, no subset or partial view of the universe
can be more unlikely than the universe as a whole. (This conclusion is
trivial, and can be proven quite easily without going through Dembski’s
elaborate constructions. Simply note that the probability of the rest of
the universe, independent of the part under consideration, cannot be
greater than 1.) Thus it will never be possible to infer non-randomness
by this method.
This is wrong because Dembski does not claim that any pattern is less probable than all the states the universe can ever be in over 15 billion years, but all the states that it COULD HAVE BEEN in. There is a big difference. Not all possible states can be achieved all the time. Only some states can. So the relevant number is all the states that the universe was in for the duration of its lifetime.

Since they back their calculations up with independent lines of analysis, and all you do is just assert, your approach fails, as both instances expose the aforementioned error.
And both lines of evidence they present are wrong.

Re: Calilasseia - CREATIONISTS-READ THIS

#207  Postby GenesForLife » Dec 16, 2010 12:45 pm

And yes, if a thing does a new function or acquires functionality it didn't have before, it qualifies as a gain in function.
And as far as the claim of "already there" is concerned, many of the possible configurations of nucleotide sequences in DNA, generated by mutation, do end up being functional. Mutation produces the necessary sequences and an extension of biochemistry based on the configuration of sequences adds function, which permeates through populations and descendants if added functionality is strongly adaptive.

Mutation doesn't have to produce one sequence correctly every time for functionality to be evolved, it can do so in several ways in an unexplored sequence space, the exploration is due to mutation, and the effects of mutation are deterministic, the combination of deterministic effects however is stochastic. To assume things just get put together by blind chance without any determinism involved is nonsense, and this is the basis of the false dichotomy between chance and natural law, when evolution involves features characteristic of both.
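To make this concrete, here is a minimal Python sketch of the idea: mutation explores sequence space stochastically, while the fitness effect of any given sequence is deterministic. The alphabet, mutation rate, population size and the arbitrary "GC motif" scoring function are illustrative assumptions only, not a biological model or anyone's published method.

import random

ALPHABET = "ACGT"

def mutate(seq, rate=0.02):
    # copy a sequence with occasional random substitutions: random in where and what,
    # though each individual substitution is an ordinary chemical event
    return "".join(random.choice(ALPHABET) if random.random() < rate else base
                   for base in seq)

def fitness(seq):
    # deterministic score: arbitrarily, the number of "GC" motifs in the sequence
    return sum(seq[i:i + 2] == "GC" for i in range(len(seq) - 1))

random.seed(1)
population = ["A" * 60] * 100                      # start from a single 'ancestral' sequence
for generation in range(200):
    offspring = [mutate(s) for s in population for _ in range(2)]
    population = sorted(offspring, key=fitness, reverse=True)[:100]  # selection step

print("best fitness after 200 generations:", fitness(population[0]))

The point of the toy is only that stochastic variation plus deterministic scoring is neither "pure chance" nor "pure law".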

And without demonstrating why you just assert they are wrong.


Re: Calilasseia - CREATIONISTS-READ THIS

#208  Postby Царь Славян » Dec 16, 2010 12:49 pm

GenesForLife wrote:They also mention this

Dembski is attempting to calculate a kind of uncertainty using something like Shannon's formula. To do this, he needs to take the logarithm of the number of possibilities. But Lloyd's 10^120 bit-operations and 10^90 bits are already the logarithms of the number of possible computational sequences and the number of possible register states respectively. To insert either of them inside the logarithm in his formula is a mistake; what needs to go there is the thing (sequences or states) of which they are the logarithm.


Do you have a link to Lloyd's original paper?
Of course I do.

http://arxiv.org/abs/quant-ph/0110141

Re: Calilasseia - CREATIONISTS-READ THIS

#209  Postby Царь Славян » Dec 16, 2010 1:05 pm

Your answer doesn't make sense, please elaborate. What do you mean, results? How about evidence of that limiting mechanism you keep claiming exists, got any?
Simple. If you want to evolve more FSC you need more probabilistic resources; if you want less, you need less.

Quote directly where anyone claimed humans and horses could interbreed or shut up. This is getting increasingly hilarious...
It is becoming painfully obvious you haven't got a clue about molecular phylogenetics.
If you claim that humans and horses were once something else and were one species then YOU are the one claiming they were able to reproduce.

No it's not. There is no evidence to suggest that all ncDNA should be assumed to have function. Provide evidence to back up your assertion or retract your claim.
And yes I have a better explanation : http://www.talkorigins.org/faqs/molgen/
Read it and educate yourself.
Where did I say that ALL of them have a function?

Wait a minute, above here you just asked why it shouldn't be assumed... but now you admit that it's not? What exactly is your position? Do you even know?
Some of them do, not all.

Neither was it a preplanned event with conscious foresight. None of what you said changed the fact that there was genetic change, resulting in phenotypic change and nature had the opportunity to select upon it. The level of goal-post shifting and obfuscation you employ is pathetic.
It was produced by a mechanism. It doesn't count as a simple random mutation like it does when you have a copying error.

Simply claiming it as fact doesn't make it so. Evidence please.
http://en.wikipedia.org/wiki/Genetic_redundancy

And I'm asking if you have direct empirical evidence that evolution is guided by an intelligent agent. Do you?
In the case of scientists selecting specific individuals it is guided. And that's intelligent design.

Obviously not since we know the flagellum evolved.
Evolution does not imply it was not designed. Plus, we do not know it evolved.
Do you have any evidence of design taking place in the construction of the flagellum? Do you have any evidence that RM+NS could not produce it?
Why should I have evidence they could not produce it? You still have to show me the relevant probabilistic resources that it could have happened.

I'm not evading anything. You are the one ignoring my request for a mechanism that would prevent nature from producing the results achieved in the laboratory.
We first have to agree on a definition. Do you agree that what happened in the lab was a case of intelligent design?

Of course I can. All I have to do is cite empirical evidence by experiment. We can simply sit our asses down and see evolution happening. You are under the burden of proof here and the burden is to provide a mechanism that would prevent observed evolution from accumulating over deep time. Put up or shut up.
No, you can not. If you claim that RM + NS can do something, then provide the relevant probabilistic resources that will account for that.

The evidence shows they evolved. The only way you are getting around that one is by showing evidence that refutes this. Got any?
What evidence? Even if they did evolve, that doesn't mean they were not designed.

Arbitrary calculated limitations are irrelevant when empirical research easily demonstrates the evolution of functional sequences by quite small and mediocre populations of bacteria in laboratory experiments. Please provide a mechanism that would prevent such observed evolution from accumulating over deep time or shut up.
They are not arbitrary, they are estimates of observable phenomena. If you don't agree with them, produce a better model. If you don't, you can't claim that RM + NS can do anything.

There you go with that idiotic claim again. Evolutionary postulates relating to molecular phylogenetics are not contingent on a claim of possible interspecies reproduction. Why do you keep bringing this hilarious straw-man up?
So common descent is true even if all species are not related?

It is theoretically possible for a shuffling, duplication and frameshift event to take place multiple times at every generation of replication, it's unlikely but possible. You can work from there.
Ok, how many would you say can happen in a given time?

3.5 billion years, minimum.
Great. How many mutation events could have taken place in that time?

The maximum number of mutations in theory is staggering. In your own immune system, B-cells are maturing antibodies at a mutation rate of around one million times the one taking place at reproductive replication.
Additionally your calculation would be meaningless since the number of mutation events says nothing about the type or functionality of the mutation. A frameshift or shuffling event would have many orders of magnitude higher capacity for the generation of novel function than a substitution event.
Well, then make the highest possible estimate.

Because I'm not an idiot I know that your fallacious calculation is structured to argue against the probability of producing a functional protein from scratch with your X number of maximum possible events. If not, then I don't even see how your maximum number of quantum transactions is even relevant. Please elaborate.
It takes into account production of everything, not just proteins.

Re: Calilasseia - CREATIONISTS-READ THIS

#210  Postby Царь Славян » Dec 16, 2010 1:16 pm

Well, so the authors of the paper you rubbished were right in saying that Lloyd did use logarithms already for information equivalent.
Yes, in getting bits for all the possible registers. And after that, when you perform the operations, you take the log function again to see how many bits of information you generated.

Care to explain why Lloyd is tentative wrt his conclusions, firstly, and secondly, establish that the above models apply to Newtonian scale physics instead of just models that are postulated to be quantum computers, and also account for the highlighted bit?
The models that would account for Newtonian physics would give you LESS, not more, probabilistic resources. Because the number of elementary particles is MORE than the number of atoms. If we estimate the number of particles to be 10^90 like Lloyd did, then the number of the smallest units of matter that are above the quantum level is less than that, since atoms consist of particles. And their estimate is 10^80. So you would not have 10^120 states, but 10^110.

Two approximate calculations give the number of atoms in the observable universe to be close to 10^80.
http://en.wikipedia.org/wiki/Observable_universe

Re: Calilasseia - CREATIONISTS-READ THIS

#211  Postby Царь Славян » Dec 16, 2010 1:19 pm

GenesForLife wrote:And yes, if a thing does a new function or acquires functionality it didn't have before, it qualifies as a gain in function.
And as far as the claim of "already there" is concerned, many of the possible configurations of nucleotide sequences in DNA, generated by mutation, do end up being functional. Mutation produces the necessary sequences and an extension of biochemistry based on the configuration of sequences adds function, which permeates through populations and descendants if added functionality is strongly adaptive.

Mutation doesn't have to produce one sequence correctly every time for functionality to be evolved, it can do so in several ways in an unexplored sequence space, the exploration is due to mutation, and the effects of mutation are deterministic, the combination of deterministic effects however is stochastic. To assume things just get put together by blind chance without any determinism involved is nonsense, and this is the basis of the false dichotomy between chance and natural law, when evolution involves features characteristic of both.

And without demonstrating why you just assert they are wrong.
Nobody is arguing with that. It can happen. I'm simply asking you to provide me with relevant probabilistic resources, so we can measure what evolution can do. If you don't then you can't measure what evolution can do. Neither can you say that it produced all the diversity of life we see in living organisms. Because we do not know if it has the relevant resources to do so.

Re: Calilasseia - CREATIONISTS-READ THIS

#212  Postby GenesForLife » Dec 16, 2010 1:42 pm

Царь Славян wrote:
Errant nonsense. Do you know what it takes to convert ncDNA to a protein coding sequence? A point mutation that converts a codon to a start codon. Again you make the fundamental error of asserting that all ncDNA is regulatory, aka promoters. I can recall miRNA, shRNA and general transcripts as ncDNA that isn't involved in regulating exons, as you assert it does.



The gaining of the exon is the new function, fucking learn this lesson.
You keep putting words in my mouth. I never said that all of them regulate gene expression. But anyway, even if this was a product of RM + NS, how much FSC was produced?


When you say ncDNA regulates gene function you must be more careful as to specifying which type.


Bits MUST be Shannon information?
No they do not. How is HDD storage calculated? Do you know? Show me the equation.


This was precisely your implication, however, when you asked if HDD storage was not calculated in bits after I said HDDs aren't a case of Shannon information. The formula is 512 bytes x the number of sectors on the hard drive; 1 byte = 8 bits.
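In code, that calculation is just multiplication; the sector count below is a made-up example value for a nominal 1 TB drive, not a figure from any specific disk.

sectors = 1_953_525_168             # hypothetical example: number of 512-byte sectors
capacity_bytes = sectors * 512      # capacity = 512 bytes x number of sectors
capacity_bits = capacity_bytes * 8  # 1 byte = 8 bits
print(capacity_bytes, "bytes,", capacity_bits, "bits")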


Stochastic combinations of known natural processes, shove this strawman back up your rear.
So the Rosetta stone is a product of stochastic processes?


Is evolution a Rosetta stone?

But they have traced this happening, game fucking over. Besides, there is no extant evidence for genetic redundancy in the protein they evolved in the bacteriophage they used, if you have evidence for such genetic redundancy, present it.
Traced it how? They saw a step by step process of gain in infectivity, and you conclude that it's caused by RM + NS. Where's the evidence?


Because the mutagenic processes they used are statistically indeterminate; secondly, the prevalence of the mutant allele that caused the gain of functional infectivity increased exactly as natural selection would predict.

I don't need to. The burden of proof is on you.
I already did. If you do not agree with it, provide your own. You can't say that evolution in nature can produce even one single bit of information if you do not show that it has the relevant probabilistic resources.


Probability does not dictate what has already happened.

And I showed Adami et al provided another approach, and it was published in a better journal and backed up by simulations as opposed to defining wibble into thin air.
They simply used Shannon information! LOL!


And Abel just pulled out a concept from his arse intended to create a niche where natural processes could not operate, L-O-fucking-L , next.

I will define things as I want to blah blah blah.
So let me get this straight, input of information by an intelligent agent is not by definition intelligent design?


Aided design. Since the actual design of the devices involved happened a priori, the fact that I go to a shop and choose does not make the design process of that product intelligent.

Still persisting with a fucking strawman, I see.
I don't see a strawman anywhere.


Blind assertion again.

The information to produce proteins hasn't been there, it gained the information to produce proteins, functionality gained, game over.
OK, let's say that it has gained new functional information. By what mechanism has it done so? And how much FSC was gained?


The mechanism here is a point mutation that opened a reading frame (created a start codon by substituting one base for another). As far as FSC is concerned, the method for calculating it is included here, go ahead and calculate it for yourself.

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2217542/
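For what that mechanism looks like in the simplest possible terms, here is a small Python sketch: a single base substitution creates an ATG start codon and opens a reading frame running to an in-frame stop codon. The sequences and the mutation position are invented for illustration; this is not the sequence from any of the papers under discussion.

def open_reading_frames(dna):
    # return (start, end) spans that begin with ATG and run to an in-frame stop codon
    stops = {"TAA", "TAG", "TGA"}
    frames = []
    for start in range(len(dna) - 2):
        if dna[start:start + 3] == "ATG":
            for end in range(start + 3, len(dna) - 2, 3):
                if dna[end:end + 3] in stops:
                    frames.append((start, end + 3))
                    break
    return frames

before = "GGAACGGTCCATTCGGACTGATCCGTT"
after = before[:4] + "T" + before[5:]   # one substitution, C -> T, creates ATG at position 3
print(open_reading_frames(before))      # []  - no start codon, no reading frame
print(open_reading_frames(after))       # [(3, 21)] - a reading frame opened by one point mutation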


And your peer reviewed citation supporting this flatulent assertion is?

Random = statistically indeterminate.
Mutation = heritable change in DNA
That is fucking it.
It's common sense. Random mutations are copying errors. Transposons are mechanisms that cause mutations.


Do you have a source for that definition? Copying errors are ONE TYPE of Random mutation, what you are confusing RM with is Point mutations. Copying errors also cause mutations, transposons cause mutations, gene duplication causes mutations, retroposition causes mutations, but they all cause mutations that are random, random doesn't mean uncaused, it means statistically indeterminate.

You cannot "infer" that, put the evidence up or shut the fuck up.
How can you infer that it was RM + NS then?


And the point of this obfuscatory piece of belly button fluff is?

You have a model that you proposed, now fucking justify it.
How exactly do you want me to justify it?


Get it published in a peer reviewed journal, there are plenty that are open access/free for publication, and use the model to demonstrate as an accepted conclusion that natural evolutionary processes cannot account for extant biodiversity.

It separates gene pools into descendent species, it means that related species which are related by descent no longer need to be able to reproduce together, as your strawman claimed they need to for them to be related, fucking own goal there.
No, I said that at some point in time, while humans were not humans and horses were not horses, but something else, and were related, they were supposed to be able to reproduce.


This is a blatant lie, you clearly stated they cannot reproduce.

This is true for species that we know are related. We know that some species can reproduce. People and horses can't, so their similarity does not point to common descent. If they were never able to reproduce then their similarity obviously does not imply common descent.


there, blatant lying, not only that but you set up another strawman where you assert that Common descent should imply that the two shared the most recent common ancestor, otherwise you wouldn't make the aforementioned statement.

Still no evidence, put up or shut the fuck up.
Are you saying genetic redundancy is not real?


Nonsense, I am asking you to demonstrate that Hayashi et al's evolution was a case of genetic redundancy. Don't obfuscate.

And your evidence to support this claim is?
Observation! If something is regained step by step then it should be visible in each and every step! It didn't become invisible all of a sudden.


Empty faffing about. Get the data from the paper and show why it is a case of redundancy, that is all. Also support that your "observation" is valid by means of references to peer reviewed literature.

Only if it was observed being designed.
Which means that your logic is invalid and stops science. By my logic we could infer design.


By your logic we could also posit intelligent smurfs moving atoms around. Which makes no testable predictions and is unfalsifiable, and THAT is not science.

Your lack of comprehension is not my problem.
You mistook me for yourself.


Other than an ad hom this piece of rhetorical arse-gravy is supposed to mean?

There is evidence of existing entities, such as humans, making such things as Rosetta stones, there is no evidence for designers macromanaging biodiversity. Fucking learn this.
LOL, didn't you even read my quote till the end? I already answered that objection. Here, read it again, slowly...


Another empty claim.

I'm being general. By your logic we couldn't infer that the Rosetta stone is designed, because we have no evidence of people making Rosetta stones. Which is pretty much a logical fallacy. Yeah, you could then turn around and say that we do have experience of people making inscriptions in stone. But that would just be a more general description. And that would be exactly what I did. I don't have to have an experience of someone tweaking the genome to infer that some intelligence in general could have done it. Because I know what intelligence can do in general.


I also know that intelligence requires an existing source. Your vapid assertion counts for nothing.

And how does taking stats for the sum total of matter address the issue of living organisms rigorously?
Because living organisms are a subset of all the matter on Earth. And if this matter can produce more than X bits of information, then living organisms can't also.


Are you sure that is what you want to say, or do you want to say if matter cannot produce more than X bits the subset of living organisms cannot? Here is a hint, information is just a representation of physical states, not states themselves, all your argument boils down to is this "if describing matter on earth takes X bits, describing matter belonging to living things will be less than X" , but for this you must establish that there is functional equivalence between configurations in living matter and simple binary interactions using quantum mechanics.

We know mutations occur, we know they can produce new DNA from pre-existing templates, starting with template based replication in spontaneously forming RNA chains. The quasar example shoots this assertion down, fucking repeating this when the argument erroneously attributes natural things such as quasars to design by default is downright inane.
Yeah, but we have to show that such mechanisms have the probabilistic resources to produce a given event. If they can't, then we should infer design.


If they can't, we leave it as an open question, not shoehorn mythological entities in.

In other words you assert design regardless of whether the evidence for natural evolution is present or not, as I said, asserting a conclusion where evidence doesn't matter.
No. I'm simply saying that one does not rule out the other. They are not mutually exclusive.


Which as a consequence allows you to assert design if it evolved, and assert design even when it didn't; design becomes a one-answer-fits-all-data explanation.

Snowflakes form due to the interaction of multiple physical and chemical properties in stochastic proportions, just as mutagenic processes act on genomes in stochastic proportions, your assertion fails.
Wrong. We have the equation for crystallization. We know how they form. It's not a stochastic process, it's a natural law. We have no equation for biological information.


Again, is this what you were trying to say? There are equations used to calculate biological information, including your very own Abel's work. Biological information again is just the representation of the physical states of organisms.

I told you what the limits of the mutagenic processes themselves are per instance of mutation, with no change in genome length with point mutations, and exponential increases in cases of whole genome duplications per instance.
I asked you about probabilistic resources.


How do you calculate probabilities for a strictly stochastic class of events in the absence of complete data?

A description is not an analogy, they say they are molecular machines , they don't say they're just like a molecular machine.
Oh, so they are machines. Well, all machines we do know of have been designed. So molecular machines have also been designed, right?


All designers we know of are humans, therefore your designer must be human, right? Dodgy conclusions, quelle surprise!

And I have no reason to accept it until you've modelled it to evolution and published the findings, I stick with the scientific consensus.
Yeah, that's not gonna work. This is a forum, and here we go with what we have. If you don't like it, tough luck. You lost.


And I am not just a forumite, and I go through the scientific consensus , and you can screw yourself in delirium with all the delight you want.

The probability is 1, now bugger off.
Show me the calculation.


The probability of events that have already happened is ____________________

Because the nature of development introduces natural variables due to imperfection that computer manufacture usually does not.
Does that mean that such a mechanism was not designed?


It renders computers unusable as model substitutes for living things.

It increases sequences in the genomes
It frees things up for further mutation
mutation generates functional complexity.
No, not functional complexity, but Shannon complexity.


And you have peer reviewed citations to show that mutations cannot produce functional complexity?
Also note, you have asserted mutations can produce no functional complexity here.

Yup, PNAS has higher standards of rigour and professional standing, besides, I haven't seen Abel's paper being further endorsed by the scientific community, peer review is just the minimum standard. If you want examples of why it is just a minimum standard, have a look at Lynch et cetera who argue for stochasticity in developmental networks and Coyne argues for selection shaping the same, different viewpoints that have both passed peer review but obviously two that cannot be both simultaneously correct.
I don't care where it's published. It's either correct or not.


And I care about sources passing through critical scrutiny first. End of, and I am not going to trust acolytes who refused to turn up at Dover or did so and had their arses handed over to them.

The verdict is they died, and of course if they were murdered there would be evidence of causal entities murdering those people.
It says MURDERED in the quote I posted.


Again because known entities (aka humans) have been shown to indulge in murder, leaving behind telltale signs that indicate murder without indicating which human being did it. Provide evidence for your designer AND mechanisms and the analogy will become valid.

They know that there was an unknown assailant because the assailant used a known method that is characteristic of known entities (humans) and had been observed before with other humans using similar contraptions to commit murder. Now your designer will have to be shown to exist and using mechanisms of design before design can be inferred.
Intelligence is a known cause. Genetic engineering is a known method.


Genetic engineering by humans that exist, using mechanisms that exist, is a known method.

You can derive the fact that new genomic content is produced using known evolutionary processes.
Show me the equation for that.


What the fuck, do you think anything derivable has an equation? Genome duplication adds new sequences into a genome, ergo produces new genomic content.

If the genome is the sum total of all DNA in an organism, and duplication occurs, there is a change in what constitutes the genome, thus producing new genomic information.

Not a peer reviewed paper, if it passes it will be minimally credible.
Design Inference was peer reviewed. This is a shorter and an updated version.


Present the peer reviewed version.

Then you cannot assert design.
I don't assert it, I infer it.


And I reject the premise of inference.

Every instance of ploidy counts. Try cryptic speciation in Hyla versicolor. New genome produced from an old one, and yes, variation in the diploid number constitutes novelty in genomes.
That's not new functional information. That's just doubled Shannon information.


Shannon information allows functional information to be generated easily just by further mutation acting on the increased information subset. This has been demonstrated ONLY for you to lie, deny, obfuscate, and faff about.
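A toy calculation of the point being made here, counting raw information capacity at log2(4) = 2 bits per base for a four-letter alphabet: duplication doubles the raw (Shannon-style) capacity, and the duplicate copy is then free to diverge by further mutation. The gene sequence is invented for illustration.

import math
import random

def capacity_bits(seq, alphabet_size=4):
    # raw capacity: log2(alphabet size) bits per position
    return len(seq) * math.log2(alphabet_size)

gene = "ATGGCGACTTTAGCGCAT"
duplicated = gene + gene                                # whole-gene duplication
print(capacity_bits(gene), capacity_bits(duplicated))   # 36.0 bits -> 72.0 bits

random.seed(0)
pos = random.randrange(len(gene))                       # mutate the copy, leaving the original intact
copy = list(gene)
copy[pos] = random.choice([b for b in "ACGT" if b != copy[pos]])
diverged = gene + "".join(copy)
print(diverged != duplicated)                           # True: further mutation acts on the extra copy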

It means that if you go far back enough in time every species has shared a common ancestor, more related species have more recent common ancestors than more distantly related ones.
No, it doesn't mean that. You simply assume it. It could be so, but we do not know. Just because species can lose the ability to reproduce doesn't mean that some species that were never known to interbreed have ever been able to do so.


Your denial changes nothing. Again, who said anything about interbreeding. We aren't talking of horses being something else and humans being something else, we are talking of the lineages that led to humans and horses having diverged from a common ancestor, nowhere does common descent postulate that all extant organisms have an MRCA.

Every sequence can produce something functional, you need to show me that the extant functional space in extant organisms couldn't have arrived through mutations, to do this you need to calculate the total number of instances mutagenic events have taken place and the total proportion of functional elements produced using that.
Yes it can produce it. But that's not the point. The point is that simply by duplicating the whole genome you did NOT add new functional information.


You added enough material for more to develop, this natural process explains the need for new raw material before mutation can add functionality.

You answered nothing with the support of peer reviewed papers or citations.
So what? That's an argument from authority anyway.


I cannot present the requisite evidence so blah blah I'll call it an argument from authority and continue to assert that my assertions are correct therefore you lost LOL blah. Yeah fucking right.

Show me the designer and you can directly attribute design as a privileged explanation.
That would defeat the purpose of design detection.


And eliminate the need for flawed methods such as inference.


Shallit and Elsberry go on to state that whenever chance or law cannot yet account for all variables, design is to be preferred according to Dembski. This is a flawed approach.
Why is it flawed?


Still pretending, haven't read the paper, ey?

Intelligent causes have to exist to contribute to the processes they guide. Mutations exist, your designer has no evidence for existence.
I'm not invoking any specific designer. I'm invoking intelligence. That exists.


You have no evidence for design existing without designers either.

Books are not peer reviewed, they do not pass through the stringent editorial policies of scientific journals.
Design Inference was peer reviewed.


And the citation confirming this is?

That is where the error apparently is, did you not read the fecking paper?
I did read the paper! And I explained why it's wrong. There is no reason to take the exponent of the 10^120. You take its log function. Why would we take the exponent? Why? I'm listening.


Apparently the contention is that 10^120 is already a logarithmic value, you don't have to take it again, did you read that paper?
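For anyone following along, the arithmetic of that contention is easy to check; the only figures used below are the two numbers already quoted in the thread, and the interpretation (that 10^120 is itself a logarithmic quantity) is the paper's, not something I am adding.

import math

bit_operations = 10.0 ** 120        # Lloyd's figure, already a count of bit operations
print(math.log2(bit_operations))    # ~398.6 - what you get if you take a log of it again
# If 10^120 is itself a logarithm, the underlying count of computational histories is on
# the order of 2**(10**120), far too large to write out - and that is the quantity the
# paper argues belongs inside the logarithm, not 10^120 itself.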

, a shift which renders it useless. It basically says we should infer non-randomness if we see any event or pattern that is more unlikely than the entire state of the universe. But clearly, no subset or partial view of the universe can be more unlikely than the universe as a whole. (This conclusion is trivial, and can be proven quite easily without going through Dembski's elaborate constructions. Simply note that the probability of the rest of the universe, independent of the part under consideration, cannot be greater than 1.) Thus it will never be possible to infer non-randomness by this method.
This is wrong because Dembski does not claim that any pattern is less probable than all the states the universe can ever be in during 15 billion years, but than all the states that it COULD HAVE BEEN in. There is a big difference. Not all possible states can be achieved all the time; only some states can. So the relevant number is all the states that the universe was in for the duration of its lifetime.

Since they back their calculations up with independent lines of analysis, and all you do is just assert, your approach fails. Both instances expose the aforementioned error.
And both lines of evidence they present are wrong.

Banal assertion. Multiple failures of a model to fit data from independent lines of thought mean that the model is flawed, and not the independent lines. Also learn that mathematics is axiomatic and deductive; it does not dictate the physical world.

Re: Calilasseia - CREATIONISTS-READ THIS

#213  Postby GenesForLife » Dec 16, 2010 2:31 pm

Царь Славян wrote:
GenesForLife wrote:And yes, if a thing does a new function or acquires functionality it didn't have before, it qualifies as a gain in function.
And as far as the claim of "already there" is concerned, many of the possible configurations of nucleotide sequences in DNA, generated by mutation, do end up being functional. Mutation produces the necessary sequences and an extension of biochemistry based on the configuration of sequences adds function, which permeates through populations and descendants if added functionality is strongly adaptive.

Mutation doesn't have to produce one sequence correctly every time for functionality to be evolved, it can do so in several ways in an unexplored sequence space, the exploration is due to mutation, and the effects of mutation are deterministic, the combination of deterministic effects however is stochastic. To assume things just get put together by blind chance without any determinism involved is nonsense, and this is the basis of the false dichotomy between chance and natural law, when evolution involves features characteristic of both.

And without demonstrating why you just assert they are wrong.
Nobody is arguing with that. It can happen. I'm simply asking you to provide me with relevant probabilistic resources, so we can measure what evolution can do. If you don't then you can't measure what evolution can do. Neither can you say that it produced all the diversity of life we see in living organisms. Because we do not know if it has the relevant resources to do so.


And if we cannot calculate the probabilistic data set because we don't have rigorous data for all of evolution we cannot apply probabilistic analysis. In the absence of complete data probabilistic estimates cannot be used to inform conclusions.

And if you notice what the ToE postulates, it is that evolutionary processes are capable of producing complexity, this is tentatively accepted as any other is in science, if other explanatory processes are found to exist, as in direct empirical evidence for design the scientific consensus will adapt to it.

Specious ex-recto assertions do not count as such empirical evidence, that is it.

Also note that peer-review does not constitute an argument from authority, it constitutes having passed through a minimal set of rigour specific to scientific discourse.

Re: Calilasseia - CREATIONISTS-READ THIS

#214  Postby GenesForLife » Dec 16, 2010 2:42 pm

I also think your use of Abel's FSC is misapplied, here.

A mathematical measure of functional information, in units of Fits, of the functional sequence complexity observed in protein family biosequences has been designed and evaluated. This measure has been applied to diverse protein families to obtain estimates of their FSC. The Fit values we calculated ranged from 0, which describes no functional sequence complexity, to as high as 2,400 that described the transition to functional complexity. This method successfully distinguishes between FSC and OSC, RSC, thus, distinguishing between order, randomness, and biological function.


All they are doing is assigning site-dependent values for achieving functionality in a given protein family, by sequence alignment of phylogenetically grouped, computationally aligned entries.

http://www.tbiomed.com/content/4/1/47
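To make that description concrete, here is a rough sketch of that kind of site-by-site tally, under the simplifying assumptions the method itself makes (site independence, a uniform ground state over the 20 amino acids). The toy alignment is invented, and this is not the authors' exact procedure or dataset.

from collections import Counter
from math import log2

def site_entropy(column):
    counts = Counter(column)
    total = len(column)
    return -sum((n / total) * log2(n / total) for n in counts.values())

def functional_bits(alignment):
    # sum over columns of (log2(20) - column entropy): fully conserved columns
    # contribute about 4.32 bits each, variable columns contribute less
    return sum(log2(20) - site_entropy(col) for col in zip(*alignment))

toy_alignment = ["MKVLAG",
                 "MKVLSG",
                 "MKILAG",
                 "MKVLTG"]
print(round(functional_bits(toy_alignment), 2))   # ~23.6 for this invented alignment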

Abel has not put forth a method to calculate FSC for de-novo gene origination, if you have papers describing how to do this, present it, otherwise FSC calculation cannot be used in the case of the antifreeze glycoproteins arising from ncDNA, because a rigorous model to estimate it for datasets like that is absent.

Also note that Abel has only used it to compare residues in similar proteins, to assign functionality values to residues contained therein, he has not provided methods to compare a class of new biomolecules. Since a wholly new protein sequence is produced when ncDNA is converted to coding DNA there are no usable methods to derive FSC.

Your use of this parameter is unwarranted.

Re: Calilasseia - CREATIONISTS-READ THIS

#215  Postby Rumraket » Dec 16, 2010 3:41 pm

Царь Славян wrote:
Rumraket wrote:Your answer doesn't make sense, please elaborate. What do you mean, results? How about evidence of that limiting mechanism you keep claiming exists, got any?
Simple. If you want to evolve more FSC you need more probabilistic resources; if you want less, you need less.

Argument from bare assertion, again. I'm not surpised. By the way, duplication events provide genetic material for mutations to accumulate. Please provide evidence of a mechanism that would prevent this.

Quote directly where anyone claimed humans and horses could interbreed or shut up. This is getting increasingly hilarious...
It is becoming painfully obvious you haven't got a clue about molecular phylogenetics.
If you claim that humans and horses were once something else and were one species then YOU are the one claiming they were able to reproduce.

Which is directly contrary to the claim that humans and horses could reproduce, since if they were one and the same species, they would be neither human nor horse. In other words, you have constructed and now set fire to your own strawman.
Congratulations.

No it's not. There is no evidence to suggest that all ncDNA should be assumed to have function. Provide evidence to back up your assertion or retract your claim.
And yes I have a better explanation : http://www.talkorigins.org/faqs/molgen/
Read it and educate yourself.
Where did I say that ALL of them have a function?

Where you pretty much assumed it by asking why not, that's where. You even claimed it was the best explanation for ncDNA. It seems you are retracting that claim now. I guess I was right and you really don't know what you think after all.

Wait a minute, above here you just asked why it shouldn't be assumed... but now you admit that it's not? What exactly is your position? Do you even know?
Some of them do, not all.

In other words, your "best explanation" claim was just that, a claim. An argument from bare assertion. Everyone is awestruck by their surprise :o :o :o

Not.

Neither was it a preplanned event with conscious foresight. None of what you said changed the fact that there was genetic change, resulting in phenotypic change and nature had the opportunity to select upon it. The level of goal-post shifting and obfuscation you employ is pathetic.
It was produced by a mechanism. It doesn't count as a simple random mutation like it does when you have a copying error.

The mechanism is stochastic and therefore qualifies as random genetic change. Your endless wibbling about it is not going to change that fact.

Simply claiming it as fact doesn't make it so. Evidence please.
http://en.wikipedia.org/wiki/Genetic_redundancy

I think you lost track of the discussion mate. Here is the original contention :

Царь Славян wrote:
GenesForLife wrote:Blind assertion, show me where in the paper it has been identified as a case of genetic redundancy? this is a blatant falsehood.
I identified it in that way.

So, once again, please cite where in the paper it is identified as a case of genetic redundancy. Linking a random wikipedia article on genetic redundancy is... redundant and irrelevant.
Here is the original paper, again : http://www.plosone.org/article/info:doi ... ne.0000096
Put up or shut up.
And stop linking redundant and irrelevant wikipedia articles.

And I'm asking if you have direct empirical evidence that evolution is guided by an intelligent agent. Do you?
In the case of scientists selecting specific individuals it is guided. And that's intelligent design.

When the scientists aren't doing any designing, only selecting naturally occurring mutants it hardly qualifies as design. Provide evidence of a mechanism that would prevent said selection from taking place in the wild or shut up.

Obviously not since we know the flagellum evolved.
Evolution does not imply it was not designed. Plus, we do not know it evolved.
Do you have any evidence of design taking place in the construction of the flagellum? Do you have any evidence that RM+NS could not produce it?
Why should I have evidence they could not produce it? You still have to show me the relevant probabilistic resources that it could have happened.

No I don't. All I have to do is show you that RM+NS can produce increased fitness and novel function. This has been done. The burden of proof is on you now. Good luck. A nobel prize awaits you.

I'm not evading anything. You are the one ignoring my request for a mechanism that would prevent nature from producing the results achieved in the laboratory.
We first have to agree on a definition. Do you agree that what happened in the lab was a case of intelligent design?

No. Artificial selection does not equal intelligent design, nor does it argue a demand for intelligent design in the selection for functional mutants, especially when selection in nature has been observationally demonstrated. Once again, the burden of proof is on you... that's how science works. A mechanism has been demonstrated to exist and function in the laboratory and in nature; it is now up to you to demonstrate by experiment and observation how this is wrong.

Argument from bare assertion or spurious probability calculation is worthless when faced with observational reality. Get over it.

Of course I can. All I have to do is cite empirical evidence by experiment. We can simply sit our asses down and see evolution happening. You are under the burden of proof here and the burden is to provide a mechanism that would prevent observed evolution from accumulating over deep time. Put up or shut up.
No, you can not. If you claim that RM + NS can do something, then provide the relevant probabilistic resources that will account for that.

Once again, I don't have to engage in your ludicrous probability calculations. We have experimental and observational evidence of RM+NS being able to produce increased fitness, novel function and speciation. The burden of proof is on you. Please provide evidence for a mechanism that would prevent observed evolution from accumulating over deep time.

The evidence shows they evolved. The only way you are getting around that one is by showing evidence that refutes this. Got any?
What evidence? Even if they did evolve, that doesn't mean they were not designed.

By definition they were not designed if they evolved. Artificial selection demonstrates the power of selection on naturally occurring mutations. In order to refute this you must provide evidence for a mechanism that would prevent observed evolution from accumulating over deep time.

Arbitrary calculated limitations are irrelevant when empirical research easily demonstrates the evolution of functional sequences by quite small and mediocre populations of bacteria in laboratory experiments. Please provide a mechanism that would prevent such observed evolution from accumulating over deep time or shut up.
They are not arbitrary, they are estimates of observable phenomena. If you don't agree with them, produce a better model. If you don't, you can't claim that RM + NS can do anything.

I can not only claim that RM+NS can do it, I can show it by reference to peer reviewed literature. Experimental and observed evolution. Stop crying about your useless probability models. Stop sleeping. Get out of dreamland and wake the fuck up to reality.

There you go with that idiotic claim again. Evolutionary postulates relating to molecular phylogenetics are not contingent on a claim of possible interspecies reproduction. Why do you keep bringing this hilarious straw-man up?
So common descent is true even if all species are not related?

Obviously not, and that doesn't even follow from what I said. Are you being intentionally thick again? What is it with you presuppositionalists and your persistent resistance to education? It boggles the mind.

It is theoretically possible for a shuffling, duplication and frameshift event to take place multiple times at every generation of replication, it's unlikely but possible. You can work from there.
Ok, how many would you say can happen in a given time?

Your question is meaningless since it would require knowledge of the average mutation rates of now extinct organisms.
How about you go do some research instead of asking these inane questions? Try googling E. coli doubling time, and searching for the frequency of duplication events and other mutations in the peer reviewed literature for E. coli. It's out there. You can practice your math on that to start with. Have fun.
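To give a flavour of what that back-of-envelope math looks like: every figure below is either a rough, commonly quoted ballpark (a mutation rate on the order of 10^-10 per base pair per replication, a ~4.6 Mbp genome) or an outright invented scenario (population size, generation count), so treat the output as an order-of-magnitude illustration and nothing more.

per_bp_rate = 5e-10       # assumed ballpark point-mutation rate per base pair per replication
genome_size = 4.6e6       # approximate E. coli genome size in base pairs
pop_size = 1e9            # assumed number of cells in one modest culture
generations = 1000        # assumed number of doublings considered

mutations_per_genome = per_bp_rate * genome_size
total_mutation_events = mutations_per_genome * pop_size * generations
print(f"~{mutations_per_genome:.1e} mutations per genome per generation")    # ~2.3e-03
print(f"~{total_mutation_events:.1e} mutation events in this toy scenario")  # ~2.3e+09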

3.5 billion years, minimum.
Great. How many mutation events could have taken place in that time?

Again this question is meaningless since it would require knowledge of the average mutation rates and doubling times of now extinct life. In other words you are getting nowhere.

The maximum number of mutations in theory is staggering. In your own immune system, B-cells are maturing antibodies at a mutation rate of around one million times the one taking place at reproductive replication.
Additionally your calculation would be meaningless since the number of mutation events says nothing about the type or functionality of the mutation. A frameshift or shuffling event would have many orders of magnitude higher capacity for the generation of novel function than a substitution event.
Well, then make the highest possible estimate.

Go do some google-scholar searches.

Because I'm not an idiot I know that your fallacious calculation is structured to argue against the probability of producing a functional protein from scratch with your X number of maximum possible events. If not, then I don't even see how your maximum number of quantum transactions is even relevant. Please elaborate.
It takes into account production of everything, not just proteins.

So basically I was right.


Re: Calilasseia - CREATIONISTS-READ THIS

#216  Postby GenesForLife » Dec 16, 2010 4:04 pm

Also, I need a paper that describes how to calculate FSC for de novo gene origination, with demonstrated case studies.
Even the current model, just applied to aligned homologues, in the words of Abel himself, is imperfect.

In this paper, we have presented an important advance in the measurement of the FSC of biopolymers. It was assumed that aligned sequences from the same Pfam family, could be assigned the same functionality label. Even though the same functionality may not be applicable to individual sites, site independence and significance was assumed and the measured FSC of each site was summed. However, further extension of the method should be considered [12,35]. For example, if dependency of joint occurrences is detected between the outcomes of two variables X3 and X4 in the aligned sequences, then the N-tuple representation of the sequences could be transformed into a new R-tuple YR where these outcomes of X3 and X4 are represented as outcome by a single variable Y3 as shown in Figure 6. An outcome of the two variables in X3 and X4 correspond to a hypercell in YR. A more accurate estimate of FSC could then be calculated. We are currently considering this more general scenario.


Of course, he also goes on to note that Fits (functional bits) are a function of evolutionary conservation, and can only differentiate between residues that contribute to function at a given site in a protein and other residues at other sites in its homologues. In effect one can use it to analyse residues, via phylogenetic alignment, to identify which mutations at which sites are relevant to a given function in a given protein, if that function is preserved. Abel's methods cannot be applied to co-optation, modification or de novo gene origination.

And functional relevance of a particular residue on account of conservation does not imply design, it implies strong positive selection.

Care to use some other position that is not a misrepresentation of what is actually postulated anytime soon?

They also note.

The measurement in Fits of the FSC provides significant information about how specific each monomer in the sequence must be to provide the needed/normal biofunction. The functional information measures the degree of challenge involved in searching the sequence space for a sequence capable of performing the function.


In other words, all it does is associate conserved sequences with functionality, and sequences can be conserved by selection once you climb up a slope because further mutation in conserved loci will be selected against since they are suboptimal to the optimal sequence.

Dembski's methods haven't been empirically tested and the results demonstrated in peer-review, the assertions you make using Abel's work do not hold water strictly due to methodological constraints.

Re: Calilasseia - CREATIONISTS-READ THIS

#217  Postby Rumraket » Dec 16, 2010 4:19 pm

It's interesting to note how often Abel's papers are cited by ID-creationist proponents. I wonder how he feels about that.

Re: Calilasseia - CREATIONISTS-READ THIS

#218  Postby iamthereforeithink » Dec 16, 2010 4:22 pm

:popcorn:

I hope you gentlemen will allow me to bookmark this thread. I'm just an observer. I don't intend to contribute anything.

Re: Calilasseia - CREATIONISTS-READ THIS

#219  Postby GenesForLife » Dec 16, 2010 4:41 pm

Rumraket wrote:It's interesting to note how often Abel's papers are cited by ID-creationist proponents. I wonder how he feels about that.


I also noticed one more thing: FSC is calculated for the monomers of a given protein in a given organism for a given function. Since duplicates of the genes that produce those proteins are free from conservation, are redundant, and are not required for further function, they are free from FSC for that function, and free to mutate to accrue more functions.

Once this happens it can start being selected for other functions, and whenever a substitution is fixed due to functional importance the FSC of that fixed residue will increase, and be selected for; the next fixation will add more to the whole value of the newly functional protein. He has not specified anything for cross-polymer comparisons going from nucleic acid to protein.

The only reason Abel applied FSC to show that abiogenesis which took place with a template couldn't be explained by chance or necessity alone was that Darwinian selection, which fixes residues and sequences, is not applicable to systems until replication ensues.

It appears to me that the attempt to shoehorn that definition to evolution does not work.

He also has this to say.

Functional Sequence Complexity requires this added programming dimension of uncoerced selection at successive decision nodes in the string. Shannon information theory measures the relative degrees of RSC and OSC. Shannon information theory cannot measure FSC. FSC is invariably associated with all forms of complex biofunction, including biochemical pathways, cycles, positive and negative feedback regulation, and homeostatic metabolism. The algorithmic programming of FSC, not merely its aperiodicity, accounts for biological organization. No empirical evidence exists of either RSC or OSC ever having produced a single instance of sophisticated biological organization. Organization invariably manifests FSC rather than successive random events (RSC) or low-informational self-ordering phenomena (OSC).


http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1208958/

The functional sequence space allows for uncoerced selection.
Each site in a given amino acid sequence is one where a substitution can take place.
The decision is made by natural selection and fixation as to whether a fixed residue is conserved or not because of functional acquisition. Which is precisely what Abel did in his paper when calculating residues.

I also note Abel has developed no statistical or mathematical or computational biological methods for evaluating FSC in the other instances he mentions, but given how it is calculated it is simply a property of sequences.

raw sequence + mutation ---------> multiple solutions in sequence space -------> fixation and addition of FSC at conserved locations. Within the remit of mutagenic processes the capacity to gain FSC is present.

The conclusion is that organized biochemical pathways et cetera require functional sequences, and that functional sequences carry functional complexity which can be used to describe functionality, this says nothing about whether evolution can produce them or not.

Re: Calilasseia - CREATIONISTS-READ THIS

#220  Postby GenesForLife » Dec 16, 2010 5:13 pm

Coming to the journal itself, I quote.

Biology has a conceptual basis that allows one to build models and theorize across many life sciences, including medicine and medically-related disciplines. A dearth of good venues for publication has been perceived during a period when bioinformatics, systems analysis and biomathematics are burgeoning.

Steps have been taken to provide the sort of journal with a quick turnaround time for manuscripts which is online and freely accessible to all readers, whatever their persuasion or discipline. We have now been running for some time a journal which has had many good papers presented pre-launch, and a steady stream of papers thereafter. The value of this journal as a new venue has already been vindicated.

Within a short space of time, we have founded a state-of-the-art electronic journal freely accessible to all in a much sort-after interdisciplinary field that will be of benefit to the thinking life scientist, which must include medically qualified doctors as well as scientists who prefer to build their new hypotheses on basic principles and sound concepts underpinning biology. At the same time, these principles are not sacrosanct and require critical analysis. The journal http://www.tbiomed.com promises to deliver many exciting ideas in the future.


http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1208958/

Sort after?

There is no evidence presented for ID proper in this paper, the work presented in the other paper, by the authors' own words, needs to be refined, and in neither of those papers have they used their data to show that natural processes cannot account for FSC at all; in this paper they invoke an "upper probability bound" which is by itself flawed and does not account for selection pressure.

It also appears that nobody has picked up on their work thereon other than IDists who've cited the paper on their own sites.
And their work is only predicated on sequence alignment in protein families based on the protein family database, which rules out, as I said before, duplication and relaxation of functional constraints.
