River Jordan
Active Member
Jan 30, 2014
Wormwood,
From Meyer: "So what kind of information does DNA contain, Shannon information or specified information? Mere complexity or specified complexity? The answer is— both. First, DNA certainly does have a quantifiable amount of information-carrying capacity as measured by Shannon’s theory. Since DNA contains the assembly instructions for building proteins, the gene-expression system of the cell functions as a communication channel. Further, the nucleotide bases function as alphabetic characters within that system. This enables scientists to calculate the information-carrying capacity of DNA using Shannon’s equations . Since, at any given site along the DNA backbone, any one of four nucleotide bases may occur with equal ease, the probability of the occurrence of a specific nucleotide at that site equals 1/ 4. For the occurrence of two particular nucleotide bases, the odds are 1/ 4 × 1/ 4. For three , 1/ 4 × 1/ 4 × 1/ 4, or 1/ 64, or (1/ 4), 3 and so on. 31 The information -carrying capacity of a sequence of a specific length n can then be calculated using Shannon’s familiar expression (I =–log2p) once one computes a probability value (p) for the occurrence of a particular sequence n nucleotides long where p = (1/ 4) n. The p value thus yields a corresponding measure of information-carrying capacity or syntactic information for a sequence of n nucleotide bases. Just as mathematicians and engineers can apply Shannon’s theory to analyze a written text, a cryptographic transmission, or a section of software, mathematical biologists can apply the theory to analyze the information-carrying capacity of a DNA, RNA, or protein molecule."
All that is is a description of how to measure the information capacity of DNA (or any other 4 character based system). But that's not the question at hand, is it? I'm not asking "What is the informational capacity of a genome"; I'm asking "What is genetic information and how are you measuring it". Those are very different questions.
From Meyer: "As I write this sentence, the placement of each additional letter eliminates twenty-five other possible letters and a corresponding amount of uncertainty. It, therefore, increases the information of the sentence by a quantifiable amount as measured by Shannon’s theory . Similarly, at each site along the DNA molecule any one of the four bases is possible. Thus, the placement or presence of any one of the bases eliminates uncertainty and conveys a quantifiable amount of information according to Shannon’s theory."
Now, this would seem to say that "genetic information" = nucleotides, and the addition of any nucleotides to a sequences is "new genetic information". But if that's the case, then all his subsequent arguments about evolution not being able to increase the amount of genetic information are trivially easy to disprove. We see mutations adding nucleotides to sequences every single day. You, I, and everyone here was born with 100-200 mutations that weren't present in our parents. So you and Meyer are left with a choice....stick with this and admit the core creationist arguments are wrong, or move the goalposts. Let's see what Meyer does....
Meyer: "The sequences of nucleotide bases in DNA and the sequences of amino acids in proteins are highly improbable and, therefore, have large information-carrying capacities. Knowing this, some scientists have mistakenly described DNA and proteins as if they contained only Shannon information or possessed mere information-carrying capacity."
Ah, now it's something different! That's a good thing (for him) because as we've seen, relying merely on Shannon Theory means a first year undergrad can easily disprove his arguments. So what is the actual definition of "genetic information" and a method for measuring it? Um.......dunno. :wacko:
The best we have is from pg. 86 where he quotes the following, "an arrangement or string of characters, specifically one that accomplishes a particular outcome or performs a communication function". So is that the definition? That definition isn't tied to probabilities or just any string of letters, but instead relies on function. If so, once again any first year undergrad should be able to shoot this down, as we routinely see evolutionary mechanisms produce new genetic sequences that result in different functions. Also, that brings up "the onion test" question. Basically, the domestic onion's genome is about five times larger than the human genome. Does that mean onions have five times more information than humans? Hmmmm......
So again the question remains....what exactly is the creationist definition of "genetic information" and how are they measuring it? Specifically, if I have two genomes, A and B, how do I tell which genome has more "genetic information"?
The problem is, I do understand all this stuff. I do have the ability to read Meyer's arguments and immediately spot the flaws and holes. So no, I'm not going to base my conclusions on a tribal framework as you suggest. I'm going to look at the arguments and compare them to the data, which is exactly what you're supposed to do in science.
I have two genomes...A and B. How do we tell which one has more "genetic information"?
If you can't answer that, but still maintain the same arguments, then there's something else going on here that has absolutely nothing to do with science, and we probably should stop pretending it does.
From Meyer: "So what kind of information does DNA contain, Shannon information or specified information? Mere complexity or specified complexity? The answer is— both. First, DNA certainly does have a quantifiable amount of information-carrying capacity as measured by Shannon’s theory. Since DNA contains the assembly instructions for building proteins, the gene-expression system of the cell functions as a communication channel. Further, the nucleotide bases function as alphabetic characters within that system. This enables scientists to calculate the information-carrying capacity of DNA using Shannon’s equations . Since, at any given site along the DNA backbone, any one of four nucleotide bases may occur with equal ease, the probability of the occurrence of a specific nucleotide at that site equals 1/ 4. For the occurrence of two particular nucleotide bases, the odds are 1/ 4 × 1/ 4. For three , 1/ 4 × 1/ 4 × 1/ 4, or 1/ 64, or (1/ 4), 3 and so on. 31 The information -carrying capacity of a sequence of a specific length n can then be calculated using Shannon’s familiar expression (I =–log2p) once one computes a probability value (p) for the occurrence of a particular sequence n nucleotides long where p = (1/ 4) n. The p value thus yields a corresponding measure of information-carrying capacity or syntactic information for a sequence of n nucleotide bases. Just as mathematicians and engineers can apply Shannon’s theory to analyze a written text, a cryptographic transmission, or a section of software, mathematical biologists can apply the theory to analyze the information-carrying capacity of a DNA, RNA, or protein molecule."
All that is, is a description of how to measure the information capacity of DNA (or any other four-character system). But that's not the question at hand, is it? I'm not asking "What is the informational capacity of a genome"; I'm asking "What is genetic information, and how are you measuring it?". Those are very different questions.
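For what it's worth, the capacity measure Meyer describes is trivial to put into code. Here's a minimal sketch (plain Python; the function name is just mine) of the calculation he gives, p = (1/4)^n and I = –log2 p, which works out to exactly 2 bits per base no matter what the sequence actually does:

```python
import math

def shannon_capacity_bits(seq):
    """Information-carrying capacity under the equal-probability model
    Meyer describes: p = (1/4)^n, so I = -log2(p) = 2n bits."""
    n = len(seq)
    p = (1 / 4) ** n           # probability of this exact n-base sequence
    return -math.log2(p)       # equivalently, 2 * n

coding = "ATGGCCATTGTAATGGGCCGC"   # a real-looking 21-base fragment
junk = "A" * len(coding)           # a meaningless poly-A run of the same length

print(shannon_capacity_bits(coding))  # 42.0 bits
print(shannon_capacity_bits(junk))    # 42.0 bits -- identical; the measure only sees length
```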
From Meyer: "As I write this sentence, the placement of each additional letter eliminates twenty-five other possible letters and a corresponding amount of uncertainty. It, therefore, increases the information of the sentence by a quantifiable amount as measured by Shannon’s theory . Similarly, at each site along the DNA molecule any one of the four bases is possible. Thus, the placement or presence of any one of the bases eliminates uncertainty and conveys a quantifiable amount of information according to Shannon’s theory."
Now, this would seem to say that "genetic information" = nucleotides, and that the addition of any nucleotide to a sequence is "new genetic information". But if that's the case, then all his subsequent arguments about evolution not being able to increase the amount of genetic information are trivially easy to disprove. We see mutations adding nucleotides to sequences every single day. You, I, and everyone here were born with 100-200 mutations that weren't present in our parents. So you and Meyer are left with a choice....stick with this and admit the core creationist arguments are wrong, or move the goalposts. Let's see what Meyer does....
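Just to spell out why that sinks the "evolution can't add information" argument under this definition: the measure grows with every base added, so any insertion mutation adds "information" automatically. A quick sketch, under the same assumptions as the snippet above (the numbers are made up for illustration):

```python
def capacity_bits(n_bases):
    """Shannon information-carrying capacity under the equal-probability
    model: 2 bits per nucleotide, so it depends only on length."""
    return 2 * n_bases

parent_length = 1000                 # some parental sequence
child_length = parent_length + 3     # child carries a 3-base insertion mutation

print(capacity_bits(parent_length))  # 2000 bits
print(capacity_bits(child_length))   # 2006 bits -- "new information" by this measure
```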
Meyer: "The sequences of nucleotide bases in DNA and the sequences of amino acids in proteins are highly improbable and, therefore, have large information-carrying capacities. Knowing this, some scientists have mistakenly described DNA and proteins as if they contained only Shannon information or possessed mere information-carrying capacity."
Ah, now it's something different! That's a good thing (for him) because, as we've seen, relying merely on Shannon's theory means a first-year undergrad can easily disprove his arguments. So what is the actual definition of "genetic information" and a method for measuring it? Um.......dunno. :wacko:
The best we have is from pg. 86, where he quotes the following: "an arrangement or string of characters, specifically one that accomplishes a particular outcome or performs a communication function". So is that the definition? That definition isn't tied to probabilities or just any string of letters, but instead relies on function. If so, once again any first-year undergrad should be able to shoot this down, as we routinely see evolutionary mechanisms produce new genetic sequences that result in different functions. Also, that brings up "the onion test" question. Basically, the domestic onion's genome is about five times larger than the human genome. Does that mean onions have five times more information than humans? Hmmmm......
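To put rough numbers on the onion test under that same capacity measure (the genome sizes below are my approximate figures, not Meyer's: roughly 3.2 billion bases for humans and roughly 16 billion for the domestic onion):

```python
HUMAN_GENOME_BASES = 3.2e9   # approximate haploid human genome size
ONION_GENOME_BASES = 16e9    # approximate Allium cepa genome size

human_bits = 2 * HUMAN_GENOME_BASES   # ~6.4 billion bits of "capacity"
onion_bits = 2 * ONION_GENOME_BASES   # ~32 billion bits

print(onion_bits / human_bits)        # ~5 -- so onions carry ~5x the "information"?
```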
So again the question remains....what exactly is the creationist definition of "genetic information" and how are they measuring it? Specifically, if I have two genomes, A and B, how do I tell which genome has more "genetic information"?
Huh? Sorry, I don't know what you're talking about here.

Wormwood: "He has about 20 more pages in the book that goes into further detail, and goes on to explain why people like you are highly mistaken in writing of DNA as mere high capacity."

Yet despite all that effort, people like you still can't say what "genetic information" is or how to measure it.

Wormwood: "In any event, his dealing with the issue is incredibly thorough and walks through how information is understood compared to how DNA operates step by step throughout almost an entire chapter of a nearly 500 page book."

I think this is the core of the problem here. You're basically saying "He's a Christian, so you should just trust him". I think it was Christian Juggernaut who expressed the same thing to me earlier...he sees creationists as fellow Christians and scientists as not Christians, so when he encounters technical material that is beyond his understanding, he just naturally sides with the creationist. And here you are basically chastising me for not doing the same thing.

Wormwood: "I just cannot get you River. As a Christian, I would think you would be less quick to attack, discredit and accuse people....especially when they are highlighting the incredible work of God as displayed through the exquisite complexity of living things. It just comes across as bitter. The guy has a PhD in philosophy of science from Cambridge and is a former geophysicist and college professor. Maybe you can pretend the guy might know a thing or two before asserting yourself as more knowledgeable on the subject than he is. Or all his claims are entirely baseless."

The problem is, I do understand all this stuff. I do have the ability to read Meyer's arguments and immediately spot the flaws and holes. So no, I'm not going to base my conclusions on a tribal framework as you suggest. I'm going to look at the arguments and compare them to the data, which is exactly what you're supposed to do in science.

Ok then, we have genome A and genome B. How do we tell which has more "genetic information"? At this point, it really is that simple.

Wormwood: "Clearly information is quantifiable and it is quantifiable in DNA."

To be honest with you, that's just bizarre. The only thing I can figure is that you just aren't familiar with how things work in science, and that refusing to define terms that are central to your arguments is pretty devastating. Instead, you really do seem to be exactly the sort of person Meyer wrote his book for....a conservative Christian with little science background, who will give him the benefit of the doubt simply because he's one of you. As long as his message is "Our interpretation of the Bible is right and those scientists are all wrong", you'll just eat it up unquestioningly.

Wormwood: "This is nothing more than a means of obfuscating the discussion. 'Well he cant define information or complexity, therefore he's making things up.' Actually he spends about 100 pages defining both but it doesn't fit your cookie cutter argument that you learned to dismiss all intelligent design claims with a magic wave of your 'define information' response."

???????? I told you I read his book and I compared it to the data in the literature and the data I'm personally familiar with. Was I not supposed to do that? Are you once again saying I should have just accepted everything he wrote because he's a Christian?

Wormwood: "You are just like the creationists you despise that do no real research but are quick to take some argument you heard from a book, article or website and forever dismiss any and all arguments that you feel are captured in that same topic."

*sigh*

Wormwood: "I didn't notice you posting any scientific questions."
I have two genomes...A and B. How do we tell which one has more "genetic information"?
If you can't answer that, but still maintain the same arguments, then there's something else going on here that has absolutely nothing to do with science, and we probably should stop pretending it does.