
Setterfield and the Speed of Light

Discussion in 'Creation vs. Evolution' started by Administrator2, Jan 22, 2002.

  1. Administrator2

    Administrator2 New Member

    Joined:
    Jun 30, 2000
    Messages:
    1,254
    Likes Received:
    0
    [Administrator: the following is a compilation of a number of threads.]


    DINGO
    I have seen some to and fro happening here about Setterfield’s work on c decay. The following are the facts.
    Setterfield did his work while working as a technician at Flinders Uni. in South Australia. He plotted a series of selected measured values for c over a period of many years.
    He ignored several points which did not support his seemingly preconceived results. (This is called Creation "Science")
    He did not take into consideration the improvement in measurement techniques over the period of his study.
    He did not take into consideration the fact that the value of measured c has been almost identical since said techniques reached a very accurate level.
    This lack of scientific rigour from Setterfield, and the results he has drawn from his work, indicates at best utterly shoddy behaviour, or more likely action very close to scientific fraud.
    He originally published under the FU letterhead, but the University refused to be identified with work of such a dubious nature and insisted their name be removed immediately.
    The work has been totally and comprehensively rejected by all scientific disciplines. To use it as a serious reference is to invite ridicule and scorn.
    However, it has not stopped a number of creationist groups and individuals using it in a misguided attempt to support their cause. I can give examples and I have been at meetings where it was used by creationists who were in the full knowledge of the facts as explained above.


    HELEN
    Well, as Barry Setterfield’s wife, perhaps I am in a position to correct the above post.
    His facts are off.

    Barry never worked at Flinders. In any capacity.

    He plotted every value for c available which had been measured and published, ever. If you know of some values he missed, I am sure he would be interested in knowing about them.

    His points were not preconceived. He was interested in what the data had to show and all data was used.

    He did take into consideration the improvement in techniques. The point that he has made, and a number of others have made as well, is that whenever the SAME method was used (whatever it was) at a later date, the measurement always showed a slower speed of light. The trend was always downward. This was noticed and remarked upon.

    The values of c have remained the same ever since atomic clocks were used to measure c because atomic clocks and c are synchronous. Therefore c could not be anything other than stable as a measurement if it is being measured in accordance with something that is changing at the same rate it is!
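
    [Administrator: a minimal numerical sketch of Helen's point above, under the purely illustrative assumption that an atomic clock's tick rate scales in exact proportion to c. The values below are made up; the point is only that a measurement of c expressed in atomic-clock units would come out constant no matter what c "really" did.]

        import numpy as np

        # Hypothetical "true" values of c (metres per dynamical second), made up for illustration.
        true_c = np.array([4.0e19, 1.0e12, 6.0e8, 2.99792458e8])

        # Assume the atomic clock ticks faster in exact proportion to c.
        clock_rate = true_c / 2.99792458e8      # atomic ticks per dynamical second

        # A lab measures c as distance travelled divided by elapsed atomic-clock time.
        measured_c = true_c / clock_rate
        print(measured_c)                       # always 2.99792458e8 -- the variation is invisible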

    Barry’s 1987 paper was an INVITED white paper by SRI International (Stanford Research Institute). It was written in partnership with Trevor Norman, who headed one of the computer departments at Flinders University at the time.

    Flinders U math department reviewed the paper before it was printed. They did not retract their support until the call from Aardsma informing them that Norman and Setterfield were young earth creationists. It was at that point the paper was disavowed, Norman was threatened with job loss, and Setterfield was denied access to anything at Flinders except the library. They sort of don't like YEC folks.

    What the author above considers shoddy work was defended publicly and in print by professional statistician Alan Montgomery among others. Alan has never met Barry.

    The work has not been rejected by a good many people who are involved in physics, math, and astronomy. Dingo, your statement using a universal is quite adequate evidence that you are simply spouting some kind of rhetoric and haven't the foggiest what has actually been going on.

    Ridicule and scorn get heaped on the strangest people! Galileo, Copernicus, Pasteur, Wegener -- in short 'professional' or mainstream ridicule and scorn means literally nothing. What matters is whether the person is right or wrong in the long run.
    I would love to know what creationist groups are backing the Setterfield model! That would be news to Barry and to many others! Please give with some names, and thank you!


    FOCUSZX3S2
    I always thought that it would be worthwhile for Setterfield and Co. to replicate the early speed-of-light measurements using the same techniques and technology used by experimenters 2 to 3 centuries ago. That would eliminate variations in measurement bias as a possible cause for the supposed observed c-decay. The ICR and CRS certainly have the funding and expertise to replicate the 300-year-old technology needed to repeat those early experiments. Have Setterfield and his colleagues considered doing this?


    HELEN
    I have heard this suggested before and I believe there is a student (university level) working with this right now. I don't know any more than that, except that it seems interesting and I agree with you.

    As for Barry doing it, that would be interesting, too. Right now I know he is preparing for several major speaking engagements coming up in November and trying to master PowerPoint! I think he will be presenting/defending his present work for a while, but if things ever slow down for him and if that one student or someone else hasn't beaten him to it, I'll see what I can do about encouraging him to try some of that.

    Question, though -- and I think of it laughing -- would his detractors accept ANYTHING that he came up with as honest? I sort of doubt it! It would be far better for someone else to do the work with that in mind. Barry has already been accused of so much -- just like the post that started this thread! It's gone so far past ridiculous now that I'm not sure there's an adjective for it.

    For the sake of Barry's reputation, it would be better for someone else to do the work, I think! But still it would be fun to do just for his own curiosity, maybe...


    MILAN
    If somebody wishes to find a detailed discussion of Setterfield's mistakes, there is an excellent article in http://www.talkorigins.org/faqs/c-decay.html


    JOHN PAUL
    Barry Setterfield answers Robert Day http://www.trueorigin.org/ca_bs_02.asp

    Ya see, Milan, it is not enough for one person to find alleged mistakes with someone else's work. You should present it to that person, or else the rebuttal is invisible to the one to whom it matters most.


    THINK PLEASE
    While that [The Day article on t.o.] was relevant at the time, I don't believe that the T.O. work is relevant to the current iteration of the cDK model. As I said in another thread, if Setterfield's work gets published, and when there is a detailed copy of his work with all of his arguments detailed in a logical and coherent fashion, then I will be interested in seeing what it says. But right now, the current iteration of the model is non-viable because it is not logically presented. It is very much presented in a stream-of-consciousness fashion, with earlier forms of the model getting confused with later forms of the model. It is not something that I think should be considered by anyone to be worthy to be discussed, because of the lack of detail and a lack of coherent structure.


    HELEN
    Milan, the TO article was irrelevant from the time it was written. It is a hatchet piece written concerning some initial findings and a preliminary article presenting them. But they will leave it up at TO as long as there are people gullible enough to believe what they read there. That suits their purposes just fine.

    The 1987 white paper for SRI is here.
    www.setterfield.org


    THINKPLEASE
    Helen, the T.O. article is relevant as long as Setterfield leaves copies of his outdated work on the internet for distribution. Frankly, the more I consider his website, the more I think it would help him if he cleaned it up. The presentation of his work is quite disorganized, and could use some clarification in places. Some of the older work needs to be clarified as such, to limit confusion between older and newer ideas presented.


    PAUL OF EUGENE
    Hi Helen - does Barry still maintain that light speed was faster in the days of living men by a factor of a million times and more?


    HELEN
    The website was given to Barry and set up by Lambert Dolphin. Barry will be visiting the U.S. in November and will try to learn how to work the silly thing then. Right now it is not in his control! And I don't know how to program a website, so I'm no help with this at all.

    Hi Paul. The Setterfield model postulates an initial speed of light about 10^11 or 10^10 times its current speed and decreasing along a Lorentzian curve. I know c is figured by all who have been playing around with it, very much including Barry, to have dropped extremely rapidly at first (at the least). So yes, the earliest men would have been living under those conditions and more.


    NAT
    Helen,
    You mentioned that:
    "I know c is figured by all who have been playing around with, very much including Barry, to have dropped extremely rapidly at first (at the least). "

    This is not generally accepted in the physics community, although it may seem that way due to some of the discussions one may find on Big Bang Cosmology. In fact, a non-constant c is not necessary for the expansion of the early Universe to escape its own gravity. The simple reason is that the other fundamental forces are many, many times more powerful than gravity, but the more sophisticated reason is the warping of spacetime explained in the standard inflationary model.

    Further, you continually bring up that physicists are now having quite a bit of debate about the constancy of c. This debate, however, is not all that supportive of the ideas of Setterfield and other YEC as you suggest. The debate derives from a recent discovery that strongly suggests that the fine structure constant was actually a little different billions of years ago. Since the fine structure constant and c are related, this suggests a changing c. It does not however suggest anything on the order of Setterfield's claims. In fact the measured difference in the fine structure constant was quite small.

    Finally, I, as a physicist, am simply confused on how a massively different c is even possible given the number of things that are intimately related to c. For example - if c were 10^11 times greater 6000 years ago as you and Setterfield are suggesting, then fusion would be 10^22 times more energetic. Given what we know about how our sun fuses hydrogen we would expect a few things if this were true: first, the Earth would have been crispy fried, but let's ignore that for now; second, the core of the Sun would have a rather large amount of lead and other metals that under current models are impossible for a medium-small star like ours to form. Spectral signatures from the Sun would provide evidence of these metals, but we don't see them.

    [Administrator: this link might be informative: http://spaceflightnow.com/news/n0201/10ironsun/ ]

    Now, Setterfield could claim that Einstein's E=mc^2 wasn't always correct and just happens to be right now, but that is a serious stretch of credulity - don't cha think?
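
    [Administrator: a short sketch of Nat's scaling argument above, taking E = mc^2 with the mass m held fixed; note that Setterfield's reply further down instead holds E fixed and lets m vary, which is where the two sides differ.]

        # If c were a factor of 1e11 larger and m were unchanged, the energy released
        # per fusion (or decay) event would scale as c^2.
        c_factor = 1.0e11
        energy_ratio = c_factor ** 2
        print(energy_ratio)    # 1e22 -- Nat's "fusion would be 10^22 times more energetic"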


    DINGO
    Helen. Firstly, I will take your point re Setterfield not being employed by Flinders Uni. It was my belief both he and Norman were working there. It is not really a matter of any importance.
    However, what is clear is Setterfield's work is incredibly shoddy, to the extent that it borders on fraudulent.
    Milan's link makes that abundantly evident and Setterfield's response is at best weak.
    It is noteworthy how strongly real scientists reacted to Setterfield's pap. It was then and is now laughable.
    This has all the hallmarks of typical creation "science" methodology.
    Firstly one knows the answer from the bible.
    Secondly one makes the results fit the answer.
    Thirdly if 2. does not work bend the data until it does.
    Lastly misquote real scientists to support one's case.
    The challenge for Setterfield here is to produce real science with real evidence which will stand up to real analysis and real critical evaluation and peer review and get published in real journals.
    If he can do that, and clearly his maths and analytical efforts to date preclude it, then there is a Nobel Prize awaiting him at the door.
    Setterfield's work was rejected because it was clearly false with dodgy data interpreted in a mathematically illiterate manner to arrive at false results. Apart from those minor points it was a wonderful piece of work!

    RONGOGIABE
    the speed of light is derivable, if you are sufficiently bored, from Maxwell's laws.
    Or so I have been told, never attempted it myself because light is an electromagnetic wave and Maxwell's laws cover that ground, so it seems to make sense.
    That said a changing value for c indicates changing Maxwell's laws. I suspect this would lead to some strange stuff, but maybe not.
    or maybe it's moot, after all Maxwell's laws are predicated on an inertial rest frame (I don't think they would work at say .96c)

    Strange that Maxwell's laws haven't changed much (I hear the first formulations were a bit untidy while still mathematically and physically correct)
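
    [Administrator: a minimal sketch of the derivation Rongogiabe mentions. In Maxwell's equations the wave speed comes out as c = 1/sqrt(mu0*eps0), so a genuinely changing c would show up as a change in the vacuum permeability and/or permittivity -- the quantities Helen and Barry discuss later in this thread. The constants below are the standard values.]

        from math import sqrt, pi

        mu0 = 4 * pi * 1e-7            # vacuum permeability, H/m
        eps0 = 8.8541878128e-12        # vacuum permittivity, F/m

        c = 1.0 / sqrt(mu0 * eps0)     # wave speed from Maxwell's equations
        print(c)                       # ~2.9979e8 m/s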


    PHILLIP
    Helen, Questions for you and these are really serious and not some kind of trap. I'm doing some number crunching.

    Okay, let us assume light has dropped in velocity based on a very abrupt curve. Does anybody have any specific theories, yet, on exactly the speed of light at specific times after the creation such as a chart. My point is at the moment of creation what was the speed. Did it drop by a factor of 100 times in 10 nanoseconds or 10 seconds or 10 hours, etc. etc.? Is there enough theoretical background yet to actually have developed a chart which would indicate the speeds vs. time-from-creation to some point where it settles out at a close proximity to what it is today? You gave me a lot of references and although I am seriously reading them I have had to skim today to find this information and may have overlooked it if it exists.

    It is also your contention that there was a stretching of the universe...when? at the point of creation (or right after)? or sometime in the past at a different point in time? and if so any ideas of the former size? (I think you already gave me your thoughts on actual size today in atomic clock time/speed).

    You do have me interested and I won't rule anything out without studying it. As I told you earlier--My belief is that although I still tend to lean toward "old earth" theory I still believe God could do it all in a day if he wanted to--after all--he is infinitely powerful.


    THE BARBARIAN
    Helen, regarding Setterfield's work. I've read it, too. But scientists in the field haven't accepted it, for a number of reasons, not the least of which is that if the value of c changed enough to make a difference, then there wouldn't be any matter of the sort we see today.

    So either physics is wrong, or Setterfield is wrong. Given that, scientists, even those in the ICR, have concluded Setterfield is wrong.


    HELEN
    For both Phillip and Barbarian, I have Barry Setterfield on the phone right now, my headset is on, and I am taking down what he says verbatim. He is in italics:

    1. Helen, Questions for you and these are really serious and not some kind of trap. I'm doing some number crunching.
    Okay, let us assume light has dropped in velocity based on a very abrupt curve. Does anybody have any specific theories, yet, on exactly the speed of light at specific times after the creation such as a chart.

    Here is a chart I did awhile ago. It is still basically good with only some minor changes needed. http://www.ldolphin.org/harmon2.html
    Just recently the main curve has been determined from astronomy. It is the same curve as astronomers use for the redshift-distance relation, because redshift and light speed are linked in the same way that redshift and the rate of ticking of the atomic clock are linked. Since distance, astronomically, means looking back in time, there is also a direct correlation between astronomical distances and time. Therefore the redshift-distance relationship is the same curve as the speed of light-time relationship, or the relationship between the rate of ticking of the atomic clock and orbital, or dynamical, time. The only difference is the axes would have to be re-scaled. On this basis, the chart which is on the website is shown to be basically correct.



    2. My point is at the moment of creation what was the speed.

    According to the mathematics of the situation, the redshift data indicate that the maximum value for lightspeed was around 4 x 10^11 times its current speed.


    3. Did it drop by a factor of 100 times in 10 nanoseconds or 10 seconds or 10 hours, etc. etc.?

    One minute after creation, the value of c was around 4 x 10^11 times its current speed. The value of c, 127 years after creation, was 4.2 x 10^7 times c now. The actual function is the same as the redshift function. That will give you some idea of how quickly it dropped initially.


    4. Is there enough theoretical background yet to actually have developed a chart which would indicate the speeds vs. time-from-creation to some point where it settles out at a close proximity to what it is today?

    That same graph actually does that.


    5. You gave me a lot of references and although I am seriously reading them I have had to skim today to find this information and may have overlooked it if it exists.

    There are a number of curves in my 1987 Report, but they have been superseded by the new work and the redshift data curve. The paper is in peer review now.


    6. It is also your contention that there was a stretching of the universe...when? at the point of creation (or right after)? or sometime in the past at a different point in time?

    At the point of creation. Further calculations are needed to try to find the length of time during which the stretching took place. It had certainly ceased by the finish of creation week, almost certainly by the end of the fourth day, and it may have even finished sometime early on day 1.


    7. and if so any ideas of the former size? (I think you already gave me your thoughts on actual size today in atomic clock time/speed).

    Creation by God was 'ex nihilo' or from nothing. The original size of the ball of energy which is now the fabric of space is probably a matter of conjecture, but would certainly be quite small in comparison to our current universe.


    8. You do have me interested and I won't rule anything out without studying it. As I told you earlier--My belief is that although I still tend to lean toward "old earth" theory I still believe God could do it all in a day if he wanted to--after all--he is infinitely powerful.

    I certainly agree with that, yes. To add one more thing, the redshift curve has allowed the harmonization of geology, astronomy, and the Scriptural time base in a very beautiful way. After you have read the material on my website, please feel free to email me with questions. [email protected]


    Onto Barbarian's letter:

    1. Helen, regarding Setterfield's work. I've read it, too. But scientists in the field haven't accepted it, for a number of reasons, not the least of which is that if the value of c changed enough to make a difference, then there wouldn't be any matter of the sort we see today.

    That is not relating to my work specifically. That is relating to Barrow and Tipler's work on the Anthropic Principle, and related studies. These studies have tried varying a variety of constants with dramatic effects. Fortunately, those models have absolutely nothing to do with my model, which is based on energy conservation. When energy is conserved in the process of c varying, other associated atomic constants vary in such a way that there are very few observable physical effects, such as the problem that you raise.


    2. So either physics is wrong, or Setterfield is wrong. Given that, scientists, even those in the ICR, have concluded Setterfield is wrong.

    (Barry is laughing) As far as ICR is concerned, the rejection of the work was basically on statistical grounds. The new work with the redshift data takes the development of this way beyond any statistical doubt, as the curve is already defined by other parameters. It is interesting that a number of physicists have indeed looked at my work, to see precisely what has been suggested, and have come to the conclusion that it is a viable option.
    Those holding to the Anthropic Principle approach seem to be unaware that a conservation of energy approach overcomes much of the difficulty.

    The recent paper is based on standard physics, and for this reason, several physicists are recommending publication.
     
  2. Administrator2

    Administrator2 New Member

    Joined:
    Jun 30, 2000
    Messages:
    1,254
    Likes Received:
    0
    THE BARBARIAN
    [Barry Setterfield] As far as ICR is concerned, the rejection of the work was basically on statistical grounds.

    That would make sense, since the original claim was statistical. One of the criticisms Setterfield's work encountered was that the earliest estimate he chose was precisely the current estimate, with an estimated margin of error of 200 km/sec. Which means it could just as well have indicated that the speed of light has been increasing.

    Is it not true that the estimate for the correlation coefficient has had to be twice revised downward, even with the faulty data?

    [Barry Setterfield] The new work with the redshift data takes the development of this way beyond any statistical doubt, as the curve is already defined by other parameters. It is interesting that a number of physicists have indeed looked at my work, to see precisely what has been suggested, and have come to the conclusion that it is a viable option.

    I would be more impressed if even a tenth of the astrophysicists would agree with him on this. The theory has been out there a long time, and it's not doing well.
    Because of the seemingly insurmountable problems that would be caused by a significant change in c, it seems unlikely that this one will ever be accepted.


    HELEN
    First of all, the article ICR published was by Gerald Aardsma in 1987. It was faulty, and recognized as such by a couple of people who saw what he was doing before it was published. ICR published it anyway. An analysis of the mistakes Aardsma made, as well as a thorough analysis of the way Trevor Norman and Barry Setterfield handled the data in the 1987 Report, was done by statistician Alan Montgomery and physicist Lambert Dolphin and published in Galilean Electrodynamics, a peer-reviewed journal. Montgomery also presented his material at the following International Conference on Creationism, also a peer-reviewed forum. Neither presentation has EVER been refuted, and thus the Norman-Setterfield handling of the data stands professionally. It is generally only the web-gossip junk now which is trying to decry it.

    One of the things the original data showed was that whenever the same method was used to measure the speed of light at a date later than the first measurement by that method, the speed was ALWAYS slower. This involved a number of different methods by a number of very qualified scientists. Yes, we have more accurate measurements now, but if the results had meant nothing in particular, they would have gone up and down around whatever the correct speed was. This is not what happened. The trend was in one direction only -- the speed of light was measured as being somewhat slower each time when the same method (and often by the same scientist) was used. This is a primary point, regardless of the accuracy, to whatever fraction or decimal place, of the data itself.
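
    [Administrator: a sketch of the kind of same-method trend check Helen describes, using a weighted straight-line fit. The year, speed, and uncertainty values below are invented for illustration and are not Setterfield's data; the question is simply whether the fitted slope is negative.]

        import numpy as np

        # Hypothetical measurements by one method: year, measured c (km/s), quoted uncertainty (km/s).
        years = np.array([1875.0, 1880.0, 1902.0, 1926.0])
        c_kms = np.array([299990.0, 299910.0, 299860.0, 299796.0])
        sigma = np.array([200.0, 100.0, 80.0, 20.0])

        # Weighted least-squares line; numpy expects weights proportional to 1/sigma.
        slope, intercept = np.polyfit(years, c_kms, 1, w=1.0 / sigma)
        print(slope)    # km/s per year; a negative slope is the claimed downward trend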

    Does it matter if Barbarian is impressed? Not in the slightest, as he has shown a lack of understanding of what Setterfield has been presenting. As a matter of fact, because of the mixup of Setterfield's model with those involved in the Anthropic Principle, a good part of the time Barry's email concerns questions, even from physicists, which center on the AP problems instead of what Barry is talking about.

    The crux of the matter is energy conservation, which means that the constants which have been measured as changing are changing in relationship to each other so that the final result stays essentially the same for the entire universe as well as life on earth.

    It should be noted, as well, that although the Setterfield work has been out for a number of years, the actual model is just now being presented as a cosmological and not just an historical model.

    And, contrary to Barbarian's idea, when energy is conserved there are no insurmountable problems with a changing speed of light. As a matter of fact, the Setterfield model solves quite a few other problems, eliminating, among other things, the 'need' for most of the 'dark matter' which has been postulated to exist. Biological processes are only slightly affected if at all, and matter as we know it remains the same.


    THE BARBARIAN
    Physicists remain unconvinced. And so will the rest of us, until these and other glitches in his theory are explained. When I learned that the earliest observation he included had actually estimated the speed of light to be what it is today (give or take 200 km/sec), that ripped it for me. That datum could just as easily be evidence that the speed of light is increasing.

    What Setterfield has done is make a very high-stakes bet. The highest honors go to the men who overturn existing theories. And basic physics is one of the highest. He's not only suggested major differences in c, he's also claimed a way to get around the problem of such changes messing up all the other constants.

    He'll be remembered either as Alfred Wegener, the meteorologist who overturned conventional geology with his theory of continental drift, or he'll be one of innumerable others with a fixation.

    What he needs is some validating evidence that shows his model works better than conventional physics. http://www.best.com/~dlindsay/creation/speed_of_light.html#claim

    One of the more telling criticisms is that if the speed of light changed, the timing of Cepheid variables (for which the periods are well characterized) should be different at different distances. And they are not.

    To merely assume that the laws of physics changed to prevent the consequences of a change in the speed of light is simply a matter of faith.


    HELEN
    Setterfield's model uses regular physics.


    THE BARBARIAN
    Physicists seem to take issue with the idea that Setterfield uses conventional physics.


    PHILLIP
    Thank you very much, Helen for taking the time to go to the source and provide the information.
    As I told you earlier, I am going to examine this in detail with a true open mind and see what conclusion I can arrive at. I wanted some semblance of a curve so that I could crunch some numbers and compare red shift, distance, etc.

    If I am missing something, tell me, but basically from my skimming I understand that the theory of red shift is NOT based on galactic (or universe) size changes, but the slowing down of "C" which would (and could) provide the required Doppler shift to fool people into thinking it is actually movement between us and the stars when all along it is simply "C" changing--am I getting that part right? I will understand more once I dig in and study each sentence of his theory one at a time. As you well know, it is difficult to absorb technical data of this level like a John Grisham novel.


    HELEN
    Hi Phillip,
    I want to say something first. I really admire you. Your willingness to look at something that at first glance seems too wrong and foreign is much, much better than my attitude has been about certain things in my life. It took me five solid years of reading and challenging to go from evolution to creation. I was as stubborn and angry and defensive as a wildcat.

    Now, here is a little stuff that might help you. At least it helped me understand Barry's work a little better.

    Both the BB and the Bible agree about one thing -- there was expansion at least at some time (if not continuing) of and in the universe. In the BB model, the impetus came internally, from some quantum fluctuation or whatever various folk want to call it. The Bible presents it a bit differently, however. God says he STRETCHED the heavens.

    That term contains a clue. If you stretch something, you invest it with potential energy. The stretched rubber band will fly across the room and hit the cute girl with the blond hair the minute the young lad lets it go. A stretched (blown up) balloon will pop with a loud bang, releasing its energy that way if poked with a pin.

    In each case, we know the potential energy is residing in the material itself -- in the connections between molecules, I think (I haven't even thought about exactly WHERE this potential energy resides, but since most energy above the atomic level seems to reside in bonds, I am just assuming here).

    So, if the heavens -- 'space' itself, were stretched, where is the potential energy stored?

    It appears to be in Planck particles. The initial stretching would have initiated their spin and orientation thus manufacturing both the permeability and permittivity of space itself.

    Gradually, through time, this potential energy is releasing. Barry would have to give you the details you probably want here. But the result of this is that (in accord with E=mc^2), the energy itself results in the increased density of space, as more and more Planck particles and virtual particles flip into and out of existence.

    The rapid release of kinetic energy from the potential energy at the beginning would have produced a very rapid increase in these subatomic bits, thus slowing light equally rapidly at the start. Again, as I think I have mentioned before (I know Barry has said this), the speed of light itself, between particles, remains the same. Light does not get 'tired'. But like a hurdle jumper, the more hurdles you put up in a given distance, the longer time it takes for him to cover that distance, even though, ideally, he can regain his speed between hurdles.

    Thus, light would have dropped in speed quite rapidly at the beginning. But as time passed, the 'urgency' of the release of potential energy would have been reduced. Not as much has remained to be converted. The amounts are still staggering to us, but in relation to the beginning, much, much less.

    There is a correlation with thermodynamics (second law) although please understand that I am NOT saying this is part of thermodynamics. But there is a picture that helps from that field. In regard to the second law, the closer the approach to equilibrium, the slower the exchange rate of heat. In the same sort of way, the closer the fabric of space is to relaxing and releasing the totality of its potential energy, (think of a balloon several days after a party, still stretched, but no more energy locked up in the stretch and it's all floppy and saggy) the more slowly the change from potential to kinetic energy will occur, thus resulting in a real evening out of the data.

    This is what the redshift data helps us see as well, and that is why light speed is not dropping dramatically now, and hasn't in a dramatic fashion for a long time.

    That's the mechanism as closely as I understand it right now. Believe me, though, any mistakes in presentation are entirely mine and if I have said something really off the wall in terms of what you know, please bring it to my attention and I will ask Barry to correct me for you.

    [Administrator: the following is from a separate, but related, thread.]


    RADIOCHEMIST
    In a recent post in this forum, Helen Fryman, perhaps quoting Barry Setterfield, makes the following comment:
    "The way in which c affects radio decay rates is detailed in the 1987 Report,which is on the web, under the heading "Radioactive Decay" involving equations 20-26."

    I have looked into that, and the evidence that ties the speed of light to nuclear decay rates is not very substantial. I quote the above report as follows:

    "As [equations] (3) and (7) apply to nucleons as well as electrons, the velocity, v, at which nucleons move in their orbitals seems to be proportional to c [speed of light]."

    I have a couple of comments. Although it would seem reasonable that the speed of light may enter into the physics of the nucleus, I am not at all sure that it has been proven by anyone that it actually does. I am also not at all sure that it has been proven that orbitals even exist in the nucleus. My understanding of the science of the nucleus is that we only can infer, without much proof, that in some respects, the behavior of the nucleus is similar to that of atomic electrons, in having shells, etc., that are well accepted as existing with respect to orbital electrons. Perhaps Setterfield is better educated (self educated) in nuclear physics than I am, but in respect to that, I note that he does not have a degree in physics, does not have any peer reviewed publications in physics and has not had a career in physics. I believe that much of what he has written is speculation, without any solid foundation. Note also his language in the quote above. "...the velocity v, at which nucleons move in those orbitals seems to be proportional to c".

    SEEMS to be proportional to c? In their posts here, Helen/Barry seem to have no doubt that the decay rates involve the speed of light. I think, even from his own paper, that there is not much evidence of it. Although I would admit that it seems natural that the speed of light could be involved in decay rates. If Helen and Barry can show more evidence of that, then I would like to see it. My impression of nuclear science is that detailed knowledge of the mechanism of nuclear decay and the equations upon which it is based is still not fully known. Certainly, in the decay rate equations that are worked with on a routine basis by scientists, including myself, the speed of light is not involved. I took nuclear physics 25 years ago, so perhaps I am going out on a limb in questioning his basic premises, but nevertheless, if I am wrong, I am sure they will point out in detail where I am wrong.

    My impression of Setterfield's 1987 paper is that it is highly speculative without much of a solid base. There are many jumps of logic that do not seem to be adequately supported with either references or mathematical development. Having said that, I should also say that I do not consider myself a physicist, although I have had some courses on nuclear physics on a graduate level 25 years ago. Perhaps physics has advanced enough in 25 years for Setterfield to say confidently that he knows what is happening in nuclear decay, but I doubt it. Rather his writing strikes me as being the work of a bright student who understands physics somewhat but not enough to make major contributions. The flow of his writing as it involves equations does not involve mathematical proof but rather it is argument by analogy and narrative. This is not very effective in physics, I don't think. The actual basis of his idea that the speed of light has changed can be understood by those without a background in physics. It involves merely the comparison of speed of light measurements over the years and the rather speculative conclusion that the uncertain measurements of 300 years ago are proof positive that the speed of light was higher then than now. The problem with that is the measurements long ago had substantial uncertainty and many think that they were not accurate enough to use for such a purpose. He tries to make a similar case based on changes in the measured rates of radioactive decay since 1900. Those measurements too have uncertainty too great to use for his purposes. My conclusion is that he is building a house on sand, not concrete.


    BARNABAS
    Could you please summarize what you've just said, and this time try English instead of Greek!


    RADIOCHEMIST
    Barnabas, the technical nature of my inquiry requires that I use terms that maybe will not be understandable to the non-specialist. There is no way of getting around that. But I will try to summarize below the gist of my inquiry.

    Barry Setterfield claims that radioactive decay depends on the speed of light. If the speed of light changes, the decay rates will change, according to him. I simply asked him to provide the equations that prove that. In my view, he may not be correct and did not give those in his paper.


    HELEN
    Lambert Dolphin, I think, dealt with what you are looking for, Radiochemist, here: http://www.setterfield.org/cdk/cdkconseq.html

    [Administrator: the following is from a different, but related, thread]


    HELEN
    Barry took some time to respond to the two people who had asked the questions below. I simply typed as he talked, so any misspellings of names or wrong punctuation is certainly from me! Hope this helps.

    Questions are as follows:
    Anyway, I am wondering about the specific equations that Creationists employ to quantify the increase in radioactive decay rates attributed to higher c in the past.
    My basic physics text gives no hints as to how c affects the stability of nuclei other than a general invocation of the strong nuclear force.
    A related question is:
    how does E=mc^2 apply in a c-decay scenario?

    Clearly, if c changes by a dozen orders of magnitude, then either m or E must change as well (by a LOT). Are we going to throw out the conservation of mass/energy, or relativity?

    OK, I guess I actually have 4 questions:
    If radioactive decay rates DID increase by a factor of a hundred million or so, then how are we to get around the increased heat released (beyond the obvious increase due to compressing a billion years of decay into a single year)? If we use a c value that is vastly higher, then, using E=mc^2, the energy released per decay is increased by the square of the increase in c. In other words, if we don't find some mitigating factors, a gram of fast-decaying uranium would release about as much energy as a supernova (well, it would be a lot, anyway).

    From Barry:
    Firstly, as far as E=mc^2, some considerable time was spent in the early days of this analysis determining just how the measured values of the constants were behaving. Some were increasing with time, others were decreasing with time, others again were constant. A number of scenarios were tried before it became apparent that, in any change in the speed of light, energy was being conserved. It was on this basis that agreement between experimental values of the various constants and a theoretical approach could be formulated. On this basis, then, with E=mc^2 and energy being conserved, this of necessity means that mass is proportional to 1/c^2.
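
    [Administrator: a small sketch of the bookkeeping Barry describes just above: hold the energy E in E = mc^2 fixed, so that the rest mass must scale as 1/c^2. The past values of c used here are illustrative only.]

        c_now = 2.99792458e8        # m/s
        m_now = 9.1093837e-31       # electron rest mass today, kg

        for factor in (1.0, 1.0e3, 1.0e11):     # hypothetical past values of c / c_now
            c_then = factor * c_now
            m_then = m_now / factor ** 2        # mass proportional to 1/c^2
            print(factor, m_then * c_then ** 2) # E = m*c^2 is the same in every case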

    This is in accord with the experimental data for electron rest masses, measured during the last century. "The customary interpretation [of the E=mc^2 relationship] is that one kind of thing, energy, can change into a completely different kind of thing, mass, and vice versa. … almost like magic [ellipsis in original; it does not mark snipped material]. The ZPF [zero point field] perspective offers a very different view, which is not, of course, tantamount to proof. There is an extremely rapid particle quantum fluctuation that can be attributed to the perturbations of the ZPF. This was named 'Zitterbewegung' (German for 'quivering motion') by Schroedinger, and in a model proposed by Dirac the fluctuations of the Zitterbewegung happen at the speed of light. In his 1989 attempt to develop the Sakharov conjecture connecting the quantum vacuum and gravitation, Puthoff suggested (as others had speculated previously as well) that the kinetic energy associated with the ZPF-driven Zitterbewegung is what provides the energy for the E=mc^2 relation. The real stuff is the energy, E, and, as with inertial mass, it is only our (obstinate) habit of believing that matter must possess mass that leads to our insisting that there must exist a right hand side to this equation, namely, mc^2. In reality, there is no mass, just energy, E, that the quantum vacuum gives in the form of Zitterbewegung. In the same way, there is no inertial mass, just a force that the quantum vacuum gives when an object accelerates. In a sense, this does away with the need for a veritably magical transmutation of energy into matter or matter into energy. In this view, we never get energy by destroying matter. We get energy by liberating some, or all, of the kinetic energy that the quantum vacuum puts into the Zitterbewegung of what are really massless quarks and electrons. Rest mass would really be ZPF energy (or, more generally, quantum vacuum energy) associated with a particle via Zitterbewegung (almost certainly at a resonance)."
    This comes from California Institute of Physics and Astrophysics: Inertia Research http://www.calphysics.org/questions.html

    The way in which c affects radio decay rates is detailed in the 1987 Report, which is on the web, under the heading "Radioactive Decay" involving equations 20-26.

    Regarding heat release with a faster rate of radio decay: heat being electromagnetic radiation, the permeability and permittivity of free space are lower when the speed of light is higher, and these quantities feature in the equations which give the energy density of radiation. In fact, the two effects completely cancel out. The one effect which would be outstanding in the early days would be the effects of short half-life radioactive elements, which could be shown to have heated up the interior of the earth on a reasonably short time-frame.



    Interesting, thanks for the info. So when you plug both the altered c and the altered Planck's constant into the electromagnetic equations (whether you're using Maxwell's equations or quantum electrodynamics) does this mean the changes will always cancel each other out?

    From Barry: yes, it does.


    Also, wouldn't changes in Planck's constant change a lot of other things? For example, the uncertainty principle would look different, wouldn't it? Would this affect the behavior of matter in any way? If you don't know the answer to these questions it's no problem, but if you do I'd be interested to know more about this stuff.

    From Barry: In the same way that mass may be due to the motion of the Zitterbewegung from the zero point fields, the same research indicates that Planck's constant, h, is simply a measure of the strength of those zero point fields.
    This is a natural consequence of the research that is currently being done in this area. An increase in strength of the ZPE inevitably would increase the value of h, and also the Zitterbewegung, and also mass. And, at the same time, this would give rise to greater quantum uncertainty because of the increased jiggling caused by the increased number of electromagnetic waves of the zero point field. This means that when the strength of the ZPE was lower, the quantum uncertainty was less. In other words, quantum uncertainty is increasing with time.

    As far as the effect on matter, I think I answered that just now for the first person.


    Note from Helen: the ZPE and ZPF are pretty well explained by Barry in his theoretic summary for laymen here: http://www.setterfield.org/vacuum.html


    WEHAPPYFEW
    Thanks for being the intermediary, Helen.
    I'm slogging through it myself, now. Of course most of it goes way over my blonde head, as well, but I have a few basic questions that you or your sources might clear up for me. These are along the lines of learning to crawl before running...

    Barry postulates that E is constant, and m varies as 1/c^2. Simple enough...

    But how do we measure E?

    My simple mind defines energy in terms that ultimately contain a mass unit somewhere. How can we measure a constant (E) using varying units (kgs) ???

    Another idea that Barry's 1987 paper brought to mind is a comparison of the best c measurements from the early 20th century versus the best measurements of radioactive decay rates from the same period. The table of decay rates in the paper contains so many question marks and obviously bad numbers that I find it impossible to derive a statistically meaningful trend. Throwing out some of the obviously bad numbers results in the overall trend swinging wildly from positive to negative and back again. Has Barry expanded this section lately and made the kind of comparison I am talking about - hopefully with lots more data?

    Finally, I am not sure about Barry's explanation of radioactive decay energy. First of all, no explanation is necessary if mass decreases by 1/c^2. The energy comes out exactly the same in that case (the problem of how we measure that energy still confuses me, though).

    Nevertheless, Barry goes on about photon intensities emanating from radioactive decay. But most decay energy is released in the form of kinetic energy (in the alpha or beta particle). Wouldn't these be unaffected by the "permittivity and permeability of free space" stuff? So we still have billions of years of radioactive decay compressed into a few hundred years, right?

     
  3. Administrator2

    Administrator2 New Member

    Joined:
    Jun 30, 2000
    Messages:
    1,254
    Likes Received:
    0
    [Administrator: the following is from a different, but related, thread.]


    JOHN2001
    The biggest issue is that Setterfield's claimed variation in the speed of light disappears coincidentally at the same rate as measurement quality improves (this is also true of his other examples in which he claims that other physical quantities are also varying). Indeed, if you throw out all of his data prior to about 1860, you don't really see any change in the speed of light.

    Now, if c were to start changing tomorrow, Setterfield is sitting pretty. He got there first. (A new episode of c change would be compelling evidence.) After the fact, there would then be reason to claim that the speed of light can change.

    As it stands today, the argument that the speed of light is constant is much stronger than any arguments in favor of a varying speed of light.

    Setterfield is not correct to say that his data show "strong" or "convincing" evidence of an exponential change in c. At best we can say that we can hide a change in c of the size Setterfield proposes in the historical data. Indeed, Setterfield's phenomenon is a very fragile one, and not a strongly represented one at all.

    Indeed, the primary flaw in Setterfield's approach is that he is combining data of wildly different precisions. This is always dangerous, and particularly so when so much of the conclusion depends on the data with the least amount of precision.

    We can further see this, in terms of a concept that should be familiar to anybody who took a science class at the college level----significant figures. If we write the modern value of the speed of light, 2.99792458x10^5 km/s as it appears for different numbers of significant figures we have:

    1 sig fig -> 3x10^5 km/s
    2 sig figs -> 3.0x10^5 km/s
    3 sig figs -> 3.00x10^5 km/s
    4 sig figs -> 2.998x10^5 km/s
    5 sig figs -> 2.9979x10^5 km/s
    6 sig figs -> 2.99792x10^5 km/s

    So, from merely increasing the number of significant figures in the measurement we can apparently "lower" the speed of light by the same amount that Setterfield is claiming has actually happened.
    Anyone familiar with significant figures knows that we must have the same number of significant figures in the quantities that we input into the calculation as we expect for the output of those computations.
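
    [Administrator: a small sketch reproducing John2001's table above, rounding the modern value of c (in km/s) to different numbers of significant figures.]

        c_modern = 2.99792458e5     # km/s

        def round_sig(x, n):
            # round x to n significant figures
            return float(f"{x:.{n}g}")

        for n in range(1, 7):
            print(n, round_sig(c_modern, n))
        # 1 300000.0   2 300000.0   3 300000.0   4 299800.0   5 299790.0   6 299792.0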

    If we go through Setterfield's tabulated data at: http://www.setterfield.org/report/report.html
    we can then apply this reasoning to his data.

    1) Roemer-type measurements.

    Even if we have infinite precision for the distance across the Earth's orbit, the number of significant figures in the time measurement will govern the number of significant figures in the calculated values of c.

    We can see from the tabulated data that most likely the delay times are determined to 2 or possibly 3 significant figures for the Roemer-type measurements, and are thus in agreement with the modern value of c to that number of sig figs. Only the measurements in 1861 and 1876 may be thought to possibly approach 4 significant figures. Recall that 4 sig figs corresponds to a timing accuracy of 0.1 seconds, which even in the 1800s would have been via visual observation with a stopwatch. Indeed, there is strong evidence from comparison with the rotating mirror lightspeed determinations that these data do not have more than 3 significant figures of precision. I will discuss this below.
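
    [Administrator: a sketch of how timing precision limits a Roemer-type determination of c. The orbital diameter is treated as exactly known, which is a simplifying assumption; only the timing error is varied.]

        # Roemer-type estimate: c = (diameter of Earth's orbit) / (light's delay across it).
        diameter_km = 2.992e8                       # ~2 AU, treated as exact for illustration
        true_delay_s = diameter_km / 2.99792458e5   # ~998 s

        for timing_error_s in (10.0, 1.0, 0.1):
            c_low = diameter_km / (true_delay_s + timing_error_s)
            c_high = diameter_km / (true_delay_s - timing_error_s)
            print(timing_error_s, round(c_low), round(c_high))
        # the spread shrinks from ~3 significant figures at 10 s to ~4 at 0.1 s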

    2) Bradley (aberration) measurements.

    Fortunately, Setterfield lists plenty of these measurements. While the data are presented as if they were good to 4 or even 5 significant figures, it is apparent from the scatter of the data, particularly in data collected over relatively small intervals of years (compare the measurements of 1889-1898, for example), that the aberration parameter could actually be determined to 3 but only rarely to 4 significant figures. All of these measurements of the speed of light agree with the modern value to 3 significant figures solidly, and many agree with today's figure to 4 significant figures.

    3) Toothed wheel (Fizeau-type) experiments.

    These data suggest that the number of significant figures available is more likely related to the baseline length of the experiment than to time. After 1855 most of the measurements agree to 4 significant figures with today's values.

    4) Rotating mirror experiments.
    Interestingly, all of the values listed agree to within 4 significant figures, and within the stated error bars, with the modern value of the speed of light.

    All of the more modern results Setterfield presents show orders of magnitude of refinement over those of past methods.
    So, basically, Setterfield's stuff is interesting, but inconclusive. The fact that we can make his phenomenon go away by simply considering the significant figures in the data shows that his results are not "strong" results of anything beyond being an historical record of the improvement in our abilities to measure the speed of light.


    RUFUSATTICUS
    Helen,
    Has Barry ever addressed the fact that the measurements taken in the past might not have been accurate? It seems to me that he'd have to verify that those historical measurements were accurate before he could draw any conclusions about a trend. I don't think he'll ever get it published unless he addresses this.

    [administrator: Helen was out of town when this was posted.]


    RADIOCHEMIST
    However it should also be noted that it was the mathematics department at Flinders University in Adelaide, Australia, which co-published the 1987 Report along with SRI. They felt it was quite sufficient.

    Helen,
    Joe Meert contacted SRI to find out if they really did commission or support the 1987 report by Setterfield. The reply denied that it was an SRI report. Apparently a few copies were run off by mistake that had an SRI cover. You may have a different interpretation but it seems that SRI denies any formal involvement with the report. Even the report itself says that it was requested by an SRI staff member, and not that it was an official SRI report.
    That apparently is the extent of any involvement with SRI.


    HELEN
    The paper was an invited white paper by SRI senior physicist Lambert Dolphin. With his permission I am quoting an email he sent me some time ago explaining what happened at that time:

    >
    >I want to fill in some details regarding the fateful year of 1987: You are welcome
    >to weave these in to your paper you are writing incorporating Barry's
    >latest comments.
    >
    >I had known of Barry's work for several years when he and Trevor and I
    >decided to publish a joint informal report in 1987.
    >
    >As a physics graduate student at Stanford in the mid '50s I was aware of
    >the historic discussions about the constancy of c and other constants.
    >Across the Bay, at UC Berkeley, in our rival physics lab Raymond T. Birge
    >was at that time well known and respected. I knew he had examined the
    >measured values of c some few years before and decided the evidence for a
    >trend was then inconclusive. I also knew there of nothing in physics that
    >required c to be a fixed constant.
    >
    >Therefore the Setterfield and Norman investigation of all the available
    >published data seemed to be a most worthy undertaking.
    >
    >I was a Senior Research Physicist and Assistant Lab Manager in 1987 and in
    >the course of my work I often wrote White Papers, Think Pieces, and
    >Informal Proposals for Research--in addition to numerous technical reports
    >required for our contracted and in-house research. I could initiate and
    >circulate my own white papers, but often they were written at the request
    >of our lab director as topics for possible future new research. An
    >in-house technical report would be given a Project Number the same as a
    >Research Project for an external client would--provided the internal report
    >took more than minimal effort to prepare and print. Minimal-effort reports
    >were not catalogued.
    >
    >In the case of the 1987 S&N report, I reviewed the entire report carefully,
    >wrote the foreword, approved it all; but the report was printed in
    >Australia, so an internal SRI Project Number was not needed. It was simply
    >an informal study report in the class of a White Paper. Ordinarily it would
    >have circulated, been read by my lab director and been the subject of
    >occasional in house discussions perhaps, but probably would not have raised
    >any questions.
    >
    >Gerald Aardsma, then at ICR in San Diego, somehow got a copy of the report
    >soon after it was available in printed form. He did not call me, his peer
    >and colleague in science and in creation research, a fellow-Christian to
    >discuss his concerns about this work. He did not call my lab director, Dr.
    >Robert Leonard, nor the Engineering VP, Dr. David A. Johnson over him--both
    >of whom knew me well and were aware of the many areas of interest I had as
    >expressed in other white papers and reports. Dr. Aardsma elected to call
    >the President of SRI! In an angry tone (I am told) he accused the Institute
    >of conducting unscientific studies. He demanded that this one report be
    >withdrawn. Aardsma then phoned my immediate colleague, Dr. Roger Vickers,
    >who described Aardsma as angry and on the warpath. Vickers suggested that
    >Aardsma should have phoned me first of all.
    >
    >Of course the President of SRI asked to see the report, and checked down
    >the chain of command so he could report back to Aardsma. There was no paper
    >trail on the report and my immediate lab director had not actually read it,
    >though he had a copy. Since the report had no official project number it
    >could not be entered into the Library system. Finally the President of SRI
    >was told by someone on the staff that ICR was a right-wing fundamentalist
    >anti-evolution religious group in San Diego and should not be taken
    >seriously on anything! ICR's good reputation suffered a good deal that day
    >as well as my own.
    >
    >On top of this, major management and personnel changes were underway at the
    >time. An entire generation of us left the Institute a few months later
    >because of shrinking business opportunities. Our lab director, Dr. Leonard
    >and I left at the same time and the new director, Dr. Murray Baron decided
    >that any further inquires about this report should be referred directly to
    >me. There was no one on the staff at the time, he said, who had a
    >sufficient knowledge of physics to address questions and we had no paying
    >project pending that would allow the lab to further pursue this work. So
    >the report should not be entered into the Library accounting system.
    >
    >I next phoned Gerald Aardsma--as one Christian to another--and asked him
    >about his concerns. I told him gently that he had done great harm to me
    >personally in a largely secular Institution where I had worked hard for
    >many years to build a credible Christian witness. He seemed surprised at my
    >suggestion that out of common courtesy he should have discussed this report
    >with me first of all.
    >
    >Aardsma told me that he could easily refute the claim that c was not a
    >constant and was in fact about to publish his own paper on this. I
    >subsequently asked my friend Brad Sparks, then at Hank Hanegraaff's
    >organization in Irvine, to visit ICR and take a look at Aardsma's work. Brad
    >did so and shortly afterward reported to me not only that Aardsma's work was
    >based on a faulty misuse of statistical methods, but furthermore that
    >Aardsma would not listen to any suggestions or criticisms of his paper.
    >
    >It was not long after, while speaking on creation subjects in Eastern
    >Canada, that I became good friends with government statistician Alan M.
    >Montgomery. Alan immediately saw the flaws in Aardsma's paper and began to
    >answer a whole flurry of attacks on the cDK hypothesis which began to
    >appear in the Creation Research Society Quarterly. Alan and I
    >subsequently wrote a peer-reviewed paper, published in Galilean
    >Electrodynamics, which pointed out that careful statistical analysis of the
    >available data strongly suggested that c was not a constant, and neither
    >were other constants containing the dimension of time. At that time
    >Montgomery made every effort to answer creationist critics in CRSQ and in
    >the Australian Creation Ex Nihilo Quarterly. All of these attacks on CDK
    >were based on ludicrously false statistical methods by amateurs for the
    >most part.
    >
    >The whole subject of cDK was eventually noticed by the secular community.
    >Most inquirers took Aardsma's faulty ICR Impact article as gospel truth and
    >went no further.
    >
    >To make sure all criticisms of statistical methods were answered,
    >Montgomery wrote a second paper, presented at the Pittsburgh Creation
    >Conference in 1994. Alan used weighted statistics and showed once again
    >that the case for cDK was solid--confidence levels of the order of 95% were
    >the rule.
    >
    >Both Alan and I have repeatedly looked at the data from the standpoint of
    >the size of error bars, the epoch when the measurements were made, and the
    >method of measurements. We have tried various sets and sorts of the data,
    >deliberately being conservative in excluding outliers and looking for
    >experimenter errors and/or bias. No matter how we cut the cards, our
    >statistical analyses yield the same conclusion. It is most unlikely that
    >the velocity of light has been a constant over the history of the universe.
    >
    >In addition to inviting critiques of the statistics and the data, Alan and
    >I have also asked for arguments from physics as to why c should not be a
    >constant. And, if c were not a fixed constant, what are the implications
    >for the rest of physics? We have as yet had no serious takers to either
    >challenge.
    >
    >Just for the public record, I have placed our two reports on my web pages,
    >along with relevant notes where it is all available for scrutiny by anyone.
    >I have long since given up on getting much of a fair hearing from the
    >creationist community. However I note from my web site access logs that
    >most visitors to my web pages on this subject come from academic
    >institutions so I have a hunch this work is quietly being considered in
    >many physics labs.
    >
    >In a recent defense of this whole thesis, (see
    >http://ldolphin.org/speedo.html) Chuck Missler says this, "It will take
    >some time for the Setterfield Hypothesis to be proven acceptable, but it is
    >extremely provocative and would dramatically alter our concepts concerning
    >the physical universe."
    >
    >Lambert Dolphin
    >http://ldolphin.org/
    >[email protected]


    Brad Sparks had some corrections to this when Lambert sent it to him. The corrections are as follows. Lambert’s comments are preceded by arrows from him:

    >
    >However, my inquiry into Aardsma's attack on Setterfield and you long preceded
    >my 1992-4 tenure at CRI by YEARS. The incident with Aardsma calling the SRI
    >President was in Aug or Sept 1987, right after the report came out (wasn't it
    >like just before everyone was leaving for Labor Day holiday? vague
    >recollection of something like that). We talked about it by phone at the time.
    >I had already given a "brown-bag" lunchtime slide presentation of my Exodus
    >work at ICR in July 1987 at Aardsma's invitation. I happened to be down at
    >ICR again around Christmas 1987 and that's when I dropped in to see Aardsma
    >and asked him what happened with you and SRI. He just unashamedly said he had
    >called the SRI President to complain about the publication of the Setterfield-
    >Norman report. I asked him why he had called the President instead of your
    >supervisor or you directly with scientific arguments, I didn't understand. He
    >replied that it wasn't a scientific issue but a matter of ethics, that it was
    >unethical for SRI to publish the report in violation of SRI policies and it
    >thus misled people that it was officially sponsored by SRI and so therefore he
    >felt justified in complaining to the SRI President. I asked if this was a
    >proper complaint under Biblical principles -- Aardsma was taken aback as if
    >he'd never considered that before, but replied that he still felt justified
    >because you and Setterfield were misleading Christians about SRI sponsorship.
    >I dropped the matter at this point.

    ----> I expect this is accurate and it sheds new light on things I was not
    aware of. The report was in no way out of the ordinary and it was a normal
    white paper sort of thing. How did he know what SRI's policies were? Even
    if the report was not an officially published SRI report, it was one of
    any number of white papers that for years have been published in the same
    way. Why should Aardsma appoint himself as judge over me, over SRI, over
    Barry's work?


    >
    >I had already discussed scientific issues with Aardsma just before this
    >conversation. In fact it all started with me noticing a computer graph posted
    >on Aardsma's wall of a scatter plot of "c" values with time. Aardsma proudly
    >asserted that this graph proved that the scatter of "c" values was about
    >equally distributed both above and below the current "c" value. I was aghast
    >as I knew this was not true and I noticed right off that because the scaling
    >was biased, many of the above-current-"c" points were hidden by being
    >superimposed on each other. The above-"c" points couldn't be seen because the
    >scale was unfairly compressed. I believe this same graph was shortly
    >thereafter published by ICR as an Acts & Facts Impact article (Jan 88 ?
    >issue). I pointed out this biased scaling to Aardsma and he said he would
    >look into it but the graph was about to be published so he said it was
    >unlikely he could or would do anything about it.

    ----> Yes, I think this is on target.

    Lambert


    In addition, on another webpage debate, a physicist calling himself “Lucas” challenged Alan Montgomery’s defense of the Norman-Setterfield statistical handling of the data in the 1987 paper. Montgomery is a professional statistician. Here are the two responses from Alan which were posted at that time in response to “Lucas.”

    First:

    More Analysis less Skepticism

    Lucas in "A tiny proper subset of the problems with Setterfield Dolphin"
    mentions my statistical analysis. He claims that my MSSD tests are
    "worthless". He does not quote which published study he refers to so I will
    assume that it is the one published in Galilean Electrdynamics Vol. 4 No.
    5. The MSSD statistic measures the ratio of the sum of successive
    differences over the sum of the differences with the mean. The statistic
    has an expected value of 2 when successive data values are on the same side
    of the average approximately 50% of the time, such as would happen in a
    Normal error distribution. In data where there is a trend, successive data
    will lack this 50% side change characteristic. Ideally data should
    originate from the same symmetric distribution. However, data requiring
    analysis frequently lacks the required theoretical characteristics. In such
    cases, the practitoner must assess whether data is sufficiently
    well-behaved for the test to yield a meaningful result.
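
    As an illustration only (this is a sketch, not Montgomery's actual code), the two statistics just described can be computed as follows; the c values shown are invented placeholders, not the historical measurements:

    # Sketch of the MSSD (von Neumann) ratio and a simple run count.
    # For independent, trend-free data the MSSD ratio is close to 2;
    # a trending series pushes it well below 2 and produces few runs.

    def mssd_ratio(values):
        n = len(values)
        mean = sum(values) / n
        succ = sum((values[i + 1] - values[i]) ** 2 for i in range(n - 1)) / (n - 1)
        var = sum((v - mean) ** 2 for v in values) / (n - 1)
        return succ / var

    def runs_about_mean(values):
        mean = sum(values) / len(values)
        signs = [v >= mean for v in values]
        return 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)

    # invented, illustrative numbers only
    c_values = [299793.1, 299792.9, 299792.8, 299792.6, 299792.5, 299792.5]
    print(mssd_ratio(c_values), runs_about_mean(c_values))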

    I concluded that the results were meaningful for the following reasons:

    1) The results of the MSSD, a parametric test (using the values of the data),
    were the same as those of the Run Test, a non-parametric test (using
    the data's rank or position, which lowers the effect of outliers and
    extreme data problems).

    2) When the MSSD result showed a high likelihood of a trend in the c data and
    its significant subsets, the corresponding linear regression showed a decrease.

    3) When the MSSD was applied to other experimental physical data, such as
    the charge of the electron, with the same problem of inhomogeneity, the
    results agreed with the assumed constancy.

    Lucas fails to mention that constants not related to c
    still show constancy via the MSSD, but those "constants" dependent on the
    speed of light show up as variable. The Run Tests are almost as
    consistent.

    Some critics (as opposed to skeptics) were still unconvinced by this
    analysis and withheld judgement. This is understandable. I understand
    conservatism in science. It can be a hindrance but it can also be a safety
    net. However, I do not understand the mentality of some who, like Lucas, are
    intent on dismissing this evidence entirely. My paper showed analysis by
    intervals of 20 years, by accuracy, by method and by precision. These
    subsets were consistent with the MSSD and Run test results. The congruency
    of these results makes it difficult to argue reasonably that they are
    meaningless.

    There is another point made by Lucas in his discussion with Pim which the
    audience may not comprehend.
    Since the size of the error bars decreases with time, they are not randomly
    distributed, which is a requirement of the standard weighted regression
    technique. This technique was used by Aardsma in his paper in the CRSQ and
    was objected to by Setterfield for exactly this reason. If a non-weighted
    regression is a bucket of ... and a weighted regression is a "pail of
    urine", pray tell what could be applied to the data which would handle the
    difficulties of this data?

    In my 1994 paper to the International Conference on Creationism, entitled "A
    determination and analysis of the appropriate values of the speed of light
    to test the Setterfield hypothesis", the referees were
    decidedly hostile to the Setterfield hypothesis and it was with some strain
    that it was published. One reason for pursuing this paper was the rigid
    insistence of Aardsma and others on using all 163 of Setterfield's data
    points despite the inappropriateness of many. Minimal criteria for
    inclusion in the analyzed data were:

    1) All data should be in vacuo;

    2) All data should be experimental;

    3) Only one datum per experiment;

    4) Cannot be an outlier;

    5) Must belong to a method whose data variance is sufficiently small to
    show that a trend of the expected size would be statistically significant.

    Point 5 was required because Standing Wire values actually gave a linear
    regression line of -3.1 km/sec/yr
    as opposed to the expected size of -4.7 km/sec/yr. But even if the size of
    the decrease was closer to the expected value than to zero, it was
    statistically insignificant by a wide margin. The results from such data
    cannot be regarded as meaningful in the context of the hypothesis being
    tested.

    The regression technique used in the paper was one recommended by Neter and
    Wasserman in "Applied Linear Statistical Models", pp. 131-4. They give an
    example where the standard deviation of the error is proportional to the
    independent variable X. This is exactly the problem Lucas refers to as
    invalidating results from a simple regression or weighted regression line.
    The technique yielded positive results for trends in all dynamic data, with
    Roemer and without Roemer data, at the 99.95 percentile; non-aberration
    data at the 99.95 percentile; post-1945 data at the 99.0 percentile; aberration
    data at the 97.5 percentile. Laser data were not significant. A new data set was
    created by reducing multiple values in one year to a weighted average value.
    It was significant at the 99.95 percentile. These regressions were tested for
    autocorrelation by the Durbin-Watson test. None was found. Thus these
    regression lines qualify as a statistical model.
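
    By way of illustration only (a sketch of the general Neter and Wasserman technique, not the actual analysis of the c data), the transformed-variable regression amounts to dividing the model through by X so that the error variance becomes constant:

    import numpy as np

    # If y = b0 + b1*x with error standard deviation proportional to x,
    # then y/x = b1 + b0*(1/x) + (error with constant variance), so an
    # ordinary least-squares fit of y/x against 1/x recovers b0 and b1.
    def transformed_variable_fit(x, y):
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        A = np.column_stack([np.ones_like(x), 1.0 / x])   # columns: [1, 1/x]
        coef, *_ = np.linalg.lstsq(A, y / x, rcond=None)
        b1, b0 = coef                                      # slope, intercept
        return b0, b1

    # invented numbers, roughly y = 12 - 2x with noise growing with x
    x = [1.0, 2.0, 3.0, 4.0, 5.0]
    y = [10.2, 8.1, 6.3, 3.9, 2.2]
    print(transformed_variable_fit(x, y))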

    Analysis was done by intervals of 20 years using the t-statistic, by accuracy
    in steps of 200 km/sec using the binomial statistic, and by error bar size for
    various values using the t-statistic on the average and the slope. These tables
    show results consistent with a trend and inconsistent with constancy. Of 46
    tests run on the sub-analyses, only 4 were in the 25-75 percentile range, where
    one would expect 23 values. Between the 33-67 percentiles there was one. These
    values were more easily explained by some obvious systematic errors than by
    any tendency to remain constant.
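
    As a rough sketch of the interval idea only (invented numbers, not the data set used in the paper), a 20-year binning with a one-sample t statistic against the modern value looks like this:

    import math
    from collections import defaultdict

    C_NOW = 299792.458  # km/s, the modern defined value

    def interval_t_stats(years, values, width=20):
        # group measurements into 20-year bins and compute, for each bin,
        # t = (bin mean - C_NOW) / (bin standard error)
        bins = defaultdict(list)
        for yr, v in zip(years, values):
            bins[(yr // width) * width].append(v)
        out = {}
        for start, vs in sorted(bins.items()):
            n = len(vs)
            if n < 2:
                continue
            mean = sum(vs) / n
            sd = math.sqrt(sum((v - mean) ** 2 for v in vs) / (n - 1))
            out[start] = (mean - C_NOW) / (sd / math.sqrt(n))
        return out

    # invented, illustrative numbers only
    years  = [1880, 1885, 1895, 1902, 1910, 1915, 1926, 1932, 1935]
    values = [299910.0, 299870.0, 299860.0, 299820.0, 299810.0, 299805.0,
              299798.0, 299796.0, 299794.0]
    print(interval_t_stats(years, values))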

    Let me conclude by saying that in the audience at my presentation were
    several who are notably hostile to the change in the speed of light,
    including Aardsma and Chaffin. None of those present has voiced any
    concern over the statistics or the analysis, but they continue to remain averse
    on theoretical grounds. If the paper is flawed, I have heard nothing from
    critics which would indicate it. I remain convinced of the viability of the
    hypothesis and insist that those who want to argue the statistics do so
    from the perspective of a decent analysis of reasonably selected data.

    Alan Montgomery


    Second:

    Lucas states: "No one disputes that the speed-of-light data have a trend. No
    one. Ever. One doesn't even need to use fancy statistics to see it. It's
    clear from eye-balling the data and the trend in reported experimenter
    errors."

    I wish this were true, but there are stubborn young Earth creationists who
    do dispute it, have written articles to that effect, and have
    had them published. At the time the articles were written there were more
    than a few, due to the influence of Aardsma's weighted regression line, which
    was given an exaggerated interpretation. I think that in that context the
    MSSD tests clearly demonstrate that those who supported such a position were
    without sound justification. Even after these tests were published there
    have still been those who continue to believe what is obviously not true.
    If Lucas accepts the existence of a trend then we agree. We disagree only
    about why the trend exists.

    Humphreys in CRSQ 1988 Volume 25:1 p 40-45 took the position that the trend
    was merely apparent; that higher early values had to come down as the
    precision of the measurements increased; and that after a certain point the
    values became constant. Experimentally, this can happen but with regard to
    c it is a speculation which has yet to be demonstrated. In my 1993 paper in
    Galilean Electrodynamics I attempted to find any biases caused by
    experimenters or experiments. I found nothing. I have yet to hear anyone
    suggest more reasonable techniques to find such a phenomenon.

    There was a physicist in Israel who, in response to my article, made the
    argument that the conclusion was undermined by the fact that the c values
    were not all from the same distribution. I consulted a colleague, Tom Goss
    (Ph.D.), who is a partner in a consulting firm specializing in statistical and
    methodology studies for business and government. His comment was that
    theoretically this was true but my analysis was strong enough to leave such
    a phenomenon as only a remote possibility. The technique described in
    Applied Linear Statistical Models (Neter and Wasserman) which I call a
    transformed variable regression is such a test which can be applied to data
    with the error related to the independent variable. When the results from
    the "transformed varible regression" were shown to this physicist he agreed
    that I had met the statistical burden required by his criticism. He did not
    admit that his mind had changed. He merely switched to theoretical
    arguments. Lucas may be right in saying the MSSD does not necessarily test for
    trend in inhomogeneous data if that statement is taken in isolation, but
    this disconnects my statistical test from the accompanying analysis. Since
    my conclusions were based on both it is incorrect to separate the work into
    compartments in order to deny the conclusion.

    Question: Does Lucas agree that the transformed variable regression is one
    which is appropriate to apply to data where the errors are dependent on
    the random independent variable (error bars decrease with time)?

    If so, does he accept the results from my 1994 paper published in the
    Transactions of the Third International Conference on Creationism? It
    would appear from his response that he does not. He states, "You don't know
    how the error varies with the independent variable." Let me refer back to Neter
    and Wasserman, page 136, note 4: "To obtain information as to the nature of
    the relation of the error to X (the independent random variable), one may
    divide the residuals into 3 or 4 groups of equal size according
    to the ascending or descending size of X and calculate the sum of the
    squares of residuals for each group." The results may not yield an
    immediately obvious relationship - this is true, but getting back to the
    case at hand the c data did yield a linear relationship as judged by a
    simple regression line of significant slope and fit. This is in my 1994 ICC
    paper. (see Lambert Dolphin's website http://www.best.com/~dolphin//cdkalan.html)
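
    For what it is worth, here is a sketch (not taken from the paper) of that Neter and Wasserman diagnostic: order the residuals by X, split them into a few equal-sized groups, and compare the sum of squared residuals group by group:

    def grouped_residual_ss(x, residuals, n_groups=4):
        # order the (x, residual) pairs by ascending x, split into n_groups
        # equal-sized groups, and return the sum of squared residuals per group;
        # rising sums indicate the error grows with x
        pairs = sorted(zip(x, residuals))
        size = len(pairs) // n_groups
        sums = []
        for g in range(n_groups):
            chunk = pairs[g * size:] if g == n_groups - 1 else pairs[g * size:(g + 1) * size]
            sums.append(sum(r * r for _, r in chunk))
        return sums

    # invented, illustrative numbers only
    x = [1, 2, 3, 4, 5, 6, 7, 8]
    residuals = [0.1, -0.2, 0.3, -0.5, 0.8, -1.1, 1.6, -2.0]
    print(grouped_residual_ss(x, residuals))   # sums grow with x in this toy example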

    Lucas (problems with Setterfield-Dolphin) proposes a statistic X (sorry I
    have no Greek letters).

    X = [c(experimental) - c(current)] / (experimenter's estimate of experimental error)

    This is, in fact, the t statistic for a sampling of the random variable c
    at the point in time of the experiment. Lucas comments, after showing his
    diagram of his statistic X, "Obviously there is no trend!" By what logic
    would one expect a trend in the t statistic? Only the logic based on the
    assumption that the early data are significantly different from the current
    c, yielding a high t value, and that later data are not significantly
    different, yielding a low t value. The fact that Lucas finds no trend in
    his statistic is indeed an observation against the above assumptions.
    Indeed, his observation that the t statistic appears to be increasing after
    1900 is confirmation that data in the latter part of the 1900s are more
    significant than in the early 1900s. This agrees with my observation that
    aberration values are systematically low and that their occurrence is most
    obvious in the first quarter of the 1900s, causing anomalously low statistical
    values in that zone.
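
    For readers who want to compute it, Lucas's statistic is simply each measurement's deviation from the modern value scaled by its own reported error; a sketch with invented numbers (not the historical data) is:

    C_NOW = 299792.458  # km/s

    def x_statistic(measured_c, reported_error):
        # deviation from the modern value in units of the experimenter's own error
        return (measured_c - C_NOW) / reported_error

    # invented (year, measured c, reported error) triples for illustration only
    data = [(1880, 299910.0, 50.0), (1900, 299860.0, 30.0),
            (1930, 299798.0, 4.0), (1960, 299792.6, 0.2)]
    for year, c, err in data:
        print(year, round(x_statistic(c, err), 2))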

    In addition, if one were to choose a known constant such as e, the charge on
    the electron, and do the same chart, one would find that X decreases
    significantly as one goes back in time, because the value of e changes
    almost not at all while the error gets larger. The two patterns are
    different, and they are different for a reason. That reason is that the
    values of c are trending downwards. This pattern will emerge regardless of
    whether the trend is real. Lucas already admits the trend is there; he only
    contends that it is due to some non-physical reason. X will not solve his
    problem. In his reply to my first posting Lucas admits as much. Only an
    imaginative and thorough analysis will uncover evidence of such an event.

    In conclusion, I humbly submit that I have done work on appropriate values,
    with appropriate statistical methods, validated by in-depth analysis,
    which has been peer-reviewed, published, presented in open forum for
    criticism, and made available on the internet. Many electronic copies have
    been made available to dozens of inquirers. Criticisms have been made and
    answered, and so far the work remains intact. Certainly, nothing Lucas has
    offered persuades me that I need do more. His hypothesis does not reach
    beyond the possible into the reasonably supported probable. And that is
    what we statisticians really do. The reality is that the most reasonably supported
    hypotheses are sometimes found to be wrong for reasons never imagined at
    the time.

    Alan Montgomery


    [ January 23, 2002: Message edited by: Administrator ]
     
  4. Administrator2

    Administrator2 New Member

    Joined:
    Jun 30, 2000
    Messages:
    1,254
    Likes Received:
    0
    [Administrator: The following started out as an astronomical question and quickly centered on Barry Setterfield so it has been placed here and follows.]

    JOE MONTE
    In 1987, an astronomer working at an observatory in Hawaii noticed strange interference on one of the plates he was exposing of the night sky. Unable to figure it out, he went outside and looked up. That's when he saw it.
    Unbeknownst to him, the supernova in the Large Magellanic Cloud became apparent while he was in the observatory!!! According to trigonometric parallax the new astronomical feature was estimated to be over 165,000 ly distant. If the universe is only, say, 10,000 years old then how could this be? Given this logic we could only see supernovae within 10,000 ly, right?


    LATE CRETACEOUS
    One small correction.
    SN1987A was discovered, not in Hawaii but in Chile,
    by Ian Shelton of the University of Toronto, at Las Campanas Observatory, Chile (now relocated to Argentina).


    HELEN
    This [Joe Monte’s post] was emailed to me. Here:

    Question: Regarding the recent research acknowledging the possibility that the speed of light has not always been constant, someone wrote to me: "By the way, there's a pretty easy way to demonstrate that the speed of light has been constant for about 160,000 years using Supernova 1987A."

    Comment from Barry Setterfield: It has been stated on a number of occasions that Supernova 1987A in the Large Magellanic Cloud (LMC) has effectively demonstrated that the speed of light, c, is a constant. There are two phenomena associated with SN1987A that lead some to this erroneous conclusion. The first of these features was the exponential decay in the relevant part of the light-intensity curve. This gave sufficient evidence that it was powered by the release of energy from the radioactive decay of cobalt 56 whose half-life is well-known. The second feature was the enlarging rings of light from the explosion that illuminated the sheets of gas and dust some distance from the supernova. We know the approximate distance to the LMC (about 165,000 to 170,000 light years), and we know the angular distance of the ring from the supernova. It is a simple calculation to find how far the gas and dust sheets are from the supernova.

    Consequently, we can calculate how long it should take light to get from the supernova to the sheets, and how long the peak intensity should take to pass.

    The problem with the radioactive decay rate is that this would have been faster if the speed of light was higher. This would lead to a shorter half-life than the light-intensity curve revealed. For example, if c were 10 times its current value (c_now), the half-life would be only 1/10th of what it is today, so the light-intensity curve should decay in 1/10th of the time it takes today. In a similar fashion, it might be expected that if c was 10 c_now at the supernova, the light should have illuminated the sheets and formed the rings in only 1/10th of the time at today's speed. Unfortunately, or so it seems, both the light-intensity curve and the timing of the appearance of the rings (and their disappearance) are in accord with a value for c equal to c_now. Therefore it is assumed that this is the proof needed that c has not changed since light was emitted from the LMC, some 170,000 light years away.

    However, there is one factor that negates this conclusion for both these features of SN1987A. Let us accept, for the sake of illustration, that c WAS equal to 10 c_now at the LMC at the time of the explosion. Furthermore, according to the c decay (cDK) hypothesis, light-speed is the same at any instant right throughout the cosmos due to the properties of the physical vacuum. Therefore, light will always arrive at earth with the current value, c_now. This means that in transit, light from the supernova has been slowing down. By the time it reaches the earth, it is only travelling at 1/10th of its speed at emission by SN1987A. As a consequence the rate at which we are receiving information from that light beam is now 1/10th of the rate at which it was emitted. In other words we are seeing this entire event in slow-motion. The light-intensity curve may have indeed decayed 10 times faster, and the light may indeed have reached the sheets 10 times sooner than expected on constant c. Our dilemma is that we cannot prove it for sure because of the slow-motion effect. At the same time this cannot be used to disprove the cDK hypothesis. As a consequence other physical evidence is needed to resolve the dilemma. This is done in the forthcoming paper where it is shown that the redshift of light from distant galaxies gives a value for c at the moment of emission.

    By way of clarification, at NO time have I ever claimed the apparent superluminal expansion of quasar jets verify higher values of c in the past. The slow-motion effect discussed earlier rules that out absolutely. The standard solution to that problem is accepted here. The accepted distance of the sheets of matter from the supernova is also not in question. That is fixed by angular measurement. What IS affected by the slow motion effect is the apparent time it took for light to get to those sheets from the supernova, and the rate at which the light-rings on those sheets grew.

    Additional Note, 1/18/99: In order to clarify some confusion on the SN1987A issue and light-speed, let me give another illustration that does not depend on the geometry of triangles etc. Remember, distances do not change with changing light-speed. Even though it is customary to give distances in light-years (LY), that distance is fixed even if light-speed c is changing.

    To start, we note that it has been established that the distance from SN1987A to the sheet of material that reflected the peak intensity of the light burst from the SN, is 2 LY, a fixed distance. Imagine that this distance is subdivided into 24 equal light-months (LM). Again the LM is a fixed distance. Imagine further that as the peak of the light burst from the SN moved out towards the sheet of material, it emitted a pulse in the direction of the earth every time it passed a LM subdivision. After 24 LM subdivisions the peak burst reached the sheet.

    Let us assume that there is no substantive change in light-speed from the time of the light-burst until the sheet becomes illuminated. Let us further assume, for the sake of illustration, that the value of light-speed at the time of the outburst was 10 c_now. This means that the light-burst traversed the DISTANCE of 24 LM or 2 LY in a TIME of just 2.4 months. It further means that as the travelling light-burst emitted a pulse at each 1 LM subdivision, the series of pulses were emitted 1/10th month apart IN TIME.

    However, as this series of pulses travelled to earth, the speed of light slowed down to its present value. It means that the information contained in those pulses now passes our earth-bound observers at a rate that is 10 times slower than the original event. Accordingly, the pulses arrive at earth spaced one month apart in time. Observers on earth assume that c is constant since the pulses were emitted at a DISTANCE of 1 LM apart and the pulses are spaced one month apart in TIME.

    The conclusion is that this slow-motion effect makes it impossible to find the value of c at the moment of emission by this sort of process. By a similar line of reasoning, superluminal jets from quasars can be shown to pose just as much of a problem on the cDK model as on conventional theory. The standard explanation therefore is accepted here.
    http://www.setterfield.org/cdk/cdkconseq.html
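
    For readers following the arithmetic, the illustration above can be restated as a short sketch (it simply re-expresses Setterfield's own example; the slow-down mechanism is his hypothesis, not established physics):

    def slow_motion_illustration(distance_lm=24, c_ratio_at_emission=10.0):
        # distance_lm: source-to-sheet distance in light-months (a fixed distance)
        # c_ratio_at_emission: assumed speed of light at emission, in units of today's c
        traverse_time = distance_lm / c_ratio_at_emission         # months, at the source
        emitted_spacing = 1.0 / c_ratio_at_emission                # months between pulses at emission
        observed_spacing = emitted_spacing * c_ratio_at_emission   # stretched back out in transit
        return traverse_time, emitted_spacing, observed_spacing

    # With c = 10 times today's value: the burst reaches the sheet in 2.4 months,
    # pulses are emitted 0.1 month apart, yet they arrive at Earth 1 month apart --
    # exactly what an observer assuming constant c would expect.
    print(slow_motion_illustration())   # -> (2.4, 0.1, 1.0)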


    JOE MONTE
    Helen,
    Now, the speed of light was first measured at around 300,000 km/sec in 1879 by A. A. Michelson. Since then many experiments have taken place corroborating this figure. Now, this Setterfield character claims that the speed of light was infinite from the creation to the fall and, since the fall, has slowed down over time. If that were true then there would be a difference between current speed detection and experiments conducted even forty years ago - a curve similar to the curve of a tangent in quadrant I with the y-axis (asymptote) being the moment of the fall. As it is, the equation for all of the gathered assessments looks more like y = 1, a flat line.

    Setterfield also shamelessly invokes the name of Einstein. Well, try as I might, I don't fully understand his (Einstein's) theory of relativity but here's the light version: Light travels approx. 300,000 km/sec in a vacuum regardless of its frame of reference. It is the absolute limit. Nothing goes faster. If I were to pilot a spacecraft and travel at 100,000 km/sec (WOW) and turned on a headlight, intuition says the light emitted would travel at 400,000 km/sec; Einstein disagrees - it still travels at 300,000 km/sec.

    Now, the nail in the proverbial coffin. The San Diego-based Institute for Creation Research rejected Setterfield's "c" decay in 1988 after an embarrassing acceptance of it in 1981 (Acts and Facts, June 1988, G. Aardsma). Did you realize Setterfield's ad hoc hypothesis is twenty years old and has been dead for thirteen years?


    HELEN
    To correct some of your misconceptions: the Setterfield model does not have the speed of light as infinite from creation to the fall. That is nowhere in his work. His current estimate is about 10^11 times the present value, with a very rapid drop at first. The initial rapid drop is agreed upon by most, if not all, of those studying the issue at the current time, I believe. Albrecht and Magueijo (Physical Review D, 1999) and John Barrow (same issue), however, postulate the initial speed of light to be about 10^60 times its current speed. Setterfield is much more in line with Troitskii's estimation, which is much more conservative.

    Secondly, the speed of light was measured earlier than 1879. Galileo first tried to measure it using lanterns on hills. At the time it was thought the speed of light was infinite. That view held until Roemer in 1675 who, measuring the speed of light using eclipse timings of Jupiter's moon Io, found that the speed of light was indeed fast, but finite. Hundreds of measurements have been made since then, using many different methods.

    In the meantime, Aardsma's article was flawed in its critique, and known to be flawed before it was published. It was published anyway. I consider that a major embarrassment for ICR. At a recent private presentation by Setterfield here in the United States, however, one of the representatives of ICR who is a physicist attended and was quite impressed. We will see where it goes from there.

    Aardsma's criticism of Norman and Setterfield's use of data was refuted quite ably by Alan Montgomery, a professional statistician, and Lambert Dolphin, at the time a senior research physicist at Stanford Research Institute International; their refutation was published in Galilean Electrodynamics, a peer-reviewed journal. There has never been a rebuttal of that article.

    Setterfield's work not only stands, but he is starting to accept speaking invitations from around the world now that the last paper's research is finished and the model is pretty much together as a whole.

    I do not mean to be rude, but you are quite right when you say you do not understand this material. I think if you go to Setterfield's site www.setterfield.org and read what you are capable of understanding (there are two theoretical summaries there, one I wrote with his help last February when I was in Australia working on the paper with him), then maybe you can come back with a bit nicer attitude about the work?


    THINKPLEASE
    From what I understand, some cosmological theories suggest a possible large change in c after the last cosmological phase change (from plasma to particles). Other than that, I am unaware of any possible change in c past that epoch. I would welcome a refereed reference.


    HELEN
    Here are some links which might help:

    http://arXiv.org/PS_cache/astro-ph/pdf/0007/0007108.pdf
    -- Charge Conservation and time varying speed of light
    http://www.ldolphin.org/dethrone.html
    Magueijo interview
    http://www.newscientist.com/ns/19990724/isnothings.html
    John Barrow New Scientist Article, “Is Nothing Sacred?”
    http://www.varsity.utoronto.ca/archives/120/oct07/scitech/faster.html -- John Moffat’s take
    http://xxx.lanl.gov/abs/astro-ph/9811018 -- A&M paper, Physical Review D
    http://xxx.lanl.gov/abs/astro-ph/9811022 -- Barrow paper from Physical Review D
    http://xxx.lanl.gov/abs/astro-ph/9907340
    Astrophysical Probes of the Constancy of the Velocity of Light
     