
With many-worlds, all quantum mechanics is local

But that doesn't make for evidence that parallel universes exist.

In the many-worlds interpretation of quantum mechanics, Schrödinger's cat is both alive and dead—in different universes.

Quantum nonlocality, perhaps one of the most mysterious features of quantum mechanics, may not be a real phenomenon. Or at least that’s what a new paper in the journal PNAS asserts. Its author claims that nonlocality is nothing more than an artifact of the Copenhagen interpretation, the most widely accepted interpretation of quantum mechanics.

Nonlocality is a feature of quantum mechanics where particles are able to influence each other instantaneously regardless of the distance between them, an impossibility in classical physics. Counterintuitive as it may be, nonlocality is currently an accepted feature of the quantum world, apparently verified by many experiments. It’s achieved such wide acceptance that even if our understandings of quantum physics turn out to be completely wrong, physicists think some form of nonlocality would be a feature of whatever replaced it.

The term “nonlocality” comes from the fact that this “spooky action at a distance,” as Einstein famously called it, seems to put an end to our intuitive ideas about location. Nothing can travel faster than the speed of light, so if two quantum particles can influence each other faster than light could travel between the two, then on some level, they act as a single system—there must be no real distance between them.

The concept of location is a bit strange in quantum mechanics anyway. Each particle is described by a mathematical quantity known as the "wave function." The wave function describes a probability distribution for the particle’s location, but not a definite location. These probable locations are not just scientists’ guesses at the particle’s whereabouts; they’re actual, physical presences. That is to say, the particles exist in a swarm of locations at the same time, with some locations more probable than others.

A measurement collapses the wave function so that the particle is no longer spread out over a variety of locations. It begins to act just like objects we’re familiar with—existing in one specific location.
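To make the probabilistic picture concrete, here is a minimal numerical sketch (an illustration of the standard Born rule with made-up parameters, not anything from Tipler's paper): a toy one-dimensional wave function is squared to give a probability distribution over positions, and a "measurement" amounts to drawing one definite position from that distribution.

```python
import numpy as np

# Toy 1-D wave function: a Gaussian wave packet on a grid.
# (Illustrative only; units and parameters are arbitrary.)
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]
psi = np.exp(-(x - 1.5)**2 / 4.0) * np.exp(1j * 2.0 * x)  # complex amplitude

# Born rule: |psi|^2 gives the probability density for position.
prob = np.abs(psi)**2
prob /= prob.sum() * dx          # normalize so the density integrates to 1

# Before "measurement": the particle is described by the whole distribution.
mean_x = np.sum(x * prob) * dx
spread = np.sqrt(np.sum((x - mean_x)**2 * prob) * dx)
print(f"mean position {mean_x:.2f}, spread {spread:.2f}")

# A "measurement" yields one definite position, drawn with these probabilities.
rng = np.random.default_rng(0)
measured_x = rng.choice(x, p=prob * dx)
print(f"measured position: {measured_x:.2f}")
```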

The experiments that would measure nonlocality, however, usually involve two particles that are entangled, which means that both are described by a shared wave function. The wave function doesn’t just deal with the particle’s location, but with other aspects of its state as well, such as the direction of the particle’s spin. So if scientists can measure the spin of one of the two entangled particles, the shared wave function collapses and the spins of both particles become certain. This happens regardless of the distance between the particles.
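As a concrete (and deliberately oversimplified) sketch of what a shared wave function implies for measurements, consider the standard two-spin singlet state; the joint probabilities below are just the Born rule applied to that state, not anything specific to the new paper.

```python
import numpy as np

# Two spin-1/2 particles in the singlet state (|up,down> - |down,up>) / sqrt(2).
# Basis ordering for the joint state: |uu>, |ud>, |du>, |dd>.
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# Joint probabilities for measuring both spins along z (Born rule).
probs = np.abs(singlet)**2        # -> [0, 0.5, 0.5, 0]

rng = np.random.default_rng(1)
outcomes = ["up/up", "up/down", "down/up", "down/down"]
for _ in range(5):
    joint = rng.choice(4, p=probs)
    print(outcomes[joint])         # only "up/down" or "down/up" ever occurs

# Each particle on its own looks random (50/50 up or down), but the pair is
# perfectly anti-correlated -- that correlation is what entanglement supplies.
```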

The new paper calls all this into question.

The paper’s sole author, Frank Tipler, argues that the reason previous studies apparently confirmed quantum nonlocality is that they were relying on an oversimplified understanding of quantum physics in which the quantum world and the macroscopic world we’re familiar with are treated as distinct from one another. Even large structures obey the laws of quantum physics, Tipler points out, so the scientists making the measurements must be considered part of the system being studied.

It is intuitively easy to separate the quantum world from our everyday world, as they appear to behave so differently. However, the equations of quantum mechanics can be applied to large objects like human beings, and they essentially predict that you’ll behave just as classical physics—and as observation—says you will. (Physics students who have tried calculating their own wave functions can attest to this). The laws of quantum physics do govern the entire Universe, even if distinctly quantum effects are hard to notice at a macroscopic level.

When this is taken into account, according to Tipler, the results of familiar nonlocality experiments are altered. Typically, such experiments are thought to involve only two measurements: one on each of two entangled particles. But Tipler argues that in such experiments, there’s really a third measurement taking place when the scientists compare the results of the two.

This third measurement is crucial, Tipler argues, as without it, the first two measurements are essentially meaningless. Without comparing the first two, there’s no way to know that one particle’s behavior is actually linked to the other’s. And crucially, in order for the first two measurements to be compared, information must be exchanged between the particles, via the scientists, at a speed less than that of light. In other words, when the third measurement is taken into account, the two particles are not communicating faster than light. There is no "spooky action at a distance."
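A toy simulation (my own illustration of the standard no-signaling point, not Tipler's formalism) makes the role of that third step visible: each experimenter's own record is statistically indistinguishable from coin flips, and the perfect anticorrelation only becomes apparent once the two records are brought together over an ordinary, slower-than-light channel.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Simulate z-spin measurements on n singlet pairs: the joint outcome is
# always anti-correlated, but which side gets "up" is random each time.
alice = rng.integers(0, 2, size=n)   # 0 = down, 1 = up
bob = 1 - alice                      # the partner particle gives the opposite

# Looking at either record alone reveals nothing unusual:
print("Alice's fraction of 'up':", alice.mean())   # ~0.5
print("Bob's fraction of 'up':  ", bob.mean())     # ~0.5

# Only the third step -- comparing the two records -- exposes the link,
# and that comparison travels by ordinary (slower-than-light) channels.
print("Fraction of anti-correlated pairs:", np.mean(alice != bob))  # 1.0
```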

Tipler has harsh criticism for the reasoning that led to nonlocality. “The standard argument that quantum phenomena are nonlocal goes like this,” he says in the paper. “(i) Let us add an unmotivated, inconsistent, unobservable, nonlocal process (collapse) to local quantum mechanics; (ii) note that the resulting theory is nonlocal; and (iii) conclude that quantum mechanics is [nonlocal].”

He’s essentially saying that scientists are arbitrarily adding nonlocality, which they can’t observe, and then claiming they have discovered nonlocality. Quite an accusation, especially for the science world. (The "collapse" he mentions is the collapse of the particle’s wave function, which he asserts is not a real phenomenon.) Instead, he claims that the experiments thought to confirm nonlocality are in fact confirming an alternative to the Copenhagen interpretation called the many-worlds interpretation (MWI). As its name implies, the MWI predicts the existence of other universes.

The Copenhagen interpretation has been summarized as “shut up and calculate.” Even though the consequences of a wave function-based world don’t make much intuitive sense, it works. The MWI takes a different route: the wave function never collapses. Instead, every outcome a measurement could produce actually occurs, each in its own branch of the Universe, and the single definite result we see is just the branch we happen to inhabit.

Scientists who thought they were measuring nonlocality, Tipler claims, were in fact observing the effects of alternate universe versions of themselves, also measuring the same particles.

Part of the significance of Tipler’s claim is that he’s able to mathematically derive the same experimental results from the MWI without invoking nonlocality. But this is not, by itself, evidence that the MWI is correct; either interpretation remains consistent with the data. Until the two can be distinguished experimentally, it all comes down to whether you personally like or dislike nonlocality.

Tipler himself is a controversial figure in the scientific community. He’s been called a crackpot by astrophysicist Sean Carroll for his science fiction-like claim that life will evolve to become omnipotent in the moment before the end of the Universe. He’s also denied climate change and explored scientific mechanisms for the resurrection of the dead, which has led many scientists to accuse him of pseudoscience.

He does have his defenders, such as physicist David Deutsch, who builds on some of Tipler’s work, though Deutsch rejects Tipler’s metaphysical conclusions. And even Carroll acknowledges that Tipler did good scientific work in his early career. That being the case, is Tipler’s new paper to be taken seriously?

In science, it’s not the reputation of the scientist that determines the validity of his or her work; it’s whether the work can be borne out by evidence. And right now, that’s simply not possible here.

PNAS, 2014. DOI: 10.1073/pnas.1324238111

91 Reader Comments

  1. Hi, Xaq Rzetelny. You state in your present Ars Technica article that it's "simply not possible here" "[that physicist and mathematician Prof. Frank J. Tipler's current paper] can be borne out by evidence."

    If one accepts the validity of General Relativity (which has been confirmed by every experiment to date), then nonlocality does not exist, since the speed of light is the fastest anything can travel, and therefore the multiverse of the Many-Worlds Interpretation logically must exist (i.e., due to the reason given in Prof. Tipler's present paper).

    But beyond any experimental tests, what Prof. Tipler's paper "Quantum nonlocality does not exist" (Proceedings of the National Academy of Sciences of the United States of America, published online before print on July 11, 2014, doi:10.1073/pnas.1324238111, PubMed ID: 25015084) demonstrates is that a large portion of the physics community has falsely and unthinkingly assumed that experimental confirmations of quantum entanglement meant that nonlocality is real. Tipler's said paper shows that that assumption doesn't follow. So beyond whether a many-worlds or single-world interpretation of Quantum Mechanics can be experimentally confirmed, Tipler's aforementioned paper is invaluable in clearing away the miasma of befuddled thinking that has long lain over the physics community regarding this subject.

    Moreover, if Quantum Mechanics is true, then the multiverse's existence follows as a mathematically-unavoidable consequence. For the details, see Frank J. Tipler, The Physics of Immortality: Modern Cosmology, God and the Resurrection of the Dead (New York, NY: Doubleday, 1994), pp. 483-488.

    And the existence of the multiverse can be experimentally confirmed: see Frank J. Tipler, "Testing Many-Worlds Quantum Theory By Measuring Pattern Convergence Rates", arXiv:0809.4422, Sept. 25, 2008; and Frank Tipler, "Experimentally Testing the Multiverse/Many-Worlds Theory", American Astronomical Society 224th Meeting, June 1-5, 2014, #304.01 (June 4), bibcode: 2014AAS...22430401T.

    For my reply to Dr. Sean M. Carroll's erroneous criticisms of Prof. Tipler in Carroll's blog post "The Varieties of Crackpot Experience" (Discover Blogs; and Preposterous Universe, Jan. 5, 2009), see WebCite: 5yDcRx6IZ and Archive.Today: 56z3C.

    Nor has Prof. Tipler ever denied Climate Change. The climate is in constant flux, and Tipler acknowledges that fact. Rather, Tipler quite correctly rejects the theory of Anthropogenic Global Warming (AGW), which has been repeatedly experimentally falsified.

    It's very unfortunate that AGW isn't true, as life loves a warm, carbon dioxide-rich Earth. It would be quite a life-giving boon to humanity and the other creatures if AGW had been true.

    Regarding Prof. Tipler's Omega Point cosmology, which is a proof of God's existence, it is now a mathematical theorem per the known laws of physics (viz., the Second Law of Thermodynamics, General Relativity, and Quantum Mechanics), which have been confirmed by every experiment conducted to date. Hence, the only way to avoid the Omega Point cosmology is to reject empirical science. As Prof. Stephen Hawking wrote, "one cannot really argue with a mathematical theorem." (From p. 67 of Stephen Hawking, The Illustrated A Brief History of Time [New York, NY: Bantam Books, 1996; 1st ed., 1988].) The Omega Point cosmology has been published and extensively peer-reviewed in leading physics journals.

    Additionally, we now have the Feynman-DeWitt-Weinberg quantum gravity/Standard Model Theory of Everything (TOE) required by the known laws of physics and that correctly describes and unifies all the forces in physics, which inherently produces the Omega Point cosmology. So here we have an additional high degree of assurance that the Omega Point cosmology is correct. For much more on the Omega Point TOE, see my following article: James Redford, "The Physics of God and the Quantum Gravity Theory of Everything", Social Science Research Network (SSRN), Sept. 10, 2012 (orig. pub. Dec. 19, 2011), 186 pp., doi:10.2139/ssrn.1974708.
  2. wyrmhole wrote:
    steelgrass wrote:
    Have I got that right? If so, "Many Worlds" is an appalling name since it does say there is more than one "world".


    Yeah, it's a really bad name, which misled me when I first heard it, as I'm sure it has many people. It might be better to call it "Everett" after its originator, like Sean Carroll does most of the time.


    This is an issue with the translation of knowledge between experts and lay people. The expert gives an analogy to demonstrate one concept, and the lay person assumes the analogy is a model that they can expand on and interpret.

    As an example, I used an analogy to try to explain how a CPU and RAM affect the speed of a computer. The example is that the amount of RAM is like how much food you can put in your mouth at once and the CPU is how fast you can chew. It is a simple analogy to help people understand one small piece/concept of a computer. Going to school and getting a degree in oral health won't help you understand a CPU - the analogy has limits.

    Yet, many people try to extend an expert's simple analogy into a reflection of the entire model - like asking how having braces will impact your computer... These "extensions" of analogies get picked up and reported and disseminated. A big part of science is knowing that models have limits and analogies are not models.

    People like to feel they understand things and that something really complicated can be made really simple - makes the world less scary.

    TL;DR - an analogy is not a model.
  3. wyrmhole wrote:
    contrasia wrote:
    contrasia wrote:
    Quote:
    Scientists who thought they were measuring nonlocality, Tipler claims, were in fact observing the effects of alternate universe versions of themselves, also measuring the same particles.


    I don't mind this interpretation. On the one hand it takes away some awesome ideas for the future (The idea of communicating information across vast distances, say the universe to other planets, instantly), but on the other it means we instead could communicate to alternate universes of ourselves.

    I mean, if one updates to reflect another that we manipulate, and they're from alternate universes, couldn't we use it like morse code? Or even eventually communicate stored information across them to our other selves? If we had that when Einstein was around, could you imagine him debating and discussing his own ideas with different versions of himself that are coming up with different ideas? This kind of thing would lead to the ultimate think tank, allowing possibly faster growth in scientific theory or in solving mathematical problems.

    .... though I do admit, I prefer the idea of being able to communicate information instantly across the universe. Perhaps one day we could utilise it to teleport, or even transmit video feeds of distant planets? Or to discover and tell alien races how to build a device that will allow us to teleport, or communicate more easily. Who knows.


    Just a quick note: entanglement never has and never will permit faster-than-light information exchange. That's not how it works. In the same way, neither will this interpretation permit communication between universes. Sorry to blow your sci-fi bubble! :)

    (quick explanation: it is never possible to choose the initial state of two entangled particles. Thus, which particle is 0 and which particle is 1 will always be random, rendering communication impossible)

    Edit: missed a word or two


    Ah awesome, thanks for clearing that up. Though I'm a little confused now, as mentioned:

    Quote:
    Nothing can travel faster than the speed of light, so if two quantum particles can influence each other faster than light could travel between the two, then on some level, they act as a single system—there must be no real distance between them.


    My understanding of that was that rather than travelling FTL, it's actually in two or more different locations at the same time. It doesn't necessarily just exist in one place, but can exist multiple times within the same moment of time, in the same reality. I've always thought I understood that as to be fact, as whenever they do the experiment where they fire them off onto a sheet, you'll sometimes see it multiple times rather than a single one, and that was always their explanation of it. I therefore assumed if it is the exact same quantum bit existing in multiple locations, then if you were to change one of them, the others would also change because they are in fact the exact same bit.


    You will never see (e.g.) an electron at more than one place on a detector screen. If you fire one electron, you always see one electron at the detector. The difference only shows up when you do many such measurements and look at the statistical distribution.

    And if you did this in a way that the electron had multiple paths it could take (the classic double slit experiment) then what is seen is that the resulting distribution looks like an interference pattern between the two possible paths.

    If you instead add an additional detector to determine which of the two slits the electron passed through before hitting the screen, then the interference pattern disappears and the distribution looks like the "classical" expectation.

    But either way it's always one electron in one place, with a probability determined by the rules of QM. It's what those rules mean about what's actually happening to the electron that is such a head-scratcher.

    Quote:
    edit: If it is the exact same bit, rather than manipulate two entangled bits, can't you just affect the single bit in some manner so it changes? If not now, maybe eventually? @_@


    Only if we discover that QM is wrong in such a way that it is allowed, and it would be a rather big change. It's not a matter of technology. It's what the underlying physics is doing. Entanglement is not about some "connection" between two objects such that whatever happens to one is somehow reflected in the other even after separated. Entanglement is the consequence of some interaction which preserves some quantity, such that their two values must be correlated. If you then interact with one of those two particles to change that property, then there simply won't be any correlation between the two anymore.

    For example, consider two billiard balls, a cue ball and the 8 ball. The cue ball is shot towards the 8 ball and they impact. Whatever direction and velocity they end up moving at, their total momentum must be conserved. So if you know the cue ball's momentum before impact, and were to measure the cue ball's momentum after the impact, you could easily deduce what the momentum of the 8 ball must be. And someone who measured the 8 ball's momentum after impact would know what the cue ball's momentum must be. And if you compared your measurements, you'd see that yes indeed they line up.

    Now imagine that after the collision, you picked up the cue ball and tossed it across the room. Would the other person measure a momentum for their 8 ball that reflects your action? No. At that point all it would mean is that the 8 ball and cue ball's momentums were unrelated.

    That's all entanglement is, and that's why it can't be used to transfer information.

    The only way in which it is different, the reason it gets spoken of as such a mystery, is because it obeys quantum statistics and so presents the same interpretational conundrum as the double slit experiment.

    If you use the Copenhagen interpretation, then the particle doesn't actually have any specific momentum (or specific path through the two slits) until measured, and that measurement causes a "collapse" to a single state which is "chosen" at that instant. Somehow, when the other particle is also measured and its wavefunction collapsed, it will "choose" whatever value of momentum is needed to balance, even if the two measurements occur too closely together for any light-speed signal to traverse between them. This is what it means to be "non-local": the apparent coordination disregards the speed of light, which is the speed of causality in Relativity.

    In Many Worlds, there is no collapse, the wavefunction continues in a way where all momentum-preserving combinations of state for the two particles are still present. There is no special significance to your subsequent measurement, so it doesn't matter when you and your partner measure your particles. They will correlate because only correlating values are left in the wavefunction.


    I didn't think entanglement was actually part of the issue, as that's used for a different thing. I was thinking more along the lines that if you had a mirror in front of you, and a mirror behind you, and you threw a single ball in the air, you'd see the same ball in multiple different positions react the exact same way. The idea that the ball could be multiplied in such a way across vast distances, and still be the exact same ball, so whatever you did to one, the others would do the exact same thing because it's still the same ball just in different locations.

    You're still talking about experiments that utilise more than one ball (entanglement), but that's not what I'm talking about or interpreting.

    I'm not sure what experiment it was I saw with the sheet and the multiple dots, as I only caught the program mid-way (an unfortunately vague understanding). I have no idea if it was an electron, or a photon, or what physical implementation of an experiment on quantum bits it was. =_="

    Thanks for the awesome explanations btw, and for your time to explain them ^-^;

    edit: adjusted "away" to "a way" and "qubit" to "quantum bits"
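A minimal sketch of the statistics described in the quoted explanation above (an idealized toy model with made-up numbers, not any particular experiment): every simulated electron lands at exactly one screen position, fringes appear only in the accumulated distribution, and adding which-path information removes them.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-1.0, 1.0, 801)            # screen positions (arbitrary units)
envelope = np.exp(-x**2 / 0.18)             # toy single-slit diffraction envelope

# No which-path information: amplitudes from the two slits add, giving fringes.
p_interference = envelope * np.cos(12 * np.pi * x)**2
# Which-path detector present: probabilities add instead, so the fringes vanish.
p_which_path = envelope

def sample_hits(density, n):
    """Each electron yields one hit; the pattern only lives in the statistics."""
    p = density / density.sum()
    return rng.choice(x, size=n, p=p)

hits_fringes = sample_hits(p_interference, 50_000)
hits_no_fringes = sample_hits(p_which_path, 50_000)

def count_near(hits, center, half_width=0.01):
    return int(np.sum(np.abs(hits - center) < half_width))

# In this toy model a bright fringe sits at x = 0 and a dark fringe at x = 1/24.
print("single hit example:", hits_fringes[0])
print("interference, bright vs dark band:", count_near(hits_fringes, 0.0),
      count_near(hits_fringes, 1/24))
print("which-path,   bright vs dark band:", count_near(hits_no_fringes, 0.0),
      count_near(hits_no_fringes, 1/24))
```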
  4. wyrmhole wrote:
    (...)


    You made my day!
    I always thought that entanglement was "nothing more" than knowing that two photons generated by a particular event will have opposite spins. Measure one, and instantly know the other.
    But, heck, try finding a resource online that is as honest (or as accurate) as what you explained here.
    It's time to find something on the history of all this quantum mechanics thing and read about different interpretations.
  5. Hi, Xaq Rzetelny. You state in your present Ars Technica article that it's "simply not possible here" "[that physicist and mathematician Prof. Frank J. Tipler's current paper] can be borne out by evidence."

    If one accepts the validity of General Relativity (which has been confirmed by every experiment to date), then nonlocality does not exist, since the speed of light is the fastest anything can travel, and therefore the multiverse of the Many-Worlds Interpretation logically must exist (i.e., due to the reason given in Prof. Tipler's present paper).
    (...)


    Account freshly created just for one post... This post reeks of fanaticism and religious repetition of Tipler's work. Couldn't you at least be subtle in your evangelism?

    I'm not saying that Tipler is right or wrong. I'm saying that your post looks like it was made by a trainee in a marketing department... If I just take the paragraph quoted above, you're trying to apply general relativity in a context known to be outside its scope. Using the word "logically" somewhere in your argument doesn't make it proof of anything.
  6. t4ng3nt wrote:
    Imagine a stick that goes from your house on earth and extends to your friend's house in the Andromeda galaxy. You stand on one end of the stick and your friend stands on the other. You push the stick forward and he feels the pressure on his finger. He pushes back and you feel the push instantly!


    I thought about this idea when I was in high school. Unfortunately, it doesn't work. The problem is that there's no "continuously rigid" stick. Matter is composed of individual atoms, and when you push one end of the stick, you actually push some of its atoms, which then "push" neighboring atoms (via the electromagnetic force), and so on. Therefore, the transmission of your push still travels at slower-than-light speed.
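For a rough sense of scale (my own back-of-the-envelope figures: roughly 2.5 million light-years to Andromeda and about 6 km/s for the speed of sound in steel), the compression wave from that push would take on the order of a hundred billion years to arrive.

```python
# Back-of-the-envelope: how long a push takes to travel a steel rod to Andromeda.
# Assumed figures: ~2.5 million light-years of distance, ~6 km/s sound speed in steel.
LIGHT_YEAR_M = 9.4607e15          # metres in one light-year
distance_m = 2.5e6 * LIGHT_YEAR_M
sound_speed_m_per_s = 6.0e3

seconds = distance_m / sound_speed_m_per_s
years = seconds / 3.156e7         # seconds in a year
print(f"~{years:.2e} years for the push to reach the far end")   # ~1.2e11 years
```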
  7. I am a big fan of MWI because it tickles my personal fancy.
    That said I have no objective problem with either interpretation.

    Here's an interesting blog post by the already mentioned Sean Carroll (he calls it embarrassing, but it just shows that we need some "outside" perspective first).
    Basically like the proverbial wave function, we have not found a way to collapse this graph to a single position from inside the system right now. It's all about opinion at the moment.

    Besides, even with nonlocality it is straightforward to see why FTL transmission of information is a non-issue: you have no prior knowledge of the state of the system (i.e., whether spin is up or down), and no way of knowing which until after the fact.

    @t4ng3nt, hotball Wow... that argument never seems to fail to come up. It is in fact slower than light. More precisely, it is the speed of sound in that material: in steel, about 6 km/s. Horribly slower than light.
  8. So does the theory he proposes predict the apparent causality violations of quantum mechanics, where observation seems to affect the past?


    Seems to me that the mere act of observation affects the observation. How do we not get stuck in an infinitely recursive observation loop?

    On a different note, I've always wondered how my observation of a particle affects that particle, given that my observation happens:

    A) Some time after the particle did whatever it was doing
    B) By some other means that either struck or was emitted from the particle

    Wouldn't my observation affect the medium that I'm observing and not the particle/object itself? Say light reflected off of an atom or radiation emitted from a black hole. I'm not directly observing either the atom or the black hole but the light and the radiation.
  9. kupfernigk wrote:
    jdale wrote:
    kupfernigk wrote:
    wirespot wrote:
    kupfernigk wrote:
    The Copenhagen model has the problem of nonlocality, String theory has the problem that extra dimensions are being postulated with no experimental basis, and the many-worlds theory has the problem of entire extra realities being postulated to deal with wave/particle duality.


    Those aren't problems. That's how science works. It's perfectly ok to come up with as many simultaneous explanations as you can, even if they are incompatible. Eventually we will manage to consistently reproduce experiments that will invalidate some of them (or aspects of them) and things will sort themselves out.

    Speaking of which, please note that the essential experiments are those that invalidate theories, not those that confirm them. It's usually much easier to come up with experiments that will confirm pretty much any theory (no matter how wild) than it is to come up with stuff that will disprove one. Geocentrism vs heliocentrism is actually a very good example of this.


    Ooh, snark. I do like it when people have to sneer at me to make themselves feel big.

    I did Popper in my introductory course on theory of science, before we got into the real meat. In the past it was true that experiments could invalidate theories (e.g. Galileo observed features on planets which Aristotle supposed to be perfect spheres) but that has long ceased to be the case. "Nonlocality" is a fiddle factor to explain why experiments 1...n conclude that information cannot be transferred at greater than the speed of light, and experiment n+1 suggests that it can. Various ways around the problem of entanglement are proposed but it is basically a problem, because the proposed solutions are not very convincing. Something is wrong, but nothing is yet convincingly falsified.

    Getting out of baby steps and into the theory of real science, we can start with Kuhn's observation that Popper isn't describing real science. In that, theories accumulate data that suggest that there are problems with them, but they are not abandoned for a variety of reasons (one of which is the careers of scientists - this isn't inherently a bad thing, just human nature) so long as they give useful results, right up until a new paradigm is proposed and, in a shift that doesn't necessarily take place in a short time, the old theory is displaced by a new one. Hence my comment "are going to be falsified by a new explanation". Until the new explanation is accepted, the adherents of the old theory will not allow that all the problems they lived with for years falsified it.

    Paradigm shifts can be slowed by external social pressures. AGW looks like a typical example, where the paradigm shift between the basically stable, God-controlled Earth and one that is influenced by its biology (and not only man - grasses, trees and bacteria have affected climate too) has been generally accepted by scientists, but in a few influential countries politicians are still resisting acknowledging it.

    Newton's theory of universal gravitation was a combining theory; it showed how a number of apparently distinct phenomena (apple falls from tree, Earth goes round Sun) fitted into a single coherent scheme of explanation. The electroweak theory is a similar thing for particle physics. Newton's theory did not falsify geocentricity, it was just that once you had accepted Newton's argument, geocentricity did not make physical sense (though you could still tweak the equations of motion of the solar system for it to make mathematical sense). To falsify geocentricity convincingly you would need to find a frame of reference outside the Earth which all observers agreed was stationary, and then observe the motions of the Earth and the Sun. But Newton overcame this by providing a new and different explanation of what was actually going on.


    I have to take exception to this statement: "In the past it was true that experiments could invalidate theories (e.g. Galileo observed features on planets which Aristotle supposed to be perfect spheres) but that has long ceased to be the case."

    It's rare that a single experiment will convincingly invalidate a theory but it's certainly possible; more likely for theories that are simple of course ("planets are perfect spheres" is certainly an example of that but not all theories are complex). An invalidated theory may not immediately be replaced, especially if no one is quite sure what to replace it with, but that doesn't change the fact that experimental proof of the theory's failure will lead to its eventual downfall. (And usually you need replication of the experimental results as well, so in that sense a single experiment is insufficient, as it should be simply for probabilistic reasons.) In particularly complex topics (like quantum mechanics), the problem of coming up with a replacement can be significant. You also start to have the problem of multiple theories that are difficult to distinguish experimentally. If you wanted to say "no one knows how to experimentally distinguish between theories X and Y" that might be true for X and Y, but it doesn't discount the possibility of that changing, and certainly doesn't merit the sweeping claim that no experiments can invalidate any theories....

    As for AGW, I don't think the failure of non-scientists to accept a paradigm shift is scientifically relevant. It's relevant for politics, public discourse, policy, perhaps even logistical matters like funding, but not for science itself.


    You seem basically to be filling in the details of why science tends to evolve via paradigm shifts rather than by gradualism.
    This is only a blog, not serious work, and I am knocking these posts off in about five minutes each, so I agree there's a total lack of rigor. Perhaps I should have more clearly distinguished hypotheses (which are being falsified all the time) and theories, which nowadays are the result of many, many hypotheses. Can you think of a single actual theory in physics or chemistry which has been disproven by a single experiment in the last hundred years (as distinct from requiring modification, which happens all the time)?

    Your separation of science from its funding and the climate in which it operates I consider to be invalid. Science is a social enterprise and scientists are part of society. The political aspects of AGW affect what research is carried out and, more indirectly, the decision of scientists to undertake work in the area or to publish. In the same way, current topics of research in fundamental physics are affected by the climate of opinion, which is formed by human beings. Physics got a big boost after WW2 because politicians were impressed by the military possibilities after the Manhattan Project. There was a paradigm shift in the perceived importance of physics, without which there would have been no big accelerators.


    I am in fact filling in the details of why science tends to evolve via paradigm shifts rather than by gradualism. Why? Because Kuhn's conclusions about paradigm shifts are often presented as a weakness or failure of science, but actually they are a strength. Evidence accumulates in support of a theory, and the theory naturally becomes stronger and stronger. Naturally, it requires more evidence and more convincing evidence to overthrow it. Given what we know about experimental error, that's a good thing. You don't want to throw out that understanding based on one experiment whose result actually arose through chance or error.

    As for the separation of science and society, I understand your point but disagree. Society impacts scientists, which impacts how they do science, but that doesn't make society science. This is a semantic or, at best, philosophical point, though, and it doesn't bother me to disagree on it.

    As for disproving theories, I see you now want to limit it to chemistry and physics rather than speak of science as a whole... In many fields we are not so clear about what has achieved status as a theory, so, while in principle the hypothesis/theory distinction holds up, in practice it gets as fuzzy as the definition of "species"... Overthrowing established beliefs happens all the time. E.g. in college I literally had the experience of being told a fact on Tuesday ("neurons do not divide in adult animals") and then told on Thursday it had been proven false. Which established beliefs and "facts" have achieved the level of theory? It's not worth debating.
  10. Wikipedia: "Tipler states that a society far in the future would be able to resurrect the dead by emulating all alternative universes of our universe from its start at the Big Bang.' Tipler likes to advance crazy ideas like this, and the one being discussed here.

    As for the "many worlds" theory of an infinite number of parallel universes, it is the nuttiest notion ever advanced by human beings.

    See my blog post "The Parallel Universes Delusion."

    http://www.futureandcosmos.blogspot.com ... usion.html
  11. Pardon my ignorance but has anyone perhaps just suggested that there are "other" Universes ... rather than "parallel" Universes?
  12. Ryuji wrote:
    (...)


    Hi, RyujiWise.

    The Omega Point is omniscient, having an infinite amount of information and knowing all that is logically possible to be known; it is omnipotent, having an infinite amount of energy and power; and it is omnipresent, consisting of all that exists. These three properties are the traditional quidditative definitions (i.e., haecceities) of God held by almost all of the world's leading religions. Hence, by definition, the Omega Point is God.

    The Omega Point final singularity is a different aspect of the Big Bang initial singularity, i.e., the first cause, a definition of God held by all the Abrahamic religions.

    As well, as Stephen Hawking proved, the singularity is not in spacetime, but rather is the boundary of space and time (see S. W. Hawking and G. F. R. Ellis, The Large Scale Structure of Space-Time [Cambridge: Cambridge University Press, 1973], pp. 217-221).

    The Schmidt b-boundary has been shown to yield a topology in which the cosmological singularity is not Hausdorff separated from the points in spacetime, meaning that it is not possible to put an open set of points between the cosmological singularity and *any* point in spacetime proper. That is, the cosmological singularity has infinite nearness to every point in spacetime.

    So the Omega Point is transcendent to, yet immanent in, space and time. Because the cosmological singularity exists outside of space and time, it is eternal, as time has no application to it.

    Quite literally, the cosmological singularity is supernatural, in the sense that no form of physics can apply to it, since physical values are at infinity at the singularity, and so it is not possible to perform arithmetical operations on them; and in the sense that the singularity is beyond creation, as it is not a part of spacetime, but rather is the boundary of space and time.

    And given an infinite amount of computational resources, per the Bekenstein Bound, recreating the exact quantum state of our present universe is trivial, requiring at most a mere 10^123 bits (the number which Roger Penrose calculated), or at most a mere 2^10^123 bits for every different quantum configuration of the universe logically possible (i.e., the powerset, of which the multiverse in its entirety at this point in universal history is a subset). So the Omega Point will be able to resurrect us using merely an infinitesimally small amount of total computational resources: indeed, the multiversal resurrection will occur between 10^-10^10 and 10^-10^123 seconds before the Omega Point is reached, as the computational capacity of the universe at that stage will be great enough that doing so will require only a trivial amount of total computational resources.

    Miracles are allowed by the known laws of physics using baryon annihilation, and its inverse, by way of electroweak quantum tunneling (which is allowed in the Standard Model of particle physics, as baryon number minus lepton number, B - L, is conserved) caused via the Principle of Least Action by the physical requirement that the Omega Point final cosmological singularity exists. If the miracles of Jesus Christ were necessary in order for the universe to evolve into the Omega Point, and if the known laws of physics are correct, then the probability of those miracles occurring is certain.

    Additionally, the cosmological singularity consists of a three-aspect structure: the final singularity (i.e., the Omega Point), the all-presents singularity (which exists at the boundary of the multiverse), and the initial singularity (i.e., the beginning of the Big Bang). These three distinct aspects which perform different physical functions in bringing about and sustaining existence are actually one singularity which connects the entirety of the multiverse.

    Christian theology is therefore preferentially selected by the known laws of physics due to the fundamentally triune structure of the cosmological singularity (which, again, has all the haecceities claimed for God in the major religions), which is deselective of all other major religions.

    For much more on the above, and for many more details on how the Omega Point cosmology precisely matches the cosmology described in the New Testament, see my following article:

    James Redford, "The Physics of God and the Quantum Gravity Theory of Everything", Social Science Research Network (SSRN), Sept. 10, 2012 (orig. pub. Dec. 19, 2011), 186 pp., doi:10.2139/ssrn.1974708.