
Many worlds busted?



I'm not sure why we are still arguing this?

Everett knew that he had an issue with the Born rule and tried to modify it. In fact, everyone has tried to do so in order to make MWI work - and I see no point in that - trying to modify a basic rule supported by all the evidence so far.

Even I proposed one solution - which is easily shown to be mathematically inconsistent (it expects all probabilities to be expressible as rational numbers, while QM allows probabilities to be irrational).
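To make that concrete, here is a minimal sketch of the mismatch (the angle and the number of branches are arbitrary illustration values, not taken from any particular MWI counting scheme): counting branches can only ever produce relative frequencies of the form k/N, all rational, while the Born rule is perfectly happy to demand an irrational probability.

```python
import math
from fractions import Fraction

# Born rule for a spin prepared along z and measured along an axis tilted
# by theta: P(up) = cos^2(theta/2). Pick theta = 1 radian for illustration.
theta = 1.0
p_born = math.cos(theta / 2) ** 2        # ~0.770151..., an irrational number

# Counting branches over N identical measurements can only ever yield
# relative frequencies of the form k/N - all of them rational.
N = 1_000_000
k = round(p_born * N)                    # closest achievable branch count
freq = Fraction(k, N)

print(f"Born probability     : {p_born:.12f}")
print(f"Best branch frequency: {freq} = {float(freq):.12f}")
print(f"Residual mismatch    : {abs(float(freq) - p_born):.3e}")
```

No finite, equal-weight branch count reproduces that number exactly - which is the inconsistency.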

On the wiki page there is even a paragraph addressing this - Frequentism.

To my eyes, MWI is clearly flawed and should not be taken seriously. However, that goes against the fact that it is generally accepted as plausible. But then again, we have this:

[qmpoll.jpg - poll of preferred QM interpretations]

where the Copenhagen interpretation carries 42% of the votes.

I'm not sure how anyone can take the Copenhagen interpretation seriously after they hear about decoherence.

 


6 minutes ago, andrew s said:

@vlaiv you might find this discussion interesting mwi-with-probability-of-outcomes Regards Andrew

Oh, I love this quote from that page:

Quote

Carroll has suggested that the Born Rule is the credence a rational observer assigns to one outcome or another occurring--to the observer finding herself in one branch/universe or another--after the measurement is complete, but before she looks at the detector reading. I guess that would be part of the meaning he wants to assign it.

Source https://www.physicsforums.com/threads/compatibility-of-mwi-with-probability-of-outcomes.991232/

Want to escape a very uncomfortable position? Anthropic principle to the rescue!


I was just thinking - the motivation for interpretations of QM is, among other things, "wave function collapse", or rather explaining how, after decoherence has happened and superposition is lost and we are left with an "ensemble of states", we end up in one particular state.

We have an explanation for why the macroscopic world is not in superposition, yet we don't have an answer for why spin up was "selected" in this round.

But do we really want that answer?

Finding a mechanism for that would put us in a completely deterministic world. In order to live in a probabilistic world we simply can't have any other explanation except - it happens at random.

I see the appeal of MWI to scientists.

- It is deterministic - there is a formula for everything and it is 100% correct

- It kind of explains why we think we live in a probabilistic world (if we skip those details about the Born rule)

- It allows for free will - in fact, not only do you have the freedom to make any choice, you will make every choice! :D

I think I need to get a better understanding of the dynamics of decoherence - in that process, what is the evolution of states, how long does it take to transition from superposition to an ensemble of eigenstates, and is there anything there that could point to a "moment" of "choice", if not a "reason" for the "choice".

 

 


21 minutes ago, vlaiv said:

I'm not sure how anyone can take the Copenhagen interpretation seriously after they hear about decoherence.

As I understand it, while decoherence has taken us forward, particularly in explaining the transition to the classical world, it has not solved the "measurement problem".

I freely admit I don't fully grasp the measurement problem.

Regards Andrew 


Just now, andrew s said:

As I understand it, while decoherence has taken us forward, particularly in explaining the transition to the classical world, it has not solved the "measurement problem".

I freely admit I don't fully grasp the measurement problem.

Regards Andrew 

It has solved the measurement problem - it just did not provide us with a mechanism for the "determining".

I believe it to be the following, and I'll use Schrödinger's cat here because that is the reason it is sentenced to death in the first place :D

Usual reasoning goes:

We have an atom that is about to decay, and it is in a superposition of two states - decayed and not decayed. There is a vial of poison connected to a detector and a hammer or whatever, and it is consequently in a superposition of broken and whole states, and eventually we come to the cat - which is in a superposition of dead and alive. But when we open the box, we always find it in the dead or the alive state and never in a superposition.

Decoherence tells us something different is going on here.

The atom is in a superposition of decayed and not decayed states, and that is fine. Now it interacts with the detector and becomes entangled with it, and this means that it is not "the detector is in a superposition of detected and undetected states"; rather, the proper assessment of the situation would be:

The system is in a "superposition" of the following states: the "atom has decayed and detector is triggered" state and the "atom has not decayed and detector is not triggered" state. We have lost the "cross" terms (there are no terms for atom decayed but detector not triggered, or atom not decayed and detector triggered). Another important thing to note is that the detector consists of an enormous number of particles - each having a wave function - so the detector's wave function is extremely complex, with many phase terms.

Once our atom becomes entangled with the detector and all that thermal noise, it goes "out of phase with itself" - or rather, the two resulting states written above are out of phase.

Photons interfere with themselves in the double-slit experiment because they maintain phase. Imagine a photon with one set of frequencies and with another set of frequencies (no longer a single frequency, since we now have a bunch of particles in the system and the state is composed of all their states). Can such a photon interfere with itself to clearly show an interference pattern, or will it just behave in a uniformly random way - like a particle?

This is decoherence - the going out of phase prevents the states "atom has decayed and detector is triggered" and "atom has not decayed and detector is not triggered" from interfering and forming a superposition - you are left with an "ensemble" sort of "superposition", where you have either of the two pure states with their respective probabilities.
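Here is a minimal numerical sketch of that phase-averaging idea (a single qubit with equal amplitudes and uniformly random environment phases is assumed - an illustration, not a model of a real detector): averaging the density matrix over the unknown relative phases wipes out the off-diagonal interference terms and leaves only the classical-looking probabilities on the diagonal.

```python
import numpy as np

rng = np.random.default_rng(0)

# One "pointer" qubit standing in for the decayed / not-decayed branches.
# Each interaction with the messy environment tags the branches with an
# unknown relative phase phi; we average the density matrix over many phis.
n_env = 10_000
rho_avg = np.zeros((2, 2), dtype=complex)
for phi in rng.uniform(0, 2 * np.pi, n_env):
    branch = np.array([1, np.exp(1j * phi)]) / np.sqrt(2)
    rho_avg += np.outer(branch, branch.conj()) / n_env

print(np.round(rho_avg, 3))
# Diagonal entries stay ~0.5 (the two outcome probabilities);
# the off-diagonal entries, which carry the interference, average to ~0.
```

The diagonal is exactly the "ensemble" that is left over - either outcome with its probability - while the terms that would let the two branches interfere are gone.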

This is why we never see the cat in a superposition of dead and alive states, yet we have a certain probability of finding it in each state when we open the box.

The problem that is not solved by this - or, to my knowledge, by any other interpretation - is why a particular state is observed and not the other one: what made it become reality out of the two possibilities, and when did that happen in the course of decoherence?


2 hours ago, vlaiv said:

We have a biased coin that lands heads one million times more likely than tails.

We toss that coin two times.

If there is a two-way world split per coin toss and we toss the coin two times, we will end up with 4 resulting copies of the universe:

hh, ht, th, tt

Actually, it is very likely we won't. If the coin is biased as you say, it is most likely we would end up with 4 worlds: hh, hh, hh, hh. Therefore it is vastly more probable that we will end up in an hh world, but there is equal likelihood as to which specific world we land in.
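A quick sketch of the numbers being argued about here (the million-to-one bias is just taken from the example above): the classical/Born weights pile essentially all of the probability onto hh, while naive branch counting insists on four equally counted copies.

```python
from itertools import product

p_heads = 1_000_000 / 1_000_001      # heads a million times more likely than tails
p = {"h": p_heads, "t": 1 - p_heads}

# Weight of each two-toss outcome vs. the naive "one branch each" count
for outcome in product("ht", repeat=2):
    label = "".join(outcome)
    weight = p[outcome[0]] * p[outcome[1]]
    print(f"{label}: weight = {weight:.12f}   naive branch share = 1/4")
```

Whether those weights or the raw branch count is what "we" should expect to experience is exactly what gets disputed below.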

Michael

 


1 minute ago, Synchronicity said:

Therefore it is vastly more probable that we will end up in an hh world, but there is equal likelihood as to which specific world we land in.

Which "we"? The first copy, second copy, third copy or fourth copy?

We can be any of them with equal probability, unless "we" are somehow special.

Is there anything special about the hh copy of us?


Thanks @vlaiv, I think I have a fair grasp of the above; however, many still argue it has not been solved.

For example "One often hears the claim that decoherence solves the measurement problem of quantum mechanics (see the entry on philosophical issues in quantum theory). Physicists who work on decoherence generally know better, but it is important to see why even in the presence of decoherence phenomena, the measurement problem remains or in fact gets even worse."

They state the problem thus "If instead we assume that quantum mechanics is itself applicable to the description of measurements, then the question becomes one of how one should model a measurement within quantum theory, specifically as some appropriate interaction between a ‘system’ and an ‘apparatus’, and of whether by so doing one can derive from the unitary evolution for the total system of system and apparatus the three phenomenological aspects of quantum measurements: that measurements have results, that these results obtain with some characteristic probabilities, and that depending on the result of a measurement the state of the system is generally transformed in a characteristic way (for this subdivision of the problem, see Maudlin 1995). This derivation, however, appears to be impossible."

Both from here

Regards Andrew

 


1 hour ago, vlaiv said:

I'm not sure why we are still arguing this?

Because it's fascinating!

1 hour ago, vlaiv said:

To my eyes, MWI is clearly flawed and should not be taken seriously. However, that goes against the fact that it is generally accepted as plausible. But then again, we have this:

I couldn't agree more. Consider the number of 'split' universes we'd have after an hour, when there's about 1.5×10⁵³ kg of matter in the known universe. Every new universe has the same number of splits every hour.
Now consider how many that could be after about 14 billion years. To me that's just irrational - especially when the same physicists will tell you that matter and energy cannot be created or destroyed!

I think the answer is around that 'special' moment of time we call now. All the potentials are there and the possibilities calculate out as MW theory suggests until now happens. After that there is one actual universe - the same one, with the event now part of history and unchangeable. That's why we can't go back in time - we wouldn't know which choices were made.

Michael


@andrew s

I think that the third aspect is not problematic, or rather that it resolves outside of the QM domain.

It is related to the fact that once a measurement is made, the evolution of the QM system "starts from that point" and the proper probability distributions expect the system to be in the "prepared" state. This is a fundamental part of "wave function collapse".

However, we can see that once the QM system transitions into the "ensemble" domain, it is no longer in a quantum superposition and regular probability rules apply. Then we no longer need to model the problem as a QM problem, but can model it as a probability problem.

For this reason, I would suggest we look at the following: imagine we have a regular coin and we do two tosses with it - but we do it in two different ways.

1. We toss the coin twice in a row but we don't look at the intermediate result - we let an automatic "score keeper" record the results. We are expected to calculate the probabilities of the different "total" outcomes, and we are right to say:

1/4 for each of hh, ht, th and tt.

2. We toss the coin once, look at it, record the value and want to give a final prediction before we toss it the next time. Now we can clearly see that we can no longer predict the final outcome as being 1/4 for each of hh, ht, th and tt: if we recorded that after the first toss the coin landed heads, we need to modify our prediction to 1/2 each for hh and ht (see the quick sketch below).
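A quick sketch of that update - nothing QM-specific about it, it is just ordinary conditioning on what has already been recorded:

```python
from itertools import product

# All two-toss sequences for a fair coin
outcomes = ["".join(seq) for seq in product("ht", repeat=2)]

# Case 1: nothing looked at yet - every sequence has probability 1/4
prior = {o: 1 / len(outcomes) for o in outcomes}

# Case 2: the first toss has been recorded as heads - condition on that fact
consistent = [o for o in outcomes if o[0] == "h"]
posterior = {o: (1 / len(consistent) if o in consistent else 0.0) for o in outcomes}

print("before looking :", prior)      # hh, ht, th, tt all at 0.25
print("after seeing h :", posterior)  # hh and ht at 0.5, th and tt at 0
```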

Decoherence deals with the measurement problem because it prevents superposition of the complex entangled states once the measurement is made and reduces things to normal probabilities. Once the system decoheres it is effectively in one of the eigenstates, because it can't be a superposition of them any more. It is just that we can't tell with certainty which eigenstate it will be, and our prediction can only give us classical probabilities of what that state can be.

 


I just realized that I too subscribe to the Copenhagen interpretation :D

Well, not the original one that treats the measuring apparatus as a classical system as opposed to a quantum system, but I also believe there is "wave function collapse" - in a sense.

I would not call it collapse; I would actually describe it as something else - a transition from a superposition of states into an ensemble of states. The difference being: a superposition of states, let's say 1/2 up + 1/2 down, is real - the system is in this complex dual state where the spin does not have one definite value - while an ensemble can be just due to our lack of knowledge: it is either up or it is down but we don't know which one, so we ascribe a probability of 1/2 to each.

Measurement, or rather decoherence, causes this transition. In that sense - it is wave function "collapse".
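A small sketch of why that distinction is physical and not just a choice of words (a single spin with 1/√2 amplitudes is assumed): the superposition and the 50/50 "don't know which" ensemble give identical up/down statistics, but a measurement along x tells them apart.

```python
import numpy as np

up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

# Genuine superposition (|up> + |down>)/sqrt(2), written as a density matrix
psi = (up + down) / np.sqrt(2)
rho_super = np.outer(psi, psi.conj())

# Ignorance ensemble: the spin IS up or IS down, we just don't know which
rho_mixed = 0.5 * np.outer(up, up.conj()) + 0.5 * np.outer(down, down.conj())

# Probability of "+" when measuring along x, i.e. projecting onto (|up>+|down>)/sqrt(2)
plus_x = (up + down) / np.sqrt(2)
P_plus_x = np.outer(plus_x, plus_x.conj())

print("P(+x), superposition:", np.trace(rho_super @ P_plus_x).real)  # 1.0
print("P(+x), ensemble     :", np.trace(rho_mixed @ P_plus_x).real)  # 0.5
```

Same 50/50 along z, completely different along x - which is exactly the difference between "is in both states at once" and "is in one of them but we don't know which".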


29 minutes ago, vlaiv said:

 

It is related to the fact that once a measurement is made, the evolution of the QM system "starts from that point" and the proper probability distributions expect the system to be in the "prepared" state. This is a fundamental part of "wave function collapse".

I think this is the core of the issue. 

The second quote above points out that if you model the whole measurement apparatus and the measured object using QM, the whole thing just undergoes unitary evolution - so how is the measurement (and the associated collapse) to be accounted for?

You describe just the evolution of the measured object. The measurement problem is how to describe the whole measurement process in QM, and that is, I think, still being researched.

5 minutes ago, vlaiv said:

I just realized that I too subscribe to the Copenhagen interpretation :D

Well, not the original one that treats the measuring apparatus as a classical system as opposed to a quantum system, but I also believe there is "wave function collapse" - in a sense.

I would not call it collapse; I would actually describe it as something else - a transition from a superposition of states into an ensemble of states. The difference being: a superposition of states, let's say 1/2 up + 1/2 down, is real - the system is in this complex dual state where the spin does not have one definite value - while an ensemble can be just due to our lack of knowledge: it is either up or it is down but we don't know which one, so we ascribe a probability of 1/2 to each.

Measurement, or rather decoherence, causes this transition. In that sense - it is wave function "collapse".

I think this is not quite how QM would put it, for a number of reasons. Say we know for sure, as we have just measured it, that it is spin up; a repeat measurement confirms this.

So we know the state of the system and we now measure along an axis tilted by 45°. We get up or down in that basis, each with a probability fixed purely by the angle (see the sketch below). No lack of knowledge, just the way QM is.
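A minimal sketch of those probabilities (spin-1/2 and the standard cos²(θ/2) rule are assumed; note that the exact 50/50 case corresponds to measuring along an axis at 90° to the prepared spin):

```python
import numpy as np

def p_up(theta_deg):
    """Born-rule probability of 'up' along an axis tilted by theta degrees
    from the axis the spin was prepared along: cos^2(theta/2)."""
    theta = np.radians(theta_deg)
    return np.cos(theta / 2) ** 2

for angle in (0, 45, 90, 180):
    print(f"theta = {angle:3d} deg: P(up) = {p_up(angle):.3f}, P(down) = {1 - p_up(angle):.3f}")
# theta =   0 deg: P(up) = 1.000 (repeat measurement confirms the state)
# theta =  45 deg: P(up) = 0.854
# theta =  90 deg: P(up) = 0.500 (maximal uncertainty despite knowing the state exactly)
# theta = 180 deg: P(up) = 0.000
```

Either way the point stands: the state can be known exactly and the outcome along another axis is still irreducibly probabilistic.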

Regards Andrew 


45 minutes ago, Synchronicity said:

Everything I know about this I've learned from Sean Carroll talks - can any of you recommend anything else for the interested amateur?

Thanks

Michael

You could try Feynman's QED: The Strange Theory of Light and Matter.

Regards Andrew.


55 minutes ago, Synchronicity said:

Everything I know about this I've learned from Sean Carroll talks - can any of you recommend anything else for the interested amateur?

Thanks

Michael

I think that a good start would be The Theoretical Minimum by Leonard Susskind:

https://theoreticalminimum.com/home

There are nice lecture videos on the YouTube channel, and you can use the website to navigate to particular lectures in the series.

