HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
David Deutsch on Physics Without Probability

Simon Benjamin · YouTube · 56 HN points · 2 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Simon Benjamin's video "David Deutsch on Physics Without Probability".
YouTube Summary
Audio and static slides from a talk given by David Deutsch, 1 June 2015.

For more information on Constructor Theory, see ConstructorTheory.org.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
Oct 12, 2022 · ThomPete on Probability (1963)
Any theory that is explainable and hard to vary does not need probability.

Can you explain how probability is used in statistical mechanics in a way that contradicts what I am saying?

P.S. Thanks for being a sport and talking about this in a truly respectful way.

If you want to understand where I am coming from, I can really recommend David Deutsch's talk about probability here: https://www.youtube.com/watch?v=wfzSE4Hoxbc

It was a big awakening for me once I realized not only where probability theory came from (not physics) but also how ingrained it is in our thinking.

kgwgk
The thing is that I'm not sure of what you are saying.

Can physics explain that if you put together a glass of water and a glass of ethanol you end up with a mixture of water and ethanol?

Do you agree that it's possible - but unlikely - that they don't mix?

From other comments of yours it seems that your conclusion would be "I have no way of knowing. It depends on the positions and velocities of each of the molecules and they will either mix or they won't."

I really would appreciate having a clear answer to the question "Can physics explain that if you put together a glass of water and a glass of ethanol you end up with a mixture of water and ethanol?". Don't be shy to answer "no" if that's your idea of physics!

ThomPete
I am not sure I understand the question.

"Can physics explain that if you put together a glass of water and a glass of ethanol you end up with a mixture of water and ethanol?"

Wouldn't that be chemistry (an emergent property of physics)?

But yes, we can explain whether it happens and how it happens, if we have a theory for it that has been tested.

If not, we will do trial and error/experiments and see what happens.

kgwgk
I don't understand the answer. (Assuming it's "yes".)

What is the explanation of why they will mix? Does it involve probabilities?

Why didn't you say "I have no way of knowing. It depends on the positions and velocities of each of the molecules and they will either mix or they won't."

(By the way, if I had asked about a container of neon and a container of helium would you have also claimed that asking whether they will mix is a chemistry question?)

ThomPete
The explanation of how different molecules combine when mixed has nothing to do with probability.

If you can predict that it happens exactly 10% of the time (e.g. every 10th time) then you have an explanatory model that can explain why that is, and you don't need statistics.

I don't think you are asking what you think you are asking.

kgwgk
What I'm asking is a quite simple physics question. I know what I'm asking - what I don't know is why you don't at least try to give a definite answer. I'm trying to make it clearer (let me know if it's not clear enough):

I have a container with two compartments, separated by a removable wall. In each side there is an equal volume of gas at normal temperature and pressure (helium and neon). What happens when the wall is removed?

A) The gases will mix.

B) The gases will not mix.

C) I have no way of knowing. It depends on the positions and velocities of each of the particles and they will either mix or they won't.

The usual (wrong?) answer is A. But it seems that you could (should?) answer C. Because you cannot rule out the possibility that they don't mix - you would be guessing.

Jan 05, 2021 · 56 points, 68 comments · submitted by sytelus
marta_morena_9
To speed things up: http://constructortheory.org/what-is-constructor-theory/

This is what the video talks about. I don't like this video much. It's tedious and lengthy.

Probably no reasonable human being would assume that quantum mechanics as we know it is the "holy grail" of physics. We came up with it within 100 years. Can we explain everything? Of course not.

To me, the fact that we have the uncertainty principle and Schrödinger's cat already proves that we basically built our understanding of physics on the premise that we can never understand it. We were basically throwing in the towel and saying "yeah, this is it". Advances are made through measurement and verification. How do you understand something that you can't measure, because the measurement changes the system?

It was pretty obvious to me when I studied physics that this is 100% not how the world works. It's a model, like everything else. In the model of quantum mechanics, which undoubtedly is one of the most impressive human achievements nonetheless, we have essentially complete uncertainty on the quantum scale. We can not peek beyond it and it doesn't give us any insights into how to peek beyond it. It is, in a way, the end of the line.

And like any end of the line in science, it needs a radically different thought or model to overcome. I always found it amusing how physicists, who really should know better, actually believe that quantum mechanics is how the world works and that this is it. I always was with Einstein in rejecting Quantum Theory. It's a nice model. But that's really all it is. A model. This is not how the world works and we have come to see its limitations. We need something better if we ever want to leave our solar system and colonize space. Even Einstein fell into the trap of thinking that Quantum Mechanics is more than a model. "God does not play dice". Probably he does not, but it's the best approximation we could come up with. He should have refuted it and looked for something better instead. Who knows where we would be now.

eigenket
People have already replied to a lot of what you wrote here, so I just want to add that Einstein emphatically did not reject quantum theory. Indeed he is probably one of the people who could fairly be called a creator of quantum theory.

What Einstein was rejecting with that quote is what we call the "Copenhagen Interpretation" of quantum theory. In my opinion he was quite right to do so; I do not know anyone in the quantum foundations community who takes Copenhagen very seriously any more. Einstein's qualms about Copenhagen led the way to us understanding its flaws, and to people like John Bell putting us on a path to fixing some of those flaws.

Einstein spent a lot of time looking for "something better" than Copenhagen, sadly he did not find it, but people who came later than him did (Everett, Bohm, the quantum Bayesianists etc).

I'm going to slightly repeat what others have said here and talk about your issue with the uncertainty principle. Think about a wave travelling on the surface of an otherwise smooth swimming pool. The wave is an extended object, it is spread out. When we model the wave the fact that we don't assign it a perfectly precise "position" is a feature, not a bug of our model. Similarly I would argue that the uncertainty principle is not some barrier "blocking" us from seeing the true values of observables as we would like. It is a statement telling us that our mental model of things having a precise position and momentum at the same time is wrong, as wrong as trying to locate the wave at a precise position on the pool would be.

l33tman
You're treading down a well-worn path here; Einstein went down this road as well, as you note, and got stuck there for the rest of his life.

Everybody in the field would WELCOME something we can measure but can't explain with QM. But there isn't, and not due to lack of trying. Cosmology aside (difficult to measure), measurements of femtoscale interactions and condensed matter "constants" match their corresponding QM/QFT calculations to something like 13-15 decimal places. There are just incredibly massive chunks of evidence in favor of QM. It's just not a "first approximation" of reality that can be thrown out easily.

Theorists and experimentalists in the field come up with new tests every year (they are my favorite papers to read!), so it's absolutely not like there is some inherent fear of contesting it. It's just that you need experiments to contest well-established and working theories; you can't just go out and say "well, but I don't like it".

By the way, as I'm sure you know if you studied this kind of physics, the "it doesn't give us any insights into how to peek beyond it" isn't really true; there are falsifiable assumptions that are tested and peeked at all the time - I'm thinking of the hidden-variable theorems and validations. The theorems don't assume any specific theory (QM or a new one would do). That road is a nice one to casually drive down.

aeternum
This idea seems similar to Wolfram's somewhat new hypergraph + rule idea: https://writings.stephenwolfram.com/2020/04/finally-we-may-h...
quiescant_dodo
This is a fascinating read. Thanks for sharing.
YeGoblynQueenne
>> I always found it amusing how physicists, who really should know better, actually believe that quantum mechanics is how the world works and that this is it.

Is this a reasonable stance to adopt? "All experts in a field disagree with me - how amusing". Amusing or not it suggests that perhaps you are missing something that those experts are aware of.

>> We need something better if we ever want to leave our solar system and colonize space.

The way I understand it (I am, myself, no expert in the matter), we could be colonising space right now; the challenges are primarily engineering and economic (it would cost too much to develop the necessary technology), but we are not really lacking fundamental knowledge of how to do it. E.g. generation ships could work for that purpose, in principle, but nobody is mad enough to build one, let alone ride it if one was built.

warbaker
You're familiar with Bell's Theorem, yes? QM puts significant constraints on any physical theory that might underlie it.
wruza
The uncertainty is not exclusive to QM and is a property of any ‘wavy’ system, it seems. It is inherent. Here is the example: https://m.youtube.com/watch?v=MBnnXbOM5S4 — if you want to defeat QM, attack its wave-part.

It would be nice to have much better (deterministic, measurable) physics, but some properties just emerge from a fundamental level and you can’t do much about it (monster group etc). Nature and maths don’t care about our confusion and convenience.

>should have refuted it and looked for something better instead

Afaik, the time when theories were driven by ideas is over. Today it’s petabytes of data that you can’t really argue with over a mailbox, only discuss. (I’m just a layman physicist with deep interest, but shallow understanding, so please don’t quote me on any of this)

simiones
Honestly, seeing the high-level requirements for what constructor theory defines as Possible / Impossible, I mostly expect that it will have to conclude that many of the transformations that happen in QM are Impossible (a constructor can't be conceived that could achieve those tasks with arbitrary precision and reliability).

Related to your observations, I believe you are absolutely right in that it's likely we've sort of reached the end of the road in exploring beyond the quantum scale with the current approach. I do believe there are some dangling threads that can be pulled to help guide some other directions - such as the measurement problem and the no-communication theorem.

I also have some hope that computer theory may be able to shed some light - at the moment, we have a clear distinction between quantum algorithms and classical algorithms, but we don't know if they are fundamentally different or not. A discovery of how to efficiently compute quantum algorithms on classical computers (i.e. BQP = P or at least BQP = NP) would likely be a major insight into QM itself. Conversely, proof that quantum computers are fundamentally different from classical computers would also hopefully show WHY/HOW they are different and help in this area. Alternatively, a finding that quantum computers are in fact not physically realizable, or that they require exponential memory or energy, would essentially prove that physical reality does not obey some of the properties that make quantum computers (apparently) more powerful than classical computers.

edna314
> Probably no reasonable human being would assume that Quantum Mechanic as we know it is the "holy grail" of physics.

Fully agreed. This, however, rests on the assumption that there is a 'holy grail' in physics (aka a theory of everything). It's a matter of taste, but I don't like the idea that there is a theory of everything, because it doesn't seem reasonable that every last bit of our universe is explainable by a single theory. Why would that be? Seems like a conspiracy, if this was the case. To me, it's rather quite comforting to assume that there is something (be it the position and momentum of an electron) which we won't ever be able to understand, because it just seems realistic. If this wasn't the case and we could show that there is a theory of everything, I would immediately start my quest to find Morpheus to ask him to give me the right pill to wake up.

renox
Strong assertions... I think that you'd better study the EPR paradox (E as in Einstein), Bell's inequality https://en.wikipedia.org/wiki/Bell%27s_theorem and the Alain Aspect (and other) experiments.

To summarize: reality isn't "local" even though we cannot send information faster than light..

eigenket
There are (at least) two local, deterministic interpretations of quantum mechanics (many worlds and superdeterminism).

You are correct that Bell-type experiments put very strong constraints on such theories. If your theory is local it is going to be very weird.

renox
As these 'interpretations' provide no new prediction, I think that Occam's razor applies here.
eigenket
Occam's razor is not trivial to apply when everyone claims that their own interpretation is the simplest.
l33tman
This is not meant as a snarky comment - but I would be happy to hear how many worlds is a local theory (as I guess it isn't a fully operational theory in the sense that QM+Born rule is). Even if you assume the independent, local evolution of all "branches" of Psi, at some point you need to explain interference (as that is an observed phenomenon in our real world), and the comparison of different swaths of Psi's is not a local operation. How is this handled in that interpretation?
eigenket
By local I mean local in the sense of special relativity. Stuff at a point in spacetime only affects stuff at that point and in its causal future. If you want to affect something somewhere else then you have to fire a light particle or whatever there.

Interference between different "branches" of the wavefunction is local if you're comparing the value of the two branches at the same point(s).

Edit: I should emphasise that interference is a completely local (in the sense I use it above) process in every interpretation of quantum mechanics I am aware of (even Copenhagen); generally the thing that makes quantum mechanics non-local in various interpretations is measurement.

l33tman
Yeah I guess we're thinking similarly in that case :)

It IS annoying though that it seems like the "last step" to produce measurable results requires a global integration, even though all the field interactions are local. The path integral QFT approach even makes this explicitly manifest, and modern QFT is the gold standard of reality-matching predictions no matter the field.

Oh BTW re your edit: I'm not sure what "local interference" would mean. If interference effects could be solved locally, there would be no entanglement, no violated Bell's theorem or EPR experiments etc.

eigenket
I think maybe we're talking about different things? Could you be more precise about the sort of interference effects you're talking about?

I generally wouldn't call entanglement an "interference effect". When I think about interference my mind goes to the patterns emerging from double slit experiments.

Consider a quantum teleportation experiment, I start with

N (a|0> + b|1>) ( |00> + |11> )

where N fixes the normalisation

I do a measurement on qubits 0 and 1 in the basis ( |00> + |11> ), ( |00> - |11> ), ( |10> + |01> ), ( |10> - |01> ) and observe the outcome associated with ( |00> + |11> ), so now my state is

N ( |00> + |11> ) (a|0> + b|1>)

Now we collect our grant money because we've teleported the state to the other end. I don't see where the interference effect(s) were, from my perspective everything looked local and nice except for the impact of measurement, which is nonlocal in the "standard" picture and unexplained in general.

I think my perspective on interference comes from the fact that water waves and sound waves (for example) exhibit interference in pretty much the same way that wavefunctions do, but we can't break Bell inequalities with them since we don't have the same weirdness around measurement.
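The state algebra in this exchange can be checked numerically. Below is a minimal numpy sketch (illustrative only, not from the comment itself); it implements the Bell measurement as a projection onto the observed ( |00> + |11> ) outcome and verifies that the message amplitudes reappear on qubit 2, with no interference terms involved:

```python
import numpy as np

# Message qubit a|0> + b|1> (arbitrary normalized amplitudes).
a, b = 0.6, 0.8
msg = np.array([a, b])

# Shared Bell pair (|00> + |11>)/sqrt(2) on qubits 1 and 2.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Full 3-qubit state, amplitude ordering |q0 q1 q2>; qubit 0 is the message.
state = np.kron(msg, bell)

# Bell measurement on qubits 0 and 1, post-selecting the (|00> + |11>)
# outcome: apply the projector |bell><bell| (x) I and renormalize.
proj = np.kron(np.outer(bell, bell), np.eye(2))
post = proj @ state
post /= np.linalg.norm(post)

# Read off qubit 2: reshape so rows index (q0, q1) and columns index q2.
amps = post.reshape(4, 2)
q2 = amps[0] + amps[3]      # components along |00> and |11> of qubits 0-1
q2 /= np.linalg.norm(q2)
print(q2)                   # the message amplitudes (0.6, 0.8) are on qubit 2
```

For the other three Bell outcomes the same sketch works up to the usual Pauli correction on qubit 2; the only non-local-looking step is the projection itself, matching the point made above about measurement.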

scotty79
Schrödinger's cat is not a physical thing. It just comes from an interpretation which has no mathematics behind it. There are many equivalent interpretations. And they are equivalent because none of them postulates any additional math to observe.
mrmonkeyman
He tried. Many people tried.
IIAOPSW
Quantum mechanics is a deterministic theory. It does not have "complete uncertainty at the quantum scale". The probabilistic aspect only comes into play when you cast a quantum state down to a classical observable quantity like position or momentum.

You may have missed the subtle but important aspect of the uncertainty principle. It is not the case that there is a true underlying value of position and momentum for which the error bars are limited by a quirk of the theory. It is more like you expect to be able to observe 2 bytes of position information and 2 bytes of momentum information, but the underlying quantum state is defined by 2 bytes in total. The full 4 bytes cannot fit inside 2 bytes. You can have a state which encodes the 2 bytes of position information but no momentum information, a state which encodes 1 byte of each, a state which encodes 2 bytes of momentum information but no position information, and everything in between. The uncertainty principle is actually a non-existence principle. When the position information exists, the momentum information isn't uncertain; it just outright doesn't exist.
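The "bytes" picture above has a concrete wave counterpart: position and momentum spreads are Fourier duals, so squeezing one necessarily widens the other. A small numpy sketch with a discretized Gaussian wavepacket (illustrative; the grid size and widths are arbitrary choices, not anything from the thread):

```python
import numpy as np

# Discretized Gaussian wavepacket on a 1D grid. For a Gaussian the
# position/momentum spreads saturate the bound sigma_x * sigma_k = 1/2.
n = 4096
x = np.linspace(-50, 50, n)
dx = x[1] - x[0]

def spreads(sigma):
    psi = np.exp(-x**2 / (4 * sigma**2))          # position-space amplitude
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize
    phi = np.fft.fftshift(np.fft.fft(psi))        # momentum-space amplitude
    k = np.fft.fftshift(np.fft.fftfreq(n, dx)) * 2 * np.pi
    px = np.abs(psi)**2 * dx                      # Born-rule probabilities
    pk = np.abs(phi)**2
    pk /= pk.sum()
    sx = np.sqrt(np.sum(px * x**2))               # std devs (mean is 0)
    sk = np.sqrt(np.sum(pk * k**2))
    return sx, sk

for sigma in (0.5, 1.0, 2.0):
    sx, sk = spreads(sigma)
    print(f"sigma_x={sx:.3f}  sigma_k={sk:.3f}  product={sx*sk:.3f}")  # ~0.5
```

Narrowing the packet in x (smaller sigma) fattens it in k and vice versa, while the product stays pinned near 1/2: the two descriptions share one budget of information, much like the 2-bytes picture.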

DebtDeflation
>It is more like you expect to be able to observe 2 bytes of position information and 2 bytes of momentum information but the underlying quantum state is defined by 2 bytes in total. The full 4 bytes cannot fit inside 2 bytes.

So the programmer who wrote the simulation program took a shortcut. "No one will ever look at reality at this granular of a level, we should be good." Figures.

Seriously though, great explanation.

treeman79
https://m.youtube.com/watch?v=RlXdsyctD50&feature=emb_rel_pa...

PBS Spacetime has a great video on pilot wave theory.

Getting rid of all the messy probability issues is amazing.

eigenket
I apologise for replying with what is essentially a rant, but as far as we can tell simulating quantum mechanics is exponentially (in the precise, mathematical sense) harder than simulating classical mechanics.

I find it very "triggering" when people use this metaphor of uncertainty as a programming shortcut, when my life (currently working on simulating quantum systems) would be substantially (exponentially) easier if reality was classical.

IIAOPSW
The metaphor still works if you view it as a programming shortcut done on a quantum computer.

>Hey Alice, we don't have that many qubits. Think you can put position and momentum on the same register? Sure thing Bob.

DebtDeflation
I think we may be talking about different simulations. I'm talking about THE simulation.
simiones
I think the GP understood this, but their point was that it is a false assumption that introducing the uncertainty principle in a classical simulation would simplify the computation.

That is, if we think that the universe is a simulation running on a classical computer, then quantum effects such as the uncertainty principle CAN'T be a "shortcut" to make the simulation easier - as in fact they make the simulation literally exponentially harder. If we assume that the universe is a simulation running on a quantum computer, then quantum effects are fundamental anyway.

Of course, everyone understands that this is a simple joke. Still, it's interesting that it goes completely against our intuitions as computer simulation designers.

FeepingCreature
Interested layman, but this always kind of rubs me the wrong way.

I consider it a universal warning sign when your physical theory postulates that reality works a certain way underneath, and then conspires to keep you from exploiting this way. Superluminal collapse, for instance, or closed timelike curves; it seems to my intuition that these are clearly flaws in the theory rather than properties of reality. In a sense, the sci-fi concept of the ansible, where quantum collapse effects are exploited for FTL signaling, is more plausible than the view where quantum systems collapse instantly, but cannot be exploited for FTL. If your theory requires FTL in the "backend" to operate, it shouldn't end up with FTL just happening to not be exploitable in the "frontend". So I expect that when we find the true ToE, it's not going to end up with traits where the data structure in principle allows a certain capability, but coincidentally doesn't give us access to it; all limitations of that theory would follow directly from its structure, so that illegal operations cannot even be expressed. So when we find a practical limitation to the exploitation of a physical theory, then this seems to me to require, barring very good theoretical reasons to the contrary, a genuine underlying property of the universe that justifies this limit.

As such, it seems to me, purely from an intuitive view, that of those three claims:

- reality is quantum

- quantum mechanics requires exponential cost to simulate

- quantum mechanics cannot be exploited for exponential computation

One has to be false! If reality requires exponential cost to simulate, we shouldn't expect it to then turn around and hide that capability from us. God famously doesn't play dice with the universe; but I doubt he's playing Poker either. Personally, I suspect quantum mechanics is cheaper than is widely believed.

Idk, am I missing something obvious?

eigenket
I think almost everyone in the field would reject your third claim. There is a great deal of work being done on applying the speedup afforded by quantum mechanics to classically hard problems.

The current consensus (at least among people I speak to) is that a quantum computer will probably be able to solve some problems which are in NP (and not in P), but will probably not be able to solve an NP-hard problem.

Your intuitive view is appealing, but I don't think one should get too hung up on intuition. Our intuition was honed for millions of years to avoid large wild animals and avoid falling out of trees in prehistoric Africa. It generally sucks pretty badly at making guesses about the fundamental nature of the universe (or at least mine does).

Edit: by the way a faster than light communication "ansible" is functionally equivalent to a machine that sends messages back in time. I disagree with your view that such a machine is more plausible than the view that some ftl stuff happens on the "backend" but is hidden from the "frontend". Having said that my view is that there is no ftl stuff on the back or front end, this is consistent with something like the many worlds interpretation of quantum mechanics, which is entirely local and causal (no ftl stuff).

FeepingCreature
I'm aware and agree that quantum computer speed is very probably greater than classical and may allow some NP. My point is more that I'd expect the performance cost of QM to equal the complexity class of QM. I don't know, is that the case in current research? Whenever people talk about how expensive it is to compute QM, that always sounds a lot higher than the actual performance they get back out. Or maybe I just have a bad imagination for how high NP really is.

Disagreeing again with the "Ansible is unrealistic" view: I agree it's unrealistic in a global sense, I just think all of its unrealism comes from the use of FTL in the backend. FTL + relativity produces timetravel, yes. I just think it's implausible to expect a physical universe to first provision the mechanics for FTL, then provision a system that gives you timetravel when exposed to FTL, and then very carefully separate those two systems so that they computationally never touch, even though in theory they could! and so in a sense, this physical theory implies that this universe has to be prepared, in principle, to allow time travel and account for it, and having gained this capability has elected to specifically not use it. It just stinks of a massive Occam violation - the potential for time travel is an "entity without necessity", and the theory is bloated by it twice over; once by its inclusion in the physical logic and once again by the additional exclusion mechanism that makes it unusable in every reachable state. And so the ansible universe pays slightly less cost just by leaving out the global censor.

Am I saying "no correct theory can look like this"? No! I am saying it enters the race with a huge handicap.

simiones
> My point is more that I'd expect the performance cost of QM to equal the complexity class of QM.

My understanding is that that is a yes - with a quantum computer, quantum simulations [are expected to] run in linear time instead of the current exponential time.

eigenket
I think this is not correct. A fundamental problem in quantum simulation of quantum systems is the Hamiltonian problem, where you give me a Hamiltonian H, which is the sum of polynomially many little local Hamiltonians H_i each of which acts on at most k subsystems of your big total system.

Then you give me two numbers a and b (with some technical constraints) and ask me whether the ground state energy of H is between a and b.

This problem is QMA-complete as long as k >= 2, and known to be QMA-hard even for some pretty nice looking Hamiltonians. Here QMA stands for Quantum Merlin-Arthur which is the complexity class equivalent to MA (Merlin-Arthur) for classical computers. You can think of MA as being related to BPP in the same way that NP is related to P, and this is the same way that QMA is related to BQP.

Basically QMA problems are not expected to be solved efficiently by a quantum computer.

simiones
I am way out of my depth here, thank you for weighing in!
simiones
> - quantum mechanics cannot be exploited for exponential computation

As far as we know for now, this one (or a variant of it [0]) is the wrong claim: we do have algorithms (Shor's, Grover's) that are faster on quantum computers than any known classical algorithm.

Of course, we do not yet have any proof that a faster classical algorithm can't exist, so the possibility remains open that this claim remains true. BUT, if this claim turns out to be true, then it is extremely likely that either one or both of the other 2 claims will immediately turn out to be wrong.

[0] I would note that the "exponential" part is somewhat of a red herring. If there is any kind of asymptotic speed-up that QCs fundamentally have over classical computers, even if it's just quadratic, then the larger point will remain valid.

> In a sense, the sci-fi concept of the ansible, where quantum collapse effects are exploited for FTL signaling, is more plausible than the view where quantum systems collapse instantly, but cannot be exploited for FTL.

This is also what rubs me the wrong way about entanglement, and seems a very tantalizing thread to pull at if one wanted a more fundamental theory. The other thread of course is the measurement problem.

eigenket
Your two threads seem to be essentially the same thread. The "weird stuff" in entanglement comes pretty directly from collapse as part of measurement. If you drop this and go to something like many worlds then things look a bit nicer, but then you have to solve the measurement problem in many worlds, which seems quite hard.
Koshkin
But collapse is fake news, as it were. A state can be called a “superposition” only relative to a particular set of basis states, which, in turn, depends on what observable we are talking about, and the “collapse” is simply yet another way a state can change.
simiones
No, collapse or some alternative to it is a fundamental component of quantum mechanics. Without it, quantum mechanics makes wildly wrong predictions.

The Schrodinger equation almost always predicts that a particle has some particular amplitude at many different locations. However, if you try to detect a particle, you will never find it at more than one location. Furthermore, once the particle is detected at one particular location, you need to update the Schrodinger equation to give it amplitude 1 at that location and 0 everywhere else - if you don't perform this nonlinear update and instead use the old wave-function, the predictions for further experiments will be completely off.

Now, the physical interpretation of this collapse varies wildly between different interpretations of QM, but some variant of it is always required - otherwise, the observations simply do not match the math. We don't yet have any theory consistent with QM that can make predictions without applying the Born rule.
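The nonlinear update described in this comment is simple to state concretely. A toy numpy sketch of Born-rule sampling followed by the collapse update (the five "locations" and their amplitudes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy wavefunction: complex amplitudes over five discrete detector locations.
amps = np.array([0.1 + 0.2j, 0.5, -0.3j, 0.6, 0.2])
amps /= np.linalg.norm(amps)          # normalize so sum |a_i|^2 = 1

probs = np.abs(amps)**2               # Born rule: P(i) = |a_i|^2
i = rng.choice(len(amps), p=probs)    # measurement: exactly one outcome

# Collapse: the post-measurement state has amplitude 1 at the observed
# location and 0 everywhere else; further predictions must use this state,
# not the pre-measurement one.
collapsed = np.zeros_like(amps)
collapsed[i] = 1.0
```

The Schrodinger equation alone evolves `amps` linearly and never produces the one-hot `collapsed` vector; the sampling step and the overwrite are exactly the extra, nonlinear ingredient the comment is pointing at.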

FeepingCreature
I believe that in Many-Worlds, the update step falls out of decoherence "for free." Because you are also in superposition after the measurement, the "you have measured 1" part of your wavefunction of course measures amplitude 1. That doesn't give you the Born rule though - I believe there are some attempts to get it out of game theory [1], where they show that given sufficiently long timespans, "almost every" worldline has a history with quantum measurements whose distribution matches the Born probabilities.

[1] https://arxiv.org/abs/0906.2718

dguest
It's also worth skimming the wikipedia page on quantum decoherence [1].

Quantum mechanics forms a consistent theory without any concept of randomness or wave function collapse, and decoherence explains how this evolves into a classical observation when things get "big". There is nothing fundamentally random about this evolution. We still use random numbers (wave function collapse) to approximate observations, because in many cases quantum calculations for macroscopic objects would be intractable, and because they give the same answer to any precision we can hope to measure.

Every experiment that demonstrates quantum entanglement is essentially just showing that quantum mechanics still applies on systems that were previously thought to behave classically. And none of these experiments have demonstrated a mechanism for wave function collapse beyond what we already understand from decoherence.

[1]: https://en.wikipedia.org/wiki/Quantum_decoherence

simiones
No, that is a misunderstanding of what decoherence does or does not buy you.

Decoherence does explain why macroscopic systems can't (normally) exhibit self-interference and some other quantum effects.

However, decoherence doesn't in any way solve the other aspect of the measurement problem: why the object is only ever found entirely at one location instead of appearing with different amplitudes at multiple locations, as the Schrodinger equation predicts. That is the source of the "fundamental randomness" and it is not explained by decoherence.

For a slightly more mathematical (but still very understandable) explanation of what decoherence solves and what it doesn't solve, I recommend this article [0] by Sabine Hossenfelder.

[0] https://backreaction.blogspot.com/2020/08/understanding-quan...

dguest
Exactly, you end up with objects appearing at multiple locations that don't interfere with each other. This is completely consistent with observation, it's just weird: it means all the outcomes actually happened and the observer just can't see it.

So decoherence either fails to solve the measurement problem or solves it by saying the universe is weird.

simiones
It is certainly not consistent with observations. Imagine you set up two detectors at two locations. Imagine further that the Schrodinger equation shows that the particle has the same non-zero amplitude at both locations.

Nevertheless, you will NEVER see both detectors detect the same particle. You know with 100% certainty that if one detector "saw" the particle, the other one didn't (this is true regardless of how far apart the detectors are).

Furthermore, imagine an experiment like this:

1. A particle P1 is fired, and it has some amplitude at locations Det1 and Det2.

2. At location Det1, a second particle P2 is made to collide with P1; the collision will throw P2 towards location Det3.

Now, the correct probability of finding P2 at location Det3 depends on whether Det1/Det2 are activated. If Det1 detects P1, the probability for P2 to reach Det3 is 0, but this is inconsistent with the Schrodinger equation for P1 and P2. To make the results consistent with observations, you must update the wave function for P1 and P2 after Det1 has detected P1.

Now, the MWI tries to circumvent this by postulating that both results happen at the same time, just in different worlds; and that Det3 becomes entangled with Det1, so their results are perfectly correlated. But that runs into other problems, both philosophical (how could the other worlds be "real" if they are fundamentally undetectable) and formal (e.g. it is not yet agreed that a consistent, non-circular definition for probability in the face of many worlds can be given that is consistent with the observed probabilities computed according to the Born rule).

Edit: I should also mention that without the Born rule, if Det1 and Det2 were both going to launch a satellite if they detected P1, then both satellites would have some amplitude in orbit; and you would start expecting to see the gravity of both satellites, which breaks off with experimental observations even more sharply.
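The update being described can be put in a toy sketch (my own illustration, not the commenter's formalism; the detector names follow the example above): the Born rule maps amplitudes to probabilities via p = |a|^2, and the post-detection "update" replaces the superposition with the single detected outcome.

```python
# Toy sketch: Born-rule probabilities for a particle with amplitudes
# at two detector locations, and the renormalization ("update")
# applied once Det1 registers the particle.

def born_probabilities(amplitudes):
    """Map complex amplitudes to probabilities via p_i = |a_i|^2."""
    norms = [abs(a) ** 2 for a in amplitudes]
    total = sum(norms)
    return [n / total for n in norms]

def collapse(amplitudes, detected_index):
    """Project onto the detected outcome and renormalize: every other
    amplitude becomes 0, the detected one becomes 1."""
    return [1.0 if i == detected_index else 0.0
            for i in range(len(amplitudes))]

# Equal non-zero amplitude at Det1 and Det2, as in the example above.
psi = [1 / 2 ** 0.5, 1 / 2 ** 0.5]
print(born_probabilities(psi))  # equal chances before any detection
print(collapse(psi, 0))         # after Det1 fires, Det2's amplitude is 0
```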

kgwgk
> The probabilistic aspect only comes into play when you cast a quantum state down to a classical observable quantity like position or momentum.

That's also part of quantum mechanics. At least if you are trying to do physics.

https://www.math.columbia.edu/~woit/wordpress/?p=10533&cpage...

ilikedthatone
wow, this is "the" best explanation I have ever heard as an outsider to physics. Thank you. Actually, this made me feel comfortable with the universe and the effort explaining it. I think anyone who speaks about information must use bytes!
mhh__
I'm not convinced it's right. Einstein was wrong on some aspects of it too, i.e. local hidden variables.
simiones
That is one interpretation, but the reality is that we do not know. In fact, what you are describing is the position known as anti-realism - that quantities that can't be measured are not real - a major feature of the Copenhagen interpretation.

But in other theories (those termed "realist"), such as Pilot Wave or Superdeterminism, the uncertainty principle is exactly an artifact of measurement. A particle has an exact position, momentum, energy, spin etc, but they can't be reliably measured all at once.

In the end, these are philosophical positions, outside of our current mathematical and physical tools. Perhaps in time we will be able to settle these questions, but for now there are no definitive answers.

lisper
> In the end, these are philosophical positions

Not really. Realist theories are actually just rhetorical sleight-of-hand. Bohmian positions, for example, are supposedly "real" but they cannot actually be measured, and this inability to measure them is not a technological limit, it is an inherent part of the interpretation. Bohmians essentially take the randomness, hide it in the infinite digits of a real (no pun intended) number, call that number "the actual position of a particle" and claim that, because it's "the actual position of a particle" that it is somehow "real". Well, it's not real. You can't measure it. You've just hidden the randomness behind some clever rhetoric.

> the reality is that we do not know

Yes, we do. The Bell inequalities and concomitant experiments show that quantum randomness is fundamental. You can hide it or sweep it under the rug but you can't get rid of it because it's actually part of our reality.

jostylr
The Bohmian model is that particle positions are an intrinsic part of the evolution of the theory. This is in contrast to, say, the Copenhagen interpretation, in which the particle position is not a part of the evolution of the system. In the CI, wave functions are real (I guess) and we need something external to the model, called measurements, which mathematically do some complicated operation to the wave function of putting it in an approximate eigenstate (it can't do this exactly, the approximate nature is not specified, and the timing is generally specified but not really clear for time-of-arrival measurements in scattering experiments).

In the Bohmian model world, particle positions would be real. They may not be measurable by the inhabitants to infinite precision (assuming quantum equilibrium, i.e., the particles are distributed according to psi-squared), but the particles have definite positions in this world. It has in it no need to postulate anything about measurement. Observables and all that are deduced from the theory.

It is important to ask what in a model has a fundamental existence, what has an implied existence, and what is just a useful term for something secondary to all that. In Bohmian mechanics, particles (things with position) have a fundamental existence, the wave function has an implied existence because that is how the particles move about, and spin has a secondary existence since it does not have a separate existence from either of those two objects, being solely deduced from the motion of the particles.

The Bohmian model is well-defined and has an easy correspondence between the entities that are real and our experience (stuff has position). It does not require observers to be complete. A Bohmian world could happily exist without PhDs, humans, mammals, life forms, etc.

What's real in CI? Not really sure, but I guess the wave function and measuring devices? It does require something that corresponds to measurements and the evolution of that classical world is distinct and separate.

This is the context of calling BM a realist theory. I would just call it an actual theory.

Bohmian mechanics illuminates the quantum randomness, allowing it to be analyzed. The theory happens to be deterministic (ignoring creation/annihilation), but it derives the apparent randomness from all that and it clarifies the meaning of psi-square.

simiones
> What's real in CI? Not really sure, but I guess the wave function and measuring devices? It does require something that corresponds to measurements and the evolution of that classical world is distinct and separate.

The wave function and any of its behaviors are not considered Real in CI, quite the opposite. CI posits that the wave function / schrodinger equation is just a mathematical tool that can be used to model the behavior of quantum phenomena - it is merely a tool that we can use, together with the Born rule, to predict the outcome of measurements. The measurements are the only things that are real. That's why CI is also sometimes summarized as "shut up and calculate" :).

simiones
There are many classical systems that have properties we consider real that are nevertheless not practically measurable. For example, grains of sand have definite position and momentum, yet you can't in reality measure them for each grain of sand when you want to describe the flow of sand in an hourglass. So the practical ability to measure something is not required for us to consider something "real".

Now, Bell's inequalities show that any theory of local hidden variables independent from the measurement apparatus is inconsistent with QM and with experimental observations. However, that still leaves open the possibility of non-local hidden variables, and it also leaves open the possibility that the experiments have been flawed - that the measurement decision was somehow correlated with the result.

The Copenhagen interpretation is BOTH non-real and non-local, so I don't find a real non-local theory a priori problematic. Whether pilot wave theory can be made consistent with at least special relativity is still open as far as I know, but perhaps there is some equivalent of the no-communication theorem that will be found.

Superdeterminism is also seeing a resurgence - I can't claim to understand how it could be made into a real scientific theory, but perhaps there is something that is escaping me and we will see a successful theory come out of it.

Now, the matter of whether something can be considered "real" but fundamentally un-measurable (as opposed to just being practically un-measurable) is in the end a problem of definitions.

I would also note that there is no fundamental quantum randomness - quantum phenomena behave according to a linear equation, so they are fundamentally deterministic. It is only measurement (the interaction between quantum systems and classical systems) that introduces randomness - the superposition of all the solutions to the Schrodinger equation is replaced with a single position and momentum with some probability, and with error bars that are subject to the uncertainty principle.

lisper
> grains of sand have definite position and momentum

No, they don't. The uncertainty principle applies just as much to grains of sand as it does to electrons. The only reason they appear to have definite position and momentum is that their masses are large, so a very small velocity produces a very large momentum which gives them a very small wavelength, much smaller than their physical extent. But the uncertainty principle still applies. With the right technology you could do a two-slit experiment on grains of sand. See: https://www.wired.com/story/even-huge-molecules-follow-the-q...
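To put rough numbers on this (my own back-of-envelope sketch; the masses and velocities below are illustrative figures, not measured values): the de Broglie wavelength is lambda = h / (m * v), and for a grain of sand it comes out vastly smaller than the grain itself.

```python
# Back-of-envelope de Broglie wavelengths, lambda = h / (m * v).
# Masses and velocities are rough illustrative figures only.

H = 6.626e-34  # Planck constant, J*s

def de_broglie(mass_kg, velocity_m_s):
    return H / (mass_kg * velocity_m_s)

electron = de_broglie(9.11e-31, 1e6)  # electron at ~10^6 m/s
sand = de_broglie(1e-6, 1e-3)         # ~1 mg grain drifting at 1 mm/s

print(f"electron:   {electron:.2e} m")  # sub-nanometer scale
print(f"sand grain: {sand:.2e} m")      # far below the grain's size
```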

tsimionescu
> With the right technology you could do a two-slit experiment on grains of sand.

This is an assumption, not yet proved by experiment.

Regardless, I was talking about the classical model, not about whether QM applies to macroscopic objects. My point was that even in a perfectly classical model, there are unmeasurable quantities that are nevertheless considered real.

Koshkin
To be fair, the parent was talking about "classical systems," not about the applicability of quantum-mechanical principles to macroscopic objects.
lisper
But there is in reality no such thing as a classical system. The world we live in is quantum. Its behavior approaches that of a classical system as the number of entangled degrees of freedom grows, to the point where classical physics becomes a damn good approximation. But reality never actually becomes classical. The wave function does not actually collapse. And grains of sand do not have actual positions and momenta.
simiones
Whether this is actually true or not remains an open question, but it is indeed the expected consequence of QM. However, there is as yet no experiment showing any kind of quantum behavior for huge objects (closest is a few hundred million atoms as far as I know).

Still, the point was about a classical model, not about physical reality being classical or quantum. To rephrase, my point was that even in a classical model there are quantities that can't practically be measured (even though classical mechanics as a model assumes all objects have definite positions and momenta), and that doesn't stop us from accepting in such a model that they are 'real'. So, the idea of a property that is 'real' but not really measurable already exists in classical physics; it is not an invention of exotic interpretations of QM.

jostylr
> Whether pilot wave theory can be made consistent with at least special relativity is still open as far as I know

It can be made consistent, namely, adding a foliation (a notion of a canonical present moment) makes it work out fine. The foliation is not detectable, it gives the right predictions, it is mathematically sound, etc.

But it does feel philosophically sketchy and not fundamentally relativistic. However, there is a paper https://arxiv.org/abs/1307.1714 which explores how to derive foliations from the wave function itself and thus would be as relativistic as in theory involving the wave function.

The article also points to other articles with a variety of ideas how to deal with nonlocality in Bohmian mechanics.

contravariant
I think they're begging the question a bit when it comes to thermodynamics. In their video on http://constructortheory.org/what-is-constructor-theory/ they simply claim that no constructor exists that can turn a cooked egg into a raw egg and that allows their theory to reason about thermodynamics. Now presumably this is meant to be about entropy and not merely the amount of energy. However given the exact right amount of energy it is possible for a constructor to 'uncook' an egg, it's just that this constructor will work only for exactly that egg at exactly that time.

The resolution is that you're extremely unlikely to pick the right constructor, but then we're back to probability again. In fact, anything that recovers the laws of thermodynamics simply ends up recovering probability theory, as the laws of thermodynamics are inherently probabilistic.

quirky_mind
Looks like someone reinvented operators. Or am I missing something here?
contravariant
Well I guess they're assuming the operators are more physical than theoretical and they should conserve things (so I think they're limited to hermitian operators). But yeah that sounds about right.

Not that I mind that they're looking for a different foundation but you can't just gloss over the problem of explaining statistical physics without statistics...

Chinjut
Their definition of "constructor" is as something that will perform a task reliably (whose meaning they define as including repeatably) for any choice of input with the specified attribute. If you can't come up with some reliable repeatable way of turning any cooked egg into a raw egg, then you don't have the thing they call a "constructor" for turning cooked eggs into raw eggs.
myWindoonn
Knowing a bit of maths, I'm having trouble understanding how this isn't just categorical logic. They say that the laws of physics should be expressible in terms of statements about which physical transformations are possible or impossible, and why. In categorical logic, laws are expressible in terms of statements about which morphisms exist or would be contradictory/absurd to exist, and include proofs. What's the difference?

I read through a couple of their preprints but didn't find much concrete material, just a lot of complaints about existing theoretical physics traditions.

lisper
I did a deep dive into Deutsch's game-theoretic interpretation of QM (which he talks about mid-way through the talk) a while back. This is the result:

http://blog.rongarret.info/2019/07/the-trouble-with-many-wor...

markus_zhang
Watched 30 minutes and pretty much didn't get what he wanted to express, which is pretty fine because I'm not well versed in the art of Physics.

Just one thing about probability. My theory is that we use probability because we don't really understand the true "why" of things, or we don't know them. For example, we use probability to guide us when playing poker, but that's because we don't know the cards. Once we know the cards there is no need for probability.

But again, he is probably not speaking about the same "level" of things. I wish I knew more Physics.

jvanderbot
All models are wrong, but some are useful.

I'm with you, we use probability to wave away effects we don't understand or cannot measure. There's a flat rate probability that each of us has cancer right now, when perfect knowledge of our bodies could answer the question precisely. Each doctor exam we have reduces or increases that probability until we subjectively understand it's time to sample a point of the body and know for certain.

The local sample is related to the larger system, and the hidden variable (cancer or not). The local sample at the time of capture has a probability of being malignant. We measure directly and back propagate the results to the system.
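The "reduces or increases that probability" step described above is just Bayes' rule. A minimal sketch with invented numbers (none of these likelihoods come from real medical data):

```python
# Illustrative only: Bayes-rule updating of a "flat rate" prior as
# diagnostic evidence arrives. All numbers below are made up.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | E) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1 - prior))

p = 0.01                       # hypothetical base-rate prior
p = bayes_update(p, 0.8, 0.1)  # a concerning exam finding raises it
print(round(p, 3))
p = bayes_update(p, 0.05, 0.9) # a clean follow-up test lowers it again
print(round(p, 3))
```

Each exam moves the subjective probability up or down until, as the parent says, it crosses a threshold where sampling directly is worth it.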

But, unintuitively, there's been some famous results that seemingly preclude a deterministic explanation of quantum mechanics. See:

- https://en.wikipedia.org/wiki/Hidden-variable_theory

- https://en.wikipedia.org/wiki/Bell%27s_theorem

f430
This is absolutely fascinating. This could be something quite ground breaking.
roenxi
It is an interesting approach to physics and good luck to them. But the video isn't promising. The reason a physicist models a thing as a random variable is because they don't know of a way to distinguish the thing from a random variable.

Dr. Deutsch spent far too much time talking about the fact that at a high level there are a lot of things we are certain about. That is very weak evidence of what is happening at the quantum level. The important part isn't conceiving of something as deterministic. That is easy to do. Proving it experimentally will be the hard part.

There is nothing to report until there is an experiment that people can hold up saying "see! we found it! This part is deterministic not probabilistic!".

thingification
From memory of listening to this a few times some time ago (caveat: I certainly didn't study the content of this video carefully, but I do understand some significant context of his work as he sees it reasonably well I think):

Deutsch here isn't trying to modify quantum mechanics, he's trying to "interpret" it. He wouldn't use the term "interpret" of course, for excellent and fascinating reasons that I can't quickly cram into an HN post, but you can find them in his book The Beginning of Infinity (I was convinced about this area of what he says in the book, not about everything else he says there -- there's a LOT of stuff in there!).

By the way, I don't know whether or not you meant to imply that Deutsch is not a physicist, but he is -- and well known for his work on the fundamentals of quantum physics.

scotty79
> Dr. Deutsch spent far too much time talking about the fact that at a high level there are a lot of things we are certain about. That is very weak evidence of what is happening at the quantum level.

Especially given that the sum of a billion random variables is a determined and pretty precise value.
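That concentration is the law of large numbers; a quick sketch of the point (pure Python, with a fixed seed for repeatability):

```python
# The mean of many independent random variables concentrates sharply
# around 0.5, even though each draw is individually unpredictable.
import random

random.seed(0)

def mean_of_n(n):
    return sum(random.random() for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(n, round(mean_of_n(n), 4))  # deviations shrink ~1/sqrt(n)
```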

YeGoblynQueenne
Well, one's horoscope is also a set of "determined and pretty precise" values but that's not to say that we have learned anything useful from looking at a person's horoscope, except from what can be informed by a belief that it influences a person's behaviour.

That is to say, that the value of a variable can be precisely known doesn't say anything about the utility of knowing that value. Accordingly, the talk in the linked video is not arguing that the values of random variables cannot be determined with precision, rather that doing so is not sufficiently informative for the purposes of understanding physical reality.

thingification
> There is nothing to report until there is an experiment that people can hold up saying "see! we found it! This part is deterministic not probabilistic!".

To add a bit more on top of my earlier comment: I believe Deutsch would point out that all experiments that have ever been done do corroborate the determinism of quantum phenomena. Experimental tests are only ever understood in terms of theory, and quantum theory, which Deutsch identifies with the many worlds "interpretation", says that the world evolves according to Schrodinger's equation. Other "interpretations" of quantum theory are actually rival theories that fall at the first hurdle of being worse theories. To Deutsch, a bad theory is one that is easy to vary -- in this case, for example, the "Copenhagen interpretation" says that QM -- i.e. Schrodinger's equation -- is true, except when you measure something -- and then it says that you're not allowed to ask what that means. Why not? You're just not, OK?

This may sound like sophistry, but it isn't. Consider a closely analogous case:

Denying that experimental tests of QM have corroborated that theory -- which is a deterministic theory -- is like saying that dinosaur bones are not evidence of dinosaurs because none of the experimental tests distinguish the theory that dinosaurs really existed from the one that says that God put the bones there: true, but that theory is ruled out because it's a bad explanation (easy to vary: why God, for example?).

simiones
The MWI doesn't really solve the measurement problem - it still doesn't explain in any way why two different detectors that according to the Schrodinger equation should both see the same amplitude for a particular particle at their location will never both detect the particle - only one of them will.

However, we also know that the original particle can actually interact with two different particles at the two different locations.

MWI can't actually explain this well-confirmed observation (and no, decoherence doesn't explain it - it just explains why macroscopic systems don't exhibit interference with themselves or with other macroscopic systems).

At a more relevant practical level, there is no theory that can predict what value a quantum measurement will have beyond probabilities. This is an easily observable fact - and finding the way to perfectly predict the value of a quantum measurement would probably make QM inconsistent with special relativity for example.

IMHO, a better title would be "Probabilistic Theories Of Everything"

Contrast with the recent work from David Deutsch, which eliminates all probabilities from physics.

Intro to Constructor Theory: https://youtu.be/wfzSE4Hoxbc

platz
this is basically just what happens in any deterministic hidden variable theory
asdfasgasdgasdg
Yeah, the aliens who are simulating us could be using a 64 bit prng at the quantum level and we would still never know the difference, because our measurement technology isn't there. The universe would be deterministic but it's unlikely that we'd ever be able to prove it, unless they left breadcrumbs intentionally.
Johnjonjoan
Let's be honest at least one of them must have done some premature optimization. That's what we should be looking for!
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.