• 0 Posts
  • 9 Comments
Joined 5 months ago
Cake day: July 7th, 2024

  • No, they are not; they are incredibly wealthy millionaires whose campaigns are bought and paid for by billionaires. The Democrat party is actively supporting an ongoing holocaust, an industrial-scale genocide and ethnic cleansing of millions of people from their homeland. The idea that these people are all secretly saints who are just too scared to act on it is a completely ridiculous belief. They do not do moral things because they are not moral. They are not saints. They simply do not represent those values. You elect a party that openly believes X and then claim they don’t do Y because they’re too scared to do it. No, they don’t do Y because they don’t represent Y; they represent X. Democrats are by no means “soft-willed.” Whenever it comes to something they actually believe in, they are very good at rallying the votes to get it passed, such as when they are passing something in favor of the military-industrial complex or the Israel lobby.


  • Democrats are heartless genocidal freaks, and hardly “spineless”; they just don’t care. It’s a party of billionaires. I have no idea how you can unironically believe this ethos that they’re all a bunch of bleeding hearts who are just too scared, quivering in their boots, to act, but all mean well… apparently! No, they just never fight for those values you want them to fight for because their party does not represent those values, and if you’re still pretending they do at this point… I have a bridge to sell you.




  • Honestly, the random number generation on quantum computers is practically useless. Their output rates will not get anywhere close to those of a pseudorandom number generator, and there are very simple PRNGs you can implement that are blazing fast, far faster than any quantum computer will spit numbers out, and that produce output widely considered in the industry to be cryptographically secure. You can use AES, for example, as a PRNG, and most modern CPUs, such as x86 processors, have a hardware-level AES implementation. This is why modern computers let you encrypt your whole drive: you can have a file that is a terabyte big and encrypted, yet your CPU can decrypt it about as fast as it takes for the window to pop up after you double-click it.
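
    As a concrete illustration of the AES-as-PRNG idea, here is a minimal sketch (my own toy example, not a standard construction) that expands a 32-byte seed into an AES-256 CTR keystream using the third-party Python cryptography package; on CPUs with AES instructions this runs at hardware speed:

    ```python
    # Toy CSPRNG: AES-256 in CTR mode, where encrypting zero bytes just
    # returns the raw keystream. Requires: pip install cryptography
    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    class AesCtrPrng:
        def __init__(self, seed: bytes):
            assert len(seed) == 32                      # 32-byte seed -> AES-256 key
            cipher = Cipher(algorithms.AES(seed), modes.CTR(b"\x00" * 16))
            self._enc = cipher.encryptor()

        def random_bytes(self, n: int) -> bytes:
            return self._enc.update(b"\x00" * n)        # n bytes of keystream

    prng = AesCtrPrng(os.urandom(32))                   # seed from the OS entropy pool
    print(prng.random_bytes(16).hex())                  # 16 fresh pseudorandom bytes
    ```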

    While a PRNG does require an entropy pool, the entropy pool does not need to be large: you can spit out terabytes of cryptographically secure pseudorandom numbers from a fraction of a kilobyte of entropy data. Again, most modern CPUs actually include instructions to grab this entropy data; Intel’s CPUs, for example, have an RDSEED instruction which lets you grab thermal noise from the CPU. To avoid someone discovering a potential exploit in any one source, most modern OSes will mix other sources into this pool as well, like fluctuations in fan voltage.
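
    As a toy sketch of how little seed material is needed (illustrative only; the kernel’s real CRNG is more careful about its sources and reseeding), you can hash a couple of noise sources into 32 bytes and then stretch them with a stream cipher such as ChaCha20:

    ```python
    # Toy entropy mixing and expansion: hash a few noise sources into a
    # 32-byte seed, then generate an arbitrarily long ChaCha20 keystream.
    # Requires: pip install cryptography
    import hashlib
    import os
    import time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

    def gather_seed() -> bytes:
        h = hashlib.sha256()
        h.update(os.urandom(32))                        # OS entropy (RDSEED/RDRAND, interrupts, ...)
        h.update(time.time_ns().to_bytes(8, "little"))  # a dash of timing jitter
        return h.digest()                               # 32 bytes is plenty

    def expand(seed: bytes, n: int) -> bytes:
        enc = Cipher(algorithms.ChaCha20(seed, b"\x00" * 16), mode=None).encryptor()
        return enc.update(b"\x00" * n)                  # n bytes of keystream

    print(expand(gather_seed(), 32).hex())
    ```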

    Indeed, Linux used to provide a separate way to read random numbers directly from the entropy pool and another way to read pseudorandom numbers: /dev/random and /dev/urandom, respectively. If you read from the entropy pool and it ran out, the read would block until more entropy could be collected, which is why with some old Linux programs you would see the program freeze until you did things like move your mouse around.

    But you don’t see this anymore, because generating enormous amounts of cryptographically secure random numbers is so easy with modern algorithms that modern Linux just collects a little bit of entropy at boot, uses that to seed all pseudorandom number generation afterward, and got rid of the need to read the pool directly; /dev/random and /dev/urandom now have the same behavior internally in the OS. Any time your PC needs a random number, it just pulls from the pseudorandom number generator that was configured at boot, and from that short window of entropy collection at boot you can generate sufficient pseudorandom numbers basically forever. These are the numbers used for any cryptographic application you may choose to run.
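
    You can check this yourself on a recent Linux kernel with Python 3.6 or later; the two device files and the getrandom() syscall all draw from the same boot-seeded generator:

    ```python
    # On a modern Linux kernel both device files are fed by the same
    # ChaCha20-based CRNG once it is seeded at boot, so neither blocks.
    import os

    with open("/dev/random", "rb") as f:
        a = f.read(16)
    with open("/dev/urandom", "rb") as f:
        b = f.read(16)

    print(a.hex())
    print(b.hex())
    print(os.getrandom(16).hex())   # the syscall interface to the same generator
    ```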

    The point of all this is just to say that random number generation is genuinely a solved problem; people don’t appreciate just how easy it is to produce practically infinite cryptographically secure pseudorandom numbers. While on paper quantum computers are “more secure” because their random numbers would be truly random, in practice you would literally never notice a difference. If you gave two PhD mathematicians or statisticians the same message, one encrypted using a quantum random number generator and one encrypted using a PRNG built on AES or ChaCha20, and asked them to decipher them, they would not be able to decipher either. In fact, I doubt they would even be able to identify which one was encrypted using the quantum random number generator. A string of random numbers looks just as “random” to any random number test suite whether it came from a QRNG or from a high-quality PRNG (usually called a CSPRNG).
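
    For example, here is a toy version of one such statistical check, the monobit frequency test in the style of NIST SP 800-22; output from a good CSPRNG passes it just as comfortably as output from a hardware or quantum source would:

    ```python
    # Toy monobit frequency test: compares the count of 1 bits against the
    # count of 0 bits. For a good RNG the p-value is roughly uniform on [0, 1].
    import math
    import os

    def monobit_p_value(data: bytes) -> float:
        ones = sum(bin(b).count("1") for b in data)
        n = len(data) * 8
        s_obs = abs(2 * ones - n) / math.sqrt(n)        # normalized excess of ones
        return math.erfc(s_obs / math.sqrt(2))

    print(monobit_p_value(os.urandom(1_000_000)))       # almost always well above 0.01
    ```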

    I do think that at least on paper quantum computers could be a big deal if the engineering challenges can ever be overcome, but quantum cryptography, such as “the quantum internet,” is largely a scam. All the cryptographic aspects of quantum computers are practically the same as, if not worse than, traditional cryptography, with only theoretical benefits that are technically there on paper but that nobody would ever notice in practice.


  • the study that found the universe is not locally real. Things only happen once they are observed

    This is only true if you operate under a very specific and strict criterion of “realism” known as metaphysical realism. Einstein put forward a criterion of what he thought this philosophy implied for a physical theory, and his criterion is sometimes called scientific realism.

    Metaphysical realism is a very complex philosophy. One of its premises is that there exists an “absolute” reality where all objects are made up of properties that are independent of perspective. Everything we perceive is wholly dependent upon perspective, so metaphysical realism claims that what we perceive is not “true” reality but sort of an illusion created by the brain. “True” reality is then treated as the absolute spacetime filled with particles captured in the mathematics of Newton’s theory.

    The reason it relies on this premise is that by assigning objects perspective-invariant properties, they can continue to exist even if no other object is interacting with them, or, more specifically, they continue to exist even if “no one is looking at them.” For example, if you fire a cannonball from point A to point B and you only observe it leaving point A and arriving at point B, Newtonian mechanics allows you to “track” its path between these two points even though you did not observe it.

    The problem is that you cannot do this in quantum mechanics. If you fire a photon from point A to point B, the theory simply disallows you from unambiguously filling in the “gaps” between the two points. People then declare that “realism is dead,” but this is a bit misleading, because it is really only a problem for metaphysical/scientific realism. There are many other kinds of realism in the literature.

    For example, the philosopher Jocelyn Benoist’s contextual realism argues the exact opposite. The mathematical theory is not “true reality” but is instead a description of reality, and a description of reality is not the same as reality. Would a description of the Eiffel Tower substitute for actually seeing it in reality? Of course not; they’re not the same. Contextual realism instead argues that what is real is not the mathematical description but precisely what we perceive. The reason we perceive reality in a way that depends upon perspective is that reality is just relative (or “contextual”). There is no “absolute” reality, only a contextual reality, and that contextual reality is what we perceive directly, as it really is.

    Thus for contextual realism there is no issue with the fact that we cannot “track” things unambiguously, because it has no attachment to treating particles as if they persist as autonomous entities. It is perfectly fine with just treating it as if the particle hops from point A to point B according to some predictable laws, relative to the context which the observer occupies. That is just how objective reality works. Observation isn’t important, and indeed, not even measurement, because whatever you observe in the experimental setting is just what reality is like in that context. The only thing that “arises” is your identification of it.


  • Why did physicists start using the words “real” and “realism”? They are philosophical terms, not physical ones, and they lead to a lot of confusion. “Local” has a clear physical meaning; “realism” gets confusing. I have seen some papers that use “realism” in a way that has a clear physical definition, such as one I came across that defined it in terms of a hidden variable theory. Yet I have also seen a paper coauthored by the great Anton Zeilinger that speaks of “local realism” but very explicitly uses “realism” with its philosophical meaning, that there is an objective reality independent of the observer, and to me it is absurd to pretend that physics in any way calls that into question.

    If you read John Bell’s original paper “On the Einstein Podolsky Rosen Paradox,” he never once uses the term “realism.” The only time I have seen “real” used at all in this early discourse is in the original EPR paper, but this was merely a “criterion” (meaning a necessary but not sufficient condition) for what would constitute a theory that is a complete description of reality. Einstein/Podolsky/Rosen in no way presented this as a definition of “reality” or a kind of “realism.”

    Indeed, even using the term “realism” on its own is ambiguous, as there are many kinds of “realisms” in the literature. The phrase “local realism” on its own is bound to lead to confusion, and it does, because, as I pointed out, even in the published literature physicists do not always use “realism” consistently. If you are going to talk about “realism,” you need to preface it to make clear what kind of realism you are specifically talking about.

    If the reason physicists started to talk about “realism” is that they are specifically referring to something that includes the EPR criterion, then they should call it “EPR realism” or something like that. Just saying “realism” is so ridiculous it is almost as if they are intentionally trying to cause confusion. I don’t really blame anyone who gets confused on this, because, like I said, there is not even consistent usage in the peer-reviewed literature.

    The phrase “observer-dependence” is also very popular in the published literature. So, while I am not disagreeing with you that “observation” is just an interaction, this is actually a rather uncommon position known as relational quantum mechanics.


  • I am saying that assigning ontological reality to something that is by definition beyond observation (not what we observe, and not even possible to observe) is metaphysical. If we explain the experiment using what we observe, then there is no confusion or contradiction, or any ambiguity at all. Indeed, quantum mechanics becomes rather mechanical and boring; all the supposed mysticism disappears.

    It is quite the opposite: the statistical behavior of the electron is decoupled from the individual electron. The individual electron just behaves randomly, in a way that we can only predict statistically and not absolutely. There is no interference pattern at all for a single electron, at least not in the double-slit experiment (the Mach–Zehnder interferometer is arguably a bit more interesting). The interference pattern observed in the double-slit experiment is a weakly emergent behavior of an ensemble of electrons; you need thousands of them to actually see it.
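
    Here is a small numpy sketch of that point, using a made-up fringe pattern rather than real physical parameters: each detection is a single random position drawn from the two-slit distribution, and the fringes only show up once you histogram thousands of them:

    ```python
    # Illustrative only: draw detection positions from a fixed two-slit
    # intensity pattern and watch the fringes emerge in the histogram.
    import numpy as np

    x = np.linspace(-10, 10, 2000)                        # screen coordinate (arbitrary units)
    intensity = np.cos(3 * x) ** 2 * np.exp(-x**2 / 20)   # fringes under a smooth envelope
    p = intensity / intensity.sum()                       # normalize to a probability distribution

    rng = np.random.default_rng()
    one_hit = rng.choice(x, p=p)                          # a single electron: just one dot
    many_hits = rng.choice(x, size=50_000, p=p)           # an ensemble: fringes appear

    counts, _ = np.histogram(many_hits, bins=80)
    print("single detection at x =", round(float(one_hit), 2))
    print("histogram max/min bin counts:", counts.max(), counts.min())
    ```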


  • What is it then? If you say it’s a wave, well, that wave lives in Hilbert space, which is infinite-dimensional, not in spacetime, which is four-dimensional, so what does it mean to say the wave is “going through” the slit if it doesn’t exist in spacetime? Personally, I think all the confusion around QM stems from trying to objectify a probability distribution, which is what people do when they claim it turns into a literal wave.

    To be honest, I think it’s cheating. People are used to physics being continuous, but in quantum mechanics it is discrete. Schrödinger showed that if you take any operator and compute a derivative, you can “fill in the gaps” in between interactions, but this is purely metaphysical. You never see these “in between” gaps. It’s just a nice little mathematical trick and nothing more. Even Schrödinger later abandoned this idea and admitted that trying to fill in the gaps between interactions just leads to confusion, in his book Nature and the Greeks and Science and Humanism.

    What’s even more problematic about this viewpoint is that Schrödinger’s wave equation is the result of a very particular mathematical formalism; it is not actually needed to make correct predictions. Heisenberg had developed what is known as matrix mechanics, whereby you evolve the observables themselves rather than the state vector. Every time there is an interaction, you apply a discrete change to the observables. You always get the right statistical predictions, and yet you don’t need the wave function at all.
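
    A quick numpy sketch of that equivalence for a single qubit (my own toy example, not a derivation): evolving the observable in the Heisenberg picture gives exactly the same expectation value as evolving the state vector and keeping the observable fixed:

    ```python
    # Heisenberg vs. Schrodinger picture for one qubit: identical statistics.
    import numpy as np

    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)   # the observable
    theta = 0.7
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]], dtype=complex)  # a unitary rotation
    psi = np.array([1, 0], dtype=complex)                  # start in |0>

    # Schrodinger picture: evolve the state, keep the observable fixed.
    psi_t = U @ psi
    exp_s = np.real(psi_t.conj() @ sigma_z @ psi_t)

    # Heisenberg picture: evolve the observable, keep the state fixed.
    sigma_z_t = U.conj().T @ sigma_z @ U
    exp_h = np.real(psi.conj() @ sigma_z_t @ psi)

    print(exp_s, exp_h)   # the two expectation values agree
    ```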

    The wave function is purely a result of a particular mathematical formalism, and there is no reason to assign it ontological reality. Even then, if you have ever worked with quantum mechanics, it is quite apparent that the wave function is just a function for picking probability amplitudes out of a state vector, and the state vector is merely a list of, well, probability amplitudes. Quantum mechanics is probabilistic, so we assign things a list of probabilities. Treating a list of probabilities as if it has ontological existence doesn’t even make sense, and it baffles me that doing so is so popular.

    This is why Hilbert space is infinite-dimensional. If I have a single qubit, there are two possible outcomes, 0 and 1. If I have two qubits, there are four possible outcomes, 00, 01, 10, and 11. If I have three qubits, there are eight possible outcomes, 000, 001, 010, 011, 100, 101, 110, and 111. If I assign a probability amplitude to each possible outcome, then the degrees of freedom grow exponentially as I include more qubits in my system. The number of degrees of freedom is unbounded.
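
    A few lines of numpy make the counting explicit (illustrative only): each additional qubit doubles the length of the amplitude vector, so the number of amplitudes grows as 2^n:

    ```python
    # The amplitude vector for n qubits has 2**n entries.
    import numpy as np

    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # a single-qubit state

    state = np.array([1.0 + 0j])
    for n in range(1, 11):
        state = np.kron(state, plus)                      # tack on one more qubit
        print(f"{n} qubit(s): {state.size} amplitudes")   # 2, 4, 8, ..., 1024
    ```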

    This is exactly how Hilbert space works. Interpreting this as a physical infinite-dimensional space that waves really propagate through just makes absolutely no sense!