What if Mormonism has its dualism backwards?

The catch-all forum for general topics and debates. Minimal moderation. Rated PG to PG-13.
User avatar
Gadianton
God
Posts: 5351
Joined: Sun Oct 25, 2020 11:56 pm
Location: Elsewhere

Re: What if Mormonism has its dualism backwards?

Post by Gadianton »

physics guy wrote:So people have tried, for more than 150 years now, to derive thermodynamics as a necessary consequence of the other, microscopic laws of nature. I'm one of those people—this is my main research topic.
Imagine that, like Joseph Smith, I ask a question and get the answer straight from the source.

It's one of those very thorough answers that has so much in it, it's hard to know where to start.
So, in statistical mechanics, entropy is a measure of the number of different ways something could be, while still counting as the same thing
Without realizing it, I advanced a type-identity theory of physics (not necessarily a good one). Imagine a jar of pennies spilled on the floor. My post was saying that only the pennies on the ground really exist, in whatever states they are in, and each state is equally likely. HTHHTH is just as likely as HHHHHH. Any particular arrangement of gas molecules in an enclosure is equally likely. You reply with type-tokenism. The macrostate 6H is less likely (more order) than the macrostate 4H 2T. Ockham vs. Duns S-c-o-t-u-s (thanks Shades): are there only individual things, or are the categories of things also real? Type-token requires categories/macrostates to be as real as microstates.
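The count behind that claim can be spelled out in a few lines. This is just a back-of-envelope sketch with Python's standard library, not anything from the thread:

```python
from math import comb

# Every individual sequence of 6 fair coin flips (a "microstate")
# has the same probability: (1/2)**6.
p_microstate = 0.5 ** 6

# But the "macrostates" (total heads count) lump together
# different numbers of microstates:
ways_6_heads = comb(6, 6)  # HHHHHH: only 1 sequence
ways_4h_2t = comb(6, 2)    # 4 heads, 2 tails: 15 sequences

print(ways_6_heads, ways_4h_2t)   # 1 15
print(ways_4h_2t * p_microstate)  # P(4H 2T macrostate) = 15/64 = 0.234375
```

Each microstate really is equally likely; the macrostate 4H 2T is fifteen times more likely only because fifteen microstates count as it.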

Since the universe is determined, if we could slow it all down and watch each microstate unfold according to the laws of nature as you pointed out, then -- ? That's it. There are no macrostates such that things move from higher order to lower order. God creates the laws and watches them play out. The end. We can't track them playing out as God does, so the second law is real epistemically, as we entertain counterfactual physical possibilities.

How would that affect thermodynamics in its role as sheriff?

I don't know. The second law is baffling: it's about the most intuitive rule in physics, so intuitive that it's hard to imagine why everything hasn't already washed out. How was there ever enough order to create the universe in the first place? It must not govern whatever is beyond the universe, branes, or whatever. Well, magnetism doesn't affect branes either, but intuitively, why would it? Whereas, intuitively, everything, including branes, should be bound by the second law. They shouldn't be able to bang together and never wind down. And yet they do. So as intuitive as it is, demoting it from ontology could bring some peace of mind.
Social distancing has likely already begun to flatten the curve...Continue to research good antivirals and vaccine candidates. Make everyone wear masks. -- J.D. Vance
User avatar
Physics Guy
God
Posts: 1937
Joined: Tue Oct 27, 2020 7:40 am
Location: on the battlefield of life

Re: What if Mormonism has its dualism backwards?

Post by Physics Guy »

Gadianton wrote:
Sun Mar 30, 2025 3:32 am
HTHHTH is just as likely as HHHHHH. Any particular arrangement of gas molecules in an enclosure is equally likely. You reply with type-tokenism. The macrostate 6H is less likely (more order) than the macrostate 4H 2T. Ockham vs. Duns S-c-o-t-u-s (thanks Shades): are there only individual things, or are the categories of things also real? Type-token requires categories/macrostates to be as real as microstates.
The coin analogy can stand pretty well for a lot of the issues in statistical mechanics. I think that the 19th century physicists who developed statistical mechanics may also have had a bit more philosophy behind them than Ockham and the Supreme Court had, in their day, because they didn't seem to be stuck with just the question, "Real or not real?" They could distinguish between a thing being real, and a property being real. If things can belong to different categories, then which category a thing is in can be a real property of the thing, even if we never say that the category itself is a real thing in the way that the thing is real. You can open a green door, and the door is really green, even though you can't make a door out of green.

It's a real property of HHTHHT that it belongs to the set of outcomes with two T's out of six. And if you have 27 million air molecules in a cubic micrometer of air (which you roughly do), the average kinetic energy per molecule in that sample determines the temperature of that small block of air. It's a perfectly real property of the sample of molecules, even if no individual molecule in the sample happens to have exactly the average kinetic energy.
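That average-energy statement is the equipartition relation, ⟨E⟩ = (3/2)kT for translational motion. A quick numerical check, treating air as a classical ideal gas (an approximation):

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def temperature_from_mean_ke(mean_ke_joules):
    """Invert <E> = (3/2) k T for classical translational motion."""
    return 2.0 * mean_ke_joules / (3.0 * k_B)

# At T = 300 K the mean translational kinetic energy per molecule:
mean_ke = 1.5 * k_B * 300.0  # about 6.2e-21 J
print(temperature_from_mean_ke(mean_ke))  # recovers ~300.0 K
```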

Entropy in statistical mechanics is still a bit funny in that it seems to give an awful lot of attention to counterfactuals, even if at the end of the day it disavows them, kind of like a preacher who keeps making off-colour jokes in his sermons. Like, to define the entropy of that micrometer sample of air, you first need to know its temperature. Cool, that's a clearly real property, just the average of all the individual energies. But then for entropy you need to know the size in 6x27-million-dimensional configuration space of all the possible ways for that number of molecules to have that much average energy. The entropy is the logarithm of this 6x27-million-dimensional volume: that's the formula on the gravestone of Ludwig Boltzmann.
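The gravestone formula is S = k log W, with W the multiplicity. Applied to the six-coin toy system rather than a real gas (a sketch, obviously):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(multiplicity):
    """S = k ln W: the formula on Boltzmann's gravestone."""
    return k_B * log(multiplicity)

# Six coins, macrostate = number of heads:
print(boltzmann_entropy(comb(6, 6)))  # W = 1 (all heads): S = 0.0
print(boltzmann_entropy(comb(6, 3)))  # W = 20 (3H 3T): the largest S
```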

The individual sample of air that we actually have is just one point in that volume. All the other points are counterfactual. So now with entropy we've gone a weird step farther, somehow.

The temperature of the air sample was a real property of the sample. That temperature defines a category: all the possible samples that would have the same average energy. Being part of that category is also a real property of our individual sample: d'uh, we just said it does have that temperature. But now we're taking a property of the category itself—its total 6x27-million-dimensional size—and saying that it's a property of the individual sample (namely its entropy, after taking the logarithm).

This is like saying that you drive a Ford, so you're part of the set of all Ford drivers, and so now it's one of your characteristics that there are 43 million people in your car brand group. Well, okay, you might say, you guess that's technically true; but somehow that 43 million figure doesn't really feel as though it's about you. Entropy is like that, but worse, because the many other ways for our air sample to have the same temperature aren't really there, like the other Ford drivers. They're only counterfactual possibilities.

No, it's cool, statistical mechanics insists. We're not saying the other ways of having that temperature are really there, or anything. Oh, heck, no. They're just a mathematical property of having that temperature: the temperature defines a volume in configuration space. The actual air sample, that we really do have, really does have this temperature. So it also really does have all mathematical properties that go with this temperature, including the entropy. See?

It still seems weird. You may not have to believe in any counterfactuals, to define the entropy, but you do have to think about them. You have to think about them enough to count them. How many of them there are—or could have been—seems to matter, for what actually happens. That might not quite make the counterfactuals real, but it's as if they get a little consolation prize for not being real. Kind of spooky.
Since the universe is determined, if we could slow it all down and watch each microstate unfold according to the laws of nature as you pointed out, then -- ? That's it. There are no macrostates such that things move from higher order to lower order. God creates the laws and watches them play out. The end. We can't track them playing out as God does, so the second law is real epistemically, as we entertain counterfactual physical possibilities.
In practice the speed of molecular motion is what makes thermodynamics important. At the other end of the speed spectrum, we can look at the motion of stars, for instance of huge clumps of stars in globular star clusters. The stars may be moving at pretty high speeds, but in proportion to the distances between them, they move really slowly. The kind of dramatic changes in momentum that happen in air within picoseconds may take millions of years for the stars. So we normally follow those individual motions of stars, without any trouble. We're not stuck having to look only at averages, as the only things that don't change blindingly fast. If we do look at averages, though, we find that they line up really well with the expectations of statistical mechanics. So it does seem as though the statistical features of large numbers of particles are a real property of the sets of particles. The only subjective part is that we may care more or less about the statistical features, as opposed to the individual ones, depending on time scales.
The second law is baffling: it's about the most intuitive rule in physics, so intuitive that it's hard to imagine why everything hasn't already washed out.
The Second Law isn't really as intuitive as people make it out to be. In particular there isn't really a good explanation for its so-called "arrow of time". The Second Law of Thermodynamics is just about the only physical law that distinguishes between forward and backward in time. It makes this dramatic difference between them. Forwards means entropy increases; if entropy is decreasing then the film must be running backwards. The microscopic laws, in contrast, just make a one-to-one mapping between past and future. It's just as much one-to-one between future and past.

It would be easy to understand why paper burns into ash, but ash never spontaneously assembles itself into clean sheets of paper, if it were true that there were many more ways of having a pile of ashes, with rising smoke and emitted light, than there were ways of having a clean sheet of paper. This is just not actually true, though. There are exactly as many ways of having light and smoke converge onto ashes, to make the ashes unburn into paper, as there are ways for the same paper to burn into ash. It's a one-to-one mapping.
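That one-to-one claim can be illustrated with a toy model: any deterministic, invertible dynamics is a permutation of microstates, so it can never merge two states into one. The state space and update rule below are made up purely for illustration:

```python
N = 64  # toy microstate space: the integers 0..63

def step(x):
    """A made-up deterministic, invertible update rule."""
    return (5 * x + 3) % N  # gcd(5, 64) = 1, so this is a bijection

# Evolve every microstate one step: no two states land in the same place.
images = {step(x) for x in range(N)}
assert len(images) == N  # still 64 distinct states: one-to-one

# Running "backwards" is just the inverse permutation (5 * 13 = 65 = 1 mod 64):
def step_back(y):
    return (13 * (y - 3)) % N

assert all(step_back(step(x)) == x for x in range(N))
```

Forward and backward are equally one-to-one here; nothing in the microscopic rule picks out a direction of time.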

Many different possible ways of making a heap of ashes, under a smoke plume with light flying away, all seem kind of equivalent to our minds. It seems to us as though you have to get things a lot more precise than that, to have a clean sheet of paper. So it seems intuitive to us, to think that there are more ways of having the ashes than of having the paper, and thus we may be tempted to nod our heads when somebody tells us that this is all it means for entropy to have to increase over time, and isn't it simple. In reality, though, most of the ways of having a heap of ash that we can imagine, that are so much more numerous than the ways of having a sheet of paper, are states that can never actually happen. They can't happen, because there is no way to get to them by burning a sheet of paper. Like, maybe they are the ash heaps that would come from burning a weird fractal spaghetti tangle of fine paper fibers, which nobody will make.

So the Arrow of Time, and the Second Law, are actually more mysterious than a lot of popular accounts would have one believe.

(There's a small caveat here. The nuclear forces do not actually have time reversal symmetry. They do seem to distinguish between forwards and backwards in time. They don't distinguish in a way that has anything obvious to do with entropy; the asymmetrical effects in question are tiny; and they are restricted to sub-nuclear scales. So there's a possibility that nuclear forces are somehow responsible for the fact that I cannot remember tomorrow, but if they are to blame, then there must be a long story about this. It is not at all obvious. For all of electromagnetism, which includes all of chemistry, we should be able to ignore these little subnuclear details. And within electromagnetism and gravity, at least, there is time reversal symmetry.)
How was there ever enough order to create the universe in the first place?
This, I'm afraid, is one of the leading arguments for the universe having been created, at some finite time in the past, rather than always having existed. It is indeed hard to understand why, if there has already been infinite time, we should be so lucky as to still be as far below maximum entropy as we are. There have been speculations, but the ones I know are all along the multiverse line, and so they're pretty badly anti-Ockhamic in their own way, postulating infinitely many copies of everything that we can never observe.

Even if one buys that time itself is only fourteen billion years old, there needs to be a story about why we haven't already hit heat death. The story is stars. On Earth, usually, the Second Law's requirement for entropy increase means that heat only spreads out, and cannot spontaneously concentrate. If it did, we'd just wait for that to happen, and then run heat engines from the hot spots. In systems like interstellar gas clouds, however, where gravity is the dominant force, it turns out that entropy increases when heat concentrates, not the other way around. The term of art is, "Self-gravitating systems have negative specific heat." This lets stars form spontaneously. And then once you have stars, you have a lot of lovely temperature gradients, in which all kinds of greeblies can grow.
I was a teenager before it was cool.

Post by Gadianton »

I've had a few conversations with DeepSeek on most of the topics covered ^^^ in order to save you some effort, as I try to find a way out of the corner.

Your AAAA comm signal is striking because it shows that order isn't very interesting. God as a being of high order is simultaneously a being of low bandwidth. The "second law" (...or is it the second scam?) is self-defeating. White sand pieces and black sand pieces mixed together are a high-entropy state until the colors erode over time and suddenly it's low entropy again. The universe starts out high order, evolves into higher entropy, and the higher entropy is also higher bandwidth, where people evolve to have this discussion. The arrow of time increases up until the point of maximum entropy/bandwidth (long past people), but ultimately everything rips apart and black holes evaporate. At a certain point, the arrow of time reverses if the scorecard is signal compressibility. This isn't a state of reality that is "intelligible" to humans, such as ashes turning back into paper. It just acknowledges that redundancies appear, and what is ultimately left is perfect uniformity that becomes perfectly compressible, a string of AAA's like at the big bang. I think this is Penrose's CCC universe sans the geometry (that guy really likes shapes). Not saying "rebirth" happens, just saying the arrow of time is self-defeating. Beings caught in the middle are indifferent to which direction the arrow of time goes, while physics still works by the same micro laws of radioactive decay.

The problem for the sheriff before time begins is that if we take all possible universes and then apply compression, there is only one real possibility, and that is the "All A" universe, which includes the inflation field, which seems to indicate the only possibility is what we ended up with.
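The compressibility scorecard can at least be sanity-checked: a uniform all-A signal compresses to almost nothing, while random noise is essentially incompressible. This sketch uses zlib as a crude stand-in for true algorithmic (Kolmogorov) complexity, which it is not:

```python
import os
import zlib

n = 100_000
uniform = b"A" * n       # the "all A" universe: perfectly redundant
noise = os.urandom(n)    # high-entropy gibberish: nothing to exploit

print(len(zlib.compress(uniform, 9)))  # a few hundred bytes at most
print(len(zlib.compress(noise, 9)))    # roughly n bytes: incompressible
```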

Post by Physics Guy »

Gadianton wrote:
Tue Apr 08, 2025 3:46 pm
White sand pieces and black sand pieces mixed together are a high-entropy state until the colors erode over time and suddenly it's low entropy again.
There is a difference between mixing two kinds of sand, and having a string of zeroes or ones, but in neither case will total entropy decrease. In the case of the string of bits, the ordering of the symbols carries no information or entropy: they are by definition simply one after another, with bit number 17 bound to follow bit number 16 no matter what, with only that one possibility. In the case of sand, where each grain sits is going to be one possibility out of many, so there is a lot of entropy in the positioning of all the sand grains, and not just in their colours.

To start with the bit string, where the order is fixed but each bit has two possibilities, we can still realistically consider that the logical distinction between zero and one must be represented by some physical characteristic of something, like the color of sand grains. Realistically, furthermore, whatever that physical feature is, it may degrade over time, until the bits become indistinguishable after all. So this would allow a high-entropy string of random zeros and ones to degrade into a zero-entropy string of all zeros.
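The degradation from a high-entropy string to a zero-entropy one can be made quantitative with Shannon's per-symbol entropy, H = -p log2 p - (1-p) log2(1-p), estimated here from bit frequencies (a sketch that treats the bits as independent):

```python
from math import log2

def entropy_per_bit(bits):
    """Shannon entropy per symbol of a 0/1 string, in bits."""
    p = bits.count("1") / len(bits)
    if p in (0.0, 1.0):
        return 0.0  # only one possibility: no entropy
    return -p * log2(p) - (1 - p) * log2(1 - p)

fresh = "0110100110010110" * 100  # balanced zeros and ones
degraded = "0" * 1600            # everything bleached to zero

print(entropy_per_bit(fresh))     # 1.0 bit per symbol
print(entropy_per_bit(degraded))  # 0.0
```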

The only reason that it's reasonable for things to degrade into indistinguishability, however, is that there are lots of spontaneous processes in the world that do that kind of thing: rust, bleaching, whatever. All those processes happen spontaneously, being bound to happen by default eventually, precisely because they all increase entropy. If the black bits fade into white because UV light bleaches them, there are a lot of reflected photons flying away, that would have been different if they had hit white bits instead of black bits. The entropy of the bits is transferred to the light, probably with some extra. So the entropy of the row of grains might go down, but the total entropy of everything including the light only ever goes up. And the same principle applies no matter what process it is that renders the black and white bits indistinguishable.

In fact that point is important for computing: it's the basis of Landauer's Principle, which says that resetting a bit to zero, in a way that makes it zero regardless of whether it was zero or one, must dissipate at least kT ln 2 joules of heat, where k is the Boltzmann constant and T is the ambient temperature. Landauer's Principle is still a Principle, and not a Law or a Theorem, because nobody has yet fully connected the dots between thermodynamics and information. But it's a widely accepted Principle, and it sets an ultimate limit on the thermal efficiency of computing.
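The bound itself is easy to evaluate; at room temperature kT ln 2 works out to a few zeptojoules per erased bit:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum heat (J) dissipated per bit erased: k T ln 2."""
    return k_B * temperature_kelvin * log(2)

print(landauer_limit(300.0))  # about 2.87e-21 J per bit at 300 K
```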

So much for bit strings. For piles of sand, the locations of the sand grains are in principle variable. They could have been rearranged in lots of ways and still count as "a heap of sand". So there is a lot of entropy in a sand heap, regardless of the colorations of the grains. Just how much entropy the heap has depends on how distinguishable the grains are from each other: not just how well an observer can distinguish them in practice, but how different they are from each other, logically. This issue comes up in the Gibbs Paradox. The resolution to the paradox is just to be consistent about how distinguishable you are considering the particles to be.

For instance, if somebody puts up a sign with painted letters, then two of their letters can be switched on the sign, and that's an actual change. In an abstract word just as a word, some changes are also real changes: "loot" and "tool" are two different words. But in the word "loot", switching the two o's does not make a new word at all. It is not a change. There are not two distinct possible forms of the word "loot" differing only by which "o" is which, even though which painted "o" you put where does distinguish two distinct ways to put up a sign reading "loot". Two painted "o"s in a box of letters are logically distinct, no matter how physically similar they may be, but the two letters "o" in the word "loot" itself are logically indistinguishable. The implications of distinguishability for how many distinct possibilities there are mean that distinguishability must have effects upon entropy. So if we find a bunch of very similar-looking things, we should be able to check whether they are merely similar or actually logically indistinguishable, by measuring their entropy.
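The "loot" count can be done directly: 4! = 24 arrangements of painted letters, but only 4!/2! = 12 distinct words once the two o's are treated as logically indistinguishable:

```python
from itertools import permutations

# Two painted "o"s are distinct tokens: every ordering is a different sign.
painted = list(permutations("loot"))        # 4! = 24 ordered arrangements

# The word itself doesn't care which "o" is which: duplicates collapse.
distinct_words = set(permutations("loot"))  # 4!/2! = 12 distinct strings

print(len(painted), len(distinct_words))    # 24 12
```

The factor of 2! that disappears here is the same kind of factor as the N! in the Gibbs paradox resolution.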

And we can. Atoms of the same isotope are logically indistinguishable, not merely hard to distinguish, and we know this because a thermodynamical consequence of logical indistinguishability of atoms in a gas is the possibility of Bose-Einstein condensation, which has been observed. It got the Nobel prize in 2001, and Bose-Einstein condensates are now produced routinely in labs around the world.
The arrow of time increases up until the point of maximum entropy/bandwidth (long past people), but ultimately everything rips apart and black holes evaporate.
Black hole evaporation—if it really happens as Hawking proposed—increases entropy still further. A lot of thermal radiation gets released.

The arrow of time should never reverse. It should just slow down more and more forever, as maximum entropy is approached asymptotically, with less and less going on in the way of persistent patterns. Statistically uniform randomness is not the same as uniformity, microscopically. It just looks the same on average over larger scales.

Post by Gadianton »

I've reviewed this material and it's a bit embarrassing to be making these suggestions; I'm sure I've calculated entropy a thousand times in chemistry, but I remember nothing about it. Reviewing for missteps, I was banking on the atomic indistinguishability you mention allowing for cheating, somehow (the part I was working on); also, pennies and cards are good analogies as long as they're applied appropriately. Even in the case of a deck of cards, defacing two cards obviously takes work, so even if the entropy of the deck is reduced, it rises for the system. But the deck is not analogous to the gas in the box unless you toss the deck of cards on the ground and consider positions too. For the deck, the possibilities are 52!, divided by 52! if all the cards are defaced. But the number of positions when thrown on the floor is ridiculous: r / 52! if the cards are defaced. Even your own guy, Boltzmann, apparently missed this at first when calculating entropy for gases (not dividing by n!). The question about subjectivity in entropy appears resolved, and isn't what I thought I could get out of it, although I have to say it led to some off-the-rails conversations among people a lot smarter than I am.

The first battle is lost, but we've still got a problem with a universe hanging around for infinity at max entropy. Just tell me one thing. In some layperson explanations of quantum weirdness, a tennis ball tunnels across the galaxy. In others, I believe, a tennis ball spontaneously appears. With a bunch of dead matter hanging around indefinitely, a tennis ball must appear. A huge one, in fact, will appear. Now is that quantum effect a brief fluke, after which the tennis ball dissolves back into Brahma, or does it stick around? If it disappears, then isn't that inconsistent with the tennis ball that tunnels across the universe getting to stay across the universe? If the tennis ball sticks around, well, the end of time is going to be one hell of a party with all the stuff popping in and sticking around. But the tennis ball could perpetually reappear a billion times in the same place, and it will do so.

But what if the tennis ball is plutonium instead? Aha! A plutonium ball beyond critical mass: does it explode? If the answer is no, because it pops out faster than the chain reaction takes to get going, what if the plutonium ball reappears enough times in a row to get the chain reaction started? Now if it were a tennis ball reappearing repeatedly in the same place, it would stay bright and green, because it's just the illusion of the same tennis ball, like watching a movie; but if it's a ball of plutonium, by atomic indistinguishability it's the same ball, and it just has to reappear enough times for the chain reaction to kick in.

Post by Physics Guy »

However it works with the plutonium tennis ball, it's not going to be the same ball with its internal clock stopping when it disappears and then continuing from that same point when it reappears, until it accumulates enough total hours of part-time existence to qualify for explosion. Quantum fluctuations are not add-ons to classical causality that have to play by its rules. They are part of how causality really works.

The exploding tennis ball is just another possible state that might appear. So is the aftermath of the explosion, and in fact there are probably a lot more aftermath states (or states that are just like them except in tiny details) than exploding or pre-exploding states, so it's probably a lot more likely to see something just like a late stage after a nuclear explosion (which is just a bit more thinly-spread warmth) than to see an explosion itself.

In any case I'm afraid it's not worth dwelling much on questions like this, because (a) it would be quite a lot of work to figure out exactly how small the chance is that quantum theory assigns to this or that wild event, but (b) it's clear that the unimaginably small chance will be far smaller than the fairly significant chance that current quantum theory is somehow wrong about extreme cases like these. Even a really good darts player doesn't bet on being able to make triple-twenty a million times in a row.

It may be worth dwelling a bit on a toned-down version of the question, where we ask whether a single hydrogen atom might appear spontaneously as a vacuum fluctuation. This is close enough to things that we do know that it's not just a wild shot in the dark, and it can highlight one of the biggest but subtlest concepts in quantum theory.

If you read something about electrons or photons popping in and out of existence, then the author has quite likely (though probably unwittingly) committed some terminological bait-and-switch, because there's an important ambiguity in terms like "electron" and "photon". Do we mean actual electrons and photons? Or do we mean non-interacting electrons or photons, which are not real particles but just concepts that we use as basic vocabulary to formulate our theory?

We don't have a clear theory for actual particles. We can write something down, but if you ask too many questions about what it means, we have to back off, and say, Okay, let's start from something simpler: what if everything was just neutral? What if there were electrons, and photons, but the electrons couldn't ever generate or absorb photons, or interact with each other in any way? Well, in that scenario, sure, here would be the right quantum field theory, and here's exactly what everything would mean, and everything that could happen (not much).

When we do this, we don't actually set out to make a theory of particles, at all. We have these quantum field operators, and after some algebra we discover, Whoa, the energy and momentum of these fields are quantised, and it would be a consistent way of labelling all the possible states of the fields to think of each state as containing some numbers of particles with various velocities. And that's all particles are. The whole notion of what particles are, and what they mean, is an afterthought, not an axiom.

Already in this wildly dumbed down, non-interacting quantum field theory, there are some surprising conclusions. Like, electrons are supposed to be these little hard points, right? In the dumbed-down world, their electric charge doesn't do anything, but they're point charges, right? Well, yes and no. There are point charges, but these are not the electrons, just like there are electric fields, but these aren't the photons. The photons and electrons are certain patterns and correlations in the charges and fields. Even if there are no electrons or photons at all, anywhere, there are point charges and fields, just in vacuum. They just aren't correlated together in the right ways to count as particles.

Once that's clear—so, in my course that starts next week, about three weeks from now, hopefully—we say, Now, what if these different quantum fields were to interact with each other infinitesimally? The interactions have to obey certain rules, like relativity, so there's stuff to work out about exactly what they can be—but they have to be weak. Then we can figure out what the theory means iteratively. For any given question about what would happen if ..., we start with the answer the non-interacting theory would give, and then patch it with corrections that take the interactions into account. Then we patch the corrections with corrections, and so on. Fortunately for us, the most important interaction in nature (electromagnetism) really is weak. Every time we crank up to the next set of corrections-to-corrections, we're looking two decimal places further to the right, quantitatively. So we can normally stop after just one or two rounds, and have an answer that's accurate enough.
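The two-decimal-places remark comes from the smallness of the coupling. Here's a toy series in powers of α ≈ 1/137 showing each correction shrinking by a factor of a hundred-odd; the observable and its expansion are invented for illustration, not a real QED calculation:

```python
alpha = 1 / 137.036  # fine-structure constant, approximately

# A made-up observable with a perturbative expansion f = 1 + a + a**2 + ...
terms = [alpha ** n for n in range(5)]
partial_sums = [sum(terms[: n + 1]) for n in range(5)]

for order, (t, s) in enumerate(zip(terms, partial_sums)):
    print(f"order {order}: correction {t:.3e}, running total {s:.10f}")
# Each successive correction is ~137x smaller: roughly two more
# correct decimal places per round.
```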

You can get pretty good at doing that without even noticing the big idea, but eventually you will have to notice it, because you'll hit supposedly tiny corrections that seem to turn out to be not tiny but infinite. Like, the correction is this small number 1/137 (it really is almost exactly that) times this integral that we just have to compute, hang on a sec, OMG, the integral is infinity! And you'll go WTF and have to sit back and think.

Then maybe, especially if you're Feynman, the big idea will hit you. Now that we have interactions, even though they are weak, we have to re-do the analysis that made us identify particles within quantum field theory. What even are the particles, now? They're still correlations of some kinds among point charges and fields, but exactly what correlations? This is a lot harder now, with the interactions involved, but you can do it. It's just more work than you might have expected, figuring out what things even are, before anything even happens.

So now somebody asks something like, "Hey, two electrons walk into a photon, what happens?" You go, "crap, you mean actual electrons? Not non-interacting electrons? Aw, crap." And you spend a bunch of time figuring out, with corrections patched onto corrections, what the question even means. And then, dammit, the person probably wants to hear the answer in terms of what actual electrons and photons are doing, not the fictitious non-interacting ones. So even once you know the answer you're going to have to spend a bunch of time figuring out what it really means, in terms of actual particles.

So you finally give your answer and wind down with something like, "And there's a 0.001% chance that the final state will include a positron." And your questioner says, "Cool! I can see a positron!" And you have to go, "Whoa! You want to know what you see?" Because that's a whole other question. Detecting a fundamental particle is a physical process. Just because something is there doesn't mean you can just see it. What's your detector? How does it work? It's going to miss some positrons that are there, and it may sometimes give false positives. So once again you're back trying to figure out exactly what the question really means, and what its answers will mean. That's the bummer about fundamental physics. You're never off the hook for anything. You have to pin everything down.

So, to get back to the maximum-entropy end of the universe. There may be a quantum state that looks, in some ways, and as described in some books, as though there is non-zero amplitude for atoms and tennis balls and bombs and whatever to appear suddenly, anywhere. There's a pretty good chance, though, that if you frame the question carefully, and interpret the answer properly, then the answer just turns out to be, "Nothing happens. We just expressed `nothing' awkwardly."

When I say "pretty good chance" here, I don't mean the chance that is defined by quantum theory, because I haven't computed anything about this, because it would be hard. I'm only guessing that this is probably what quantum theory would say, if we worked hard on this question. Working out the chance for a single hydrogen atom to appear as a fluctuation in the standard model vacuum, in the objective sense that something could happen which could count as detecting the atom, would be a tough calculation. It might be a publishable paper, depending on how defensible your idea about "count as detecting" was. It might also be a publishable paper to come to a defensible conclusion about whether the asymptotic future of a maximum-entropy expanded universe would locally look like the standard-model vacuum, or not. I'd bet a beer that it would, but I've lost more beer bets than I've won on physics questions like this.