[edit: I did not mean to sound exclusive; anyone can chime in of course!]
I could not resist. I decided to pull out some Victor J. Stenger and other scientists to weigh in on fine-tuning and the origin of the cosmos. Once I get going on a topic, I cannot stop. Ideas are like drugs, leaving me wanting more depth and breadth.
Physicists don’t seem to be in agreement on how to interpret the appearance of fine-tuning. I do not want this to be another piece that confirms someone’s atheism. I want it to be an evenhanded assessment of the evidence.
So the purpose of this post is to get qualified others to comment on these points before I write a follow-up post.
- According to Stenger, examples of phenomena without a cause are radioactive decay and photon emission. These phenomena are modeled by statistical mechanics and are not predetermined but rather random. They have “no evident cause” and do what they do spontaneously. Some, however, call this probabilistic causality. For photon emission, if an electron is in a higher energy state and moves to a lower state, then a photon is emitted. That looks predetermined to me. There are plenty of phenomena modeled as stochastic processes that still have causes. A coin toss, for example, follows a Bernoulli distribution, but it still has a cause. Can atheists say that these two processes, namely radioactive decay and photon emission, are not deterministic? I get the feeling that this is an unsettled issue, as debate rages over whether the universe is deterministic or indeterministic.
- Is the following true in regard to the topic of the appearance of fine-tuning? “Constants, such as the speed of light and Planck’s constant, are irrelevant since these are arbitrary units that define the system of units being used. Only ‘dimensionless’ numbers that do not depend on units, such as the ratio of the strengths of gravity and electromagnetism, are meaningful.” Furthermore, these constants are not independent of one another, so if we change one, then we would have to change another. But constants are inseparable properties of the relationships they appear in; if they are not there, the relationship doesn’t hold true. How can Stenger claim that they are “irrelevant”?
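To make the quoted claim concrete, the gravity-versus-electromagnetism ratio really is a pure number: the separation distance cancels, and the result is the same in any unit system. A minimal Python sketch, using rounded CODATA values (the choice of a proton and an electron as the test pair is my own illustration):

```python
# Dimensionless ratio of electromagnetic to gravitational attraction
# between a proton and an electron. The distance r appears in both
# forces and cancels, so the answer has no units.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
k_e = 8.988e9        # Coulomb constant, N m^2 C^-2
e = 1.602e-19        # elementary charge, C
m_e = 9.109e-31      # electron mass, kg
m_p = 1.673e-27      # proton mass, kg

ratio = (k_e * e**2) / (G * m_e * m_p)  # F_em / F_grav at any separation
print(f"{ratio:.3e}")  # roughly 2.3e39
```

Change the unit system and every input changes, but the ratio comes out the same; that is the sense in which only dimensionless numbers are candidates for fine-tuning.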
Thanks!
Musings
JM says
The speed of light is not irrelevant. That we use the speed of light as a reference for defining speed and distance doesn’t matter. In the past we used a reference bar to define distance and measured the distance traveled by light in a period of time to find its speed.
The important questions about the fundamental constants are whether they can be different and how many constants there are. If a constant can’t have a different value, then fine-tuning doesn’t exist. If it can only take a small set of values, then there doesn’t need to be much tuning. If the constants are not independent, and changes to one inherently change others or define the range of others, then there is less real tuning. The other issue is how many constants actually exist. If we were talking about constants before Maxwell, we might have listed electricity and magnetism as separate constants. It may be the case that all of the forces are expressions of different parts of one ultimate force. This would reduce the number of constants that could be changed.
All arguments about fundamental constants run into an insurmountable problem given our knowledge of the universe. We don’t know how many constants there really are or what their valid ranges are. Some of the constants could be nothing more than mathematical illusions: we don’t know how things really work and have fudged up a constant to make the formula roughly match reality. It is fundamentally like trying to compute the chance of drawing an ace from a deck of cards when nobody will tell you how many aces are in the deck or how many cards the deck holds.
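The card analogy can be made literal: the probability is only defined once both counts are known. A toy sketch (the function name is my own, purely for illustration):

```python
from fractions import Fraction

# P(draw an ace) requires knowing BOTH the number of aces and the deck
# size. With either unknown, no probability can be assigned -- which is
# the commenter's point about the constants.
def p_ace(aces, deck_size):
    return Fraction(aces, deck_size)

print(p_ace(4, 52))   # standard deck: 1/13
# p_ace(?, ?) -- undefined; this is the fine-tuning situation
```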
Rob Grigjanis says
The puzzling thing about Stenger’s claim about ‘uncaused’ radioactive decay and photon emission is why he picked those particular phenomena. Pretty much the same ‘argument’ applies to anything in quantum physics.
Beta decay is caused by the weak interaction. Alpha decay is caused by quantum tunnelling. Photon emission (the name ‘spontaneous emission’ is an unfortunate holdover from the days before quantum field theory) is caused by the ground state of the electromagnetic field interacting with the atom. Lifetimes or decay rates are all calculable, based on known mechanisms, so calling any of these things ‘uncaused’ is just bullshit.
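The point that individual decays are random while the rate is a fixed, calculable property can be sketched numerically. Assuming a hypothetical decay constant, sampled decay times recover the mean lifetime 1/λ:

```python
import random

# Each decay time is random (exponentially distributed), but the
# ensemble behavior is fully determined by the decay constant lam.
random.seed(0)
lam = 0.1  # hypothetical decay constant, per second
times = [random.expovariate(lam) for _ in range(100_000)]
mean_lifetime = sum(times) / len(times)
print(round(mean_lifetime, 2))  # close to 1/lam = 10.0
```

No sample tells you when any one nucleus decays, yet the statistics are as lawful as anything in classical physics, which is Rob’s point about ‘uncaused’ being the wrong word.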
Yeah, answers in quantum physics are probabilistic. That has never meant ‘uncaused’, until some twit (maybe Stenger?) decided atheists needed something to counter Kalam or some such shit.
Beats me.
Rob Grigjanis says
Bit of a can of worms there, because people use the same word, ‘deterministic’, for scenarios which are radically different.
Kirknoodle says
I suspect he meant that the relationships between the constants, not the constants themselves, are what matter. Many of these constants can simply be set to 1, and which one you choose to set is arbitrary; what actually matters is the ratio between that constant and another, which is no longer 1 once the first is fixed.
But it is very strange and unclear phrasing, not good communication at all imo.
StonedRanger says
I find it odd that this is even a thing. Atheism doesn’t say anything about any of this. Atheism is a response to a single claim, and that claim is that some gods exist. There might be some scientists who are atheists who might have something to say about this. If someone thinks the universe is intelligently designed, I’d like to know how they figured that out when they have exactly one universe to examine. Please, can some ID proponent tell me what a non-designed universe looks like and how they were able to determine that?
Siggy says
Partially in response to Rob Grigjanis @2,
Yeah, I think beta decay and photon emission are just stand-ins for any quantum process. These are, I suppose, the quantum processes that Stenger thought his audience would be most familiar with.
There’s more than one way to talk about causes. The cause of a pencil falling down is the laws of gravity. But another more specific cause is the earth. The first kind of cause is the rules of the system, while the second kind of cause is a set of events in the pencil’s past light cone.
In the context of the cosmological argument, which kind of cause is the kind that matters? I feel like cosmological arguments waffle a bit between the two kinds of causes. We’re supposed to imagine following a chain of causes backwards into time, and either the chain goes on indefinitely, or it stops at a single uncaused cause. But if we’re to imagine God as the rules of the system rather than a specific event in the past, then why wouldn’t the laws of physics equally suffice?
Siggy says
Quantum processes are non-deterministic in the sense that there isn’t any particular event that caused the light to be emitted at one moment instead of some other moment. However, it does depend on your quantum interpretation. Hidden variable interpretations such as Bohm would say that it is deterministic, but based on information in the system that you could never have known. In the many worlds interpretation, the system is deterministic, but it is subjectively non-deterministic because there is no way of knowing which branch of the many worlds you will end up observing.
IMHO, for the sake of the cosmological argument, it’s not actually important which interpretation is “correct”. The important thing is that the conventional non-deterministic interpretation is at least conceivably correct. If a cosmological argument is based on asserting that uncaused events don’t make sense, the answer is: no, they actually do make sense, and they’re a core part of a perfectly reasonable interpretation of physics.
With regard to constants of physics, many of these are not really universal; they’re just telling us about the size of our units, the ones that we defined ourselves. The truly universal constants of physics are unitless. I think there are about 25 of them or so.
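The classic example of a unitless constant is the fine-structure constant: it is built out of dimensional constants, but the units cancel, leaving a pure number near 1/137 that is the same in every unit system. A quick sketch with CODATA SI values:

```python
import math

# Fine-structure constant alpha = e^2 / (4*pi*eps0*hbar*c).
# Coulombs, joules, meters, and seconds all cancel; the result is
# a pure number, independent of the unit system.
e = 1.602176634e-19       # elementary charge, C
hbar = 1.054571817e-34    # reduced Planck constant, J s
c = 2.99792458e8          # speed of light, m/s
eps0 = 8.8541878128e-12   # vacuum permittivity, F/m

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"1/alpha = {1/alpha:.3f}")  # about 137.036
```

This is the sense in which the speed of light by itself is “irrelevant” to fine-tuning arguments while alpha is not: changing c alone just rescales our meters and seconds, but changing alpha would change physics.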
Rob Grigjanis says
Siggy @7:
Sure, and in the double-slit experiment there isn’t a particular event that causes a particular detector to light up. But in all cases, there is still a causal chain of events. The implication that there is a “missing event” which renders the outcome in any sense “uncaused” is not, IMO, justified.