In January 2015, Marko Vojinovic wrote a two-part attack on reductionism over at *Scientia Salon* (Part I, Part II). Based on his reasoning, I’d like to offer a new definition of strong emergence as “holistic physics”. (Well, perhaps not that new; regardless…)

The idea is that any full description of the underlying-level dynamics must either refer to the emergent concept (strong emergence) or refer to concepts that it reduces to (weak emergence).

Let’s consider a physical system. It is described at some level of description by a certain physical theory, let’s call it the *effective theory*. There is also a more detailed description, let’s call it the *underlying theory*, so that when the details of these underlying dynamics are summarized in a certain manner you get the effective theory. For example, the behavior of gas in a canister might be described by the ideal gas law (the effective theory), while this equation in turn can be derived from the equations of Newtonian mechanics (the underlying theory) that apply to each molecule.

For now, let’s assume both the effective and underlying theories *work* – that they are not in error. We’ll address errors in a moment.

If the underlying theory is *mechanical* in the sense that it only discusses small parts interacting with other small parts (such as molecules interacting with other molecules), then we can say we have *weak emergence*: the “higher-level” behavior of the effective theory is reducible to the “lower-level” behavior of the parts. For example, we can define “pressure” as a concept in the effective theory – as a certain statistical property of the velocities and masses of the gas molecules. If the movement of the molecules can be described by an underlying mechanical theory, one that only takes into account the interactions of molecules with each other, then we can calculate everything in the underlying theory and then “summarize” the result in the right way to see what it means in terms of “pressure”. In this sense, talk of “pressure” has been reduced to talk of molecules.
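As a toy illustration of this kind of reduction (my own hedged sketch – the particle count, gas, temperature, and volume below are illustrative assumptions, not part of Marko’s argument): in kinetic theory, “pressure” is just a statistical summary of molecular masses and velocities, P = N·m·⟨vx²⟩/V, and this summary recovers the ideal-gas law P = N·kB·T/V of the effective theory.

```python
import math
import random

# Toy check that "pressure" reduces to a statistical summary of molecule
# velocities. All numbers (particle count, molecule mass, temperature,
# volume) are illustrative, not drawn from the article.
k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.63e-26         # mass of one N2-like molecule, kg
T = 300.0            # temperature, K
N = 200_000          # number of simulated molecules
V = 1e-3             # volume, m^3

random.seed(0)
sigma = math.sqrt(k_B * T / m)  # std dev of each velocity component

# "Underlying theory": just a list of per-molecule velocity components.
vx = [random.gauss(0.0, sigma) for _ in range(N)]

# "Effective theory" concept: pressure as the statistical summary
# P = N * m * <vx^2> / V  (the kinetic-theory result for an ideal gas).
mean_vx2 = sum(v * v for v in vx) / N
P_kinetic = N * m * mean_vx2 / V

# Compare with the ideal-gas law P = N * k_B * T / V.
P_ideal = N * k_B * T / V
print(P_kinetic, P_ideal)  # should agree to within sampling noise
```

The summary (the average of vx²) is computed entirely from the lower-level description; “pressure” never appears in the dynamics itself – which is exactly what makes this weak emergence.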

If, however, the underlying theory is *holistic* in the sense that the small parts it talks about also interact with parts that are summaries of the small parts, i.e. with the concepts that the effective theory talks about, then we can say that we have *strong emergence*. For example, if the interaction of molecules in the underlying theory also refers to pressure (instead of just to other molecules), then pressure acts as a strongly emergent property – you cannot reduce talk about it to “lower levels”, since the lower level already includes talking about it.

# In the Real World

All indications are that physics is multiply mechanical – it is mechanical at various levels, not just the fundamental one. In other words, there is only weak emergence, but there is weak emergence at many levels: nuclei emerge from quarks; atoms emerge from nuclei and electrons; solids from atoms; and so on. In our investigations, we have never established a holistic scientific theory – a theory whose underlying level refers to higher-level entities. And we have, on numerous occasions, seen reductive success: we were able to calculate, from underlying theories, aspects of effective theories or even entire effective theories.

Now, in his original piece Marko argued for strong emergence by shifting the burden of proof to those disputing it. But a mechanical theory is simpler (as he seems to agree), and therefore more likely a priori; and reduction is empirically successful, and therefore more likely a posteriori. (Reductionism has shown empirical success by deriving higher-level theories or aspects thereof, and by consistently finding that the underlying theories are mechanical.) Thus, “weak emergence” is well established, and the burden of proof is now firmly on those wishing to overthrow this well-established theory.

# A Note On Errors

Why ignore errors? Because they are not philosophically interesting. If the effective theory is correct but the underlying one is wrong, then all we have here is a mistaken underlying theory. If the small parts it talks about do exist, a correct description of their dynamics can *always* be given, and constitutes the correct underlying theory (which, however, need not be mechanical!). If the small parts it talks of don’t actually exist, then either some others exist and we’ll settle for them, or else no small parts exist, in which case we can just call this “effective” theory the *fundamental* theory – a theory that has no underlying theory.

If the underlying theory is correct but the effective theory is wrong, then we have just miscalculated what the sums over the underlying theory say. It’s also possible we wrongly identified the summaries with concepts taken from other domains (e.g. that “pressure” as defined statistically is not what our pressure-gauge measures), but again this is not a very interesting question as all we need to do is to define properly what these new concepts are in order to see what the underlying theory says about them.

And finally, if both underlying theory and effective theory are wrong then we just have a mess of errors from which nothing much can be gleaned.

In all cases, the errors have nothing to do with emergence. Emergence relates to how things do behave, not to how things don’t behave.

# A Note on the Original Argument

In Part I, Marko attacked reductionism with three examples. First, he noted that the Standard Model of cosmology cannot possibly be reduced to the Standard Model of particle physics, because the latter does not include any dark matter while the former does. While correct, this simply indicates that one model is mistaken: the reason that the Standard Model of particle physics does not yield the Standard Model of cosmology is that the Standard Model of particle physics is *wrong!* That is not an indication that the actual dynamics of the particles are determined by higher-level concepts, such as (for example) whether or not they are near a sun. One cannot conclude from an error in the model that the correct model will show strong emergence.

As his second example, Marko noted that the Standard Model of elementary particles *with massless neutrinos* fails to correspond to the standard model of the sun. While true, this merely indicates a failure of the Standard Model, which has since been corrected (neutrinos apparently have mass!). It has nothing to do with emergence, which is all about correct theories. The failure of the zero-mass Standard Model did indeed indicate that the effective sun-model did not reduce to it, but it did so in a philosophically boring way: it said nothing about whether the sun model reduces to the corrected Standard Model, or, more generally, about whether the sun model reduces to *any* underlying theory.

His third example is more interesting, in that he complains that one cannot explain the direction of time by appeal to the dynamical laws alone; one needs to make another assumption – one setting the initial conditions. That’s not an issue of errors, at least. But again, his true statement has no implication for emergence. The initial conditions are set at the underlying level, at the level of each and every particle. This state at the underlying level then leads to a certain phenomenon at the higher-level description, which we call the directionality of time (e.g. the increase of entropy with time). But that’s just standard, weak, emergence. There is no indication that the dynamics of the particles refers to the arrow of time – the dynamics is always mechanistic, referring only to the particle-level description. Thus, not only is there no strong emergence here, but there is an explicit case of weak emergence. Just as the sun (supposedly) emerges from *a particular initial condition* (a stellar gas cloud) in the corrected particle Standard Model, and thus the sun-model is reduced to the Standard Model, so too does the arrow of time *demonstrably* emerge from *a particular initial condition*, and thus the arrow of time actually *is* reduced to the dynamical laws. It’s one example, out of many, of successful reduction.
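To make the “weak emergence from a special initial condition” point concrete, here is a hedged toy sketch (my own illustration; the random-walk model and all its parameters are assumptions): particles governed by a time-symmetric hopping rule, started in a low-entropy corner of a box, show a coarse-grained entropy that rises. The “arrow” comes from the initial condition, not from any holistic term in the dynamics.

```python
import math
import random

# Toy model (illustrative assumptions throughout): N particles on a line
# of CELLS sites, each hopping left or right with equal probability per
# step. The hopping rule is time-symmetric; only the start is special.
random.seed(1)
N, CELLS, STEPS = 1000, 100, 4000

# Low-entropy initial condition: everyone starts in the leftmost 10 cells.
pos = [random.randrange(10) for _ in range(N)]

def coarse_entropy(positions, bins=10):
    """Coarse-grained Shannon entropy of the bin-occupation distribution."""
    width = CELLS // bins
    counts = [0] * bins
    for p in positions:
        counts[p // width] += 1
    return -sum((c / len(positions)) * math.log(c / len(positions))
                for c in counts if c)

H_start = coarse_entropy(pos)  # 0.0: all particles in one coarse bin
for _ in range(STEPS):
    # Reflecting walls at both ends; the rule itself has no arrow of time.
    pos = [min(CELLS - 1, max(0, p + random.choice((-1, 1)))) for p in pos]
H_end = coarse_entropy(pos)

print(H_start, H_end)  # entropy rises toward the maximum ln(10) ~ 2.30
```

Nothing in the update rule mentions entropy or time-direction; the directionality is read off at the coarse-grained level, given the special initial state – which is the weak-emergence picture described above.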

In Part II, Marko maintains that

“given two sets of axioms, describing the effective and the [underlying] theory, one cannot simply claim that the effective theory a priori must be reducible to the [underlying] theory.”

I think Marko here mistakes the *meta-scientific* theory that says “in our world, there is only weak emergence” – which follows from all of our science as well as from parsimony – for the *logical* theory that says “reduction must hold as a metaphysical principle”. I agree one cannot simply claim the effective theory *must* be reducible, but one *can* claim a priori that it is *more likely* that there is one underlying mechanistic level (i.e. a “fundamental theory”) and that all higher-level effects are emergent from it, and one *can* claim a posteriori that weak emergence is overwhelmingly scientifically established.

Marko also raises a few other arguments in Part II, based on Gödel’s theorem. He notes that, by that theorem, there will always be true statements that one cannot prove from a given (underlying) theory. While true, this again has no bearing on emergence. For one thing, we’re discussing what’s true here, not what is finitely provable. Secondly, there is no reason to expect that the unprovable theorems will lead to a holistic behavior of the particles described by the underlying theory; i.e., there is no reason to connect incompleteness to holism.

As his final argument, he notes that even if we accept an ultimate “theory of everything”, there would be incalculable results from it. Again true, and again not relevant. In his example, he imagines there are six “gods” determined by this theory, and that their actions are incalculable. But if the “theory of everything” is a fundamental mechanistic theory, then the actions of these gods – and hence all that occurs – are weakly emergent, even though they cannot be calculated. Whereas if the “theory of everything” refers to the overall brain-states of these gods (say), rather than just to the fundamental particles and so on, then the gods are strongly emergent phenomena. Whether there is weak or strong emergence has nothing to do with the incalculable nature of these “gods”.

To clarify a couple of things: the arrow of time has NOT been reduced to statistical quantum mechanics. There is of course a statistical derivation which is termed the “arrow of time”, but it simply isn’t. The reason is twofold and quite simple (the following can be fully elaborated, but is summarily presented here).

1. The arrow is not statistical in the sense offered, but that does not stop people from saying it is.

2. The extra assumptions needed to make the derivation are, well, extra.

Well, in regards to (2), the point is that these extra assumptions are AT THE UNDERLYING LEVEL. No one is saying that they aren’t needed, but rather that they are not of the type that undermines the “reduction”. In THIS sense, the reduction holds – the arrow of time weakly-emerges from the underlying mechanics, much like “temperature” weakly-emerges from the underlying mechanics in standard statistical-mechanics courses.

In regards to (1) – well, I don’t understand what you’re apparently referring to. The effect of increasing entropy is statistical in nature, in that the future states are not guaranteed to have higher entropy, but the vast majority of them will.
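A quick counting sketch of what “the vast majority” means here (my own illustration, not part of the original thread; the coin model and the 40–60% window are illustrative assumptions): for n two-state particles, the fraction of microstates with a near-even split – i.e. near-maximal entropy – approaches 1 as n grows.

```python
from math import comb

# Fraction of the 2^n microstates of n "coins" that have between 40% and
# 60% heads (a stand-in for "near-maximal entropy"). Numbers illustrative.
fracs = {}
for n in (10, 100, 1000):
    near_half = sum(comb(n, k) for k in range(int(0.4 * n), int(0.6 * n) + 1))
    fracs[n] = near_half / 2 ** n

print(fracs)  # the fraction climbs toward 1 as n grows
```

For macroscopic particle numbers the deviant fraction is astronomically small – which is the sense in which “the vast majority” of accessible future states have higher entropy.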

No my friend, they are not. Read carefully. You yourself made such an extra assumption which is not of the “underlying level”: that the “majority” will. I will not delve deeper in this comment, but simply point out how to elaborate more on this. For example, read the Loschmidt objection to Boltzmann’s statistical derivation of the second law. Roughly the same still holds for the statistical derivation from quantum mechanics as it is done. Furthermore, the majority is not enough for the second law (read for example Kelvin’s objection, and also “Where is the entropy challenge?” – look it up). This is the key to point 1) of my comment.

Here they are:

1. Where is the entropy challenge? http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.615.5917&rep=rep1&type=pdf

2. https://en.wikipedia.org/wiki/Loschmidt%27s_paradox

nikosms,

I am aware of the Loschmidt objection; I fail to see how it relates to the issue of reduction. Again – I freely concede one assumes more than simply the dynamical laws; one assumes there is a certain initial (emphasis on “initial”) condition. But this isn’t really important TO THE QUESTION OF EMERGENCE. Given the past hypothesis, the arrow of time demonstrably emerges from the underlying dynamics. One can then move on to other issues, like why we should grant that hypothesis; but the emergence itself is manifest.

In regards to your other source – it appears very interesting, but it is LONG, and I am not familiar with it. In particular, I don’t know where it argues that the “majority is not enough for the second law”. Perhaps you are referring to the fact that one needs to establish some kind of principle of indifference in order to “count” the states? That is quite correct. Again, however, this is only a recipe about how to characterize the underlying dynamics, not a switch from an underlying dynamics to holistic dynamics.

Regardless, in all approaches I am aware of to the derivation of the second law it is DERIVED from an underlying dynamics (using auxiliary assumptions, of course, such as ergodicity or the past hypothesis), rather than it being IMPOSED by holistic dynamics.

I would like to comment, regarding your “Where is the entropy challenge?” source, that in the quantum regime I subscribe to the standard dynamical-semigroup approach they rightly describe as the prevailing opinion. Reading briefly, I fail to see why their critique undermines this school of thought. They themselves clearly agree that in the Markovian limit the open system will indeed develop irreversibly. I would add that entropy increases as well, as proved by Alicki IIRC. So the only remaining question is whether this scenario is generic enough – to which the answer is clearly no; it’s merely suggestive. But they don’t appear to go there. Instead they raise other questions that, frankly, appear irrelevant to me. The interesting question is whether in general (although clearly not universally!) open systems will increase their entropy, even well beyond the Markovian limit, and if so what extra assumptions (such as a past hypothesis, negligible back-reaction, and so on) are needed. If answers to these could be established, I’d say that the emergence of the second law within QM has been established (as to what this means – that depends on your interpretation of QM…).

Starting from last.

The majority objection is based on Kelvin (if I’m not mistaken) and simply states that a majority is NOT enough, and that given enough time (which is already given) the second law does not hold (statistically, that is). The “entropy challenge” reference does not discuss this; rather, it discusses the objective characteristic of the second law, also leading to its “non-statisticality” (if you don’t mind the term). In fact it is just an introduction, which can point you to more elaborate study along these lines.

However, it seems the basic objection is whether the extra conditions used (or imposed, if you like) are of the “underlying level” or not. The answer is simple: they are not. How, one might ask? In fact you have already given the answer yourself. You mention in the post that if average properties or statistical properties are used, this implies strong emergence (and thus no derivation). And this is exactly the case. The majority condition is a kind of average condition.

The initial condition is trickier, but completely ad hoc – unrelated to the “underlying level”; it is chosen and imposed as such (in fact the majority is still used to make it into a derivation).

So there you have it.

To add a bit more, to make it even clearer.

The “initial” condition – “initial” vs “final”, “past” vs “future” – already presupposes time-asymmetry (this is the Loschmidt case mentioned earlier), and thus a time-arrow. There is NO derivation; it is already presupposed.

I’m aware of the various “derivations” of the second law, and they all have in common what I highlighted in these comments.

Why are these used, then? Nice question; let me ask another one.

Why is the second law not considered basic (cf. the “entropy challenge”), but needs to be “derived” (and the derivations fail), while Schrödinger’s equation is considered basic (and does not need derivation)?

Before one rushes to answer that, for example, QM has a large amount of experimental data behind it, one should be aware that thermodynamics has an even larger body of experimental data.

This is the question you should ask. (Of course I have an answer myself, but better to think for yourself.)