Free Will and Reproducibility

Dirac topping Einstein, the battle of the giants: sorry, Hawking, there is no number 3! And Schrödinger stands for classical quantum mechanics, not for Quantum Gravity, which denotes the unification of Planck’s quanta with Einstein’s General Relativity. Indirectly, the battle was about Bell’s “hidden variables”. In 1935, Einstein, Podolsky, and Rosen had postulated them in order

  1. to overcome the problem of a “collapsing wave function” in the measurement process of quantum theories,

  2. to explain away Einstein’s “spooky action at a distance”, known today as “entanglement”.

Entanglement denotes the situation that a coupled system maintains its quantum correlations without any time delay, even where this appears to override the speed of light. That seems to violate the limitations expected from causality. Einstein’s idea had been that quantum mechanics might be incomplete; hidden variables could be an escape strategy. In 1964, however, the Northern Irish physicist Bell published his no-go theorems, telling us that hidden variables are not the solution to those problems.
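
To see what is numerically at stake in Bell’s theorems, here is a minimal sketch (plain Python; the angles and the singlet correlation formula are standard textbook material, not taken from this book) comparing the classical CHSH bound of 2 with the quantum singlet value 2·sqrt(2):

```python
# Minimal CHSH sketch (illustration only): for the spin singlet, quantum
# mechanics predicts the correlation E(a, b) = -cos(a - b) between
# measurements along angles a and b. Any local hidden-variable model
# obeys |S| <= 2 (Bell/CHSH); the quantum value below reaches 2*sqrt(2).
import math

def E(a, b):
    """Singlet-state correlation for analyser angles a, b (radians)."""
    return -math.cos(a - b)

# Standard CHSH angle choices that maximise the quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"CHSH value S = {S:.4f}")   # magnitude ~ 2.8284 = 2*sqrt(2)
print(f"|S| = {abs(S):.4f} > 2, the local hidden-variable bound")
```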

 

Since that time, Bell’s no-go theorems have conquered fundamental physics. Every theoretician proudly claims the non-existence of hidden variables in quantum systems quite generally. That opinion spread like a plague. Everybody who still dared to raise an objection quickly experienced the worldwide power of the big scientific lobby: nobody took him seriously any more. Thus, even Einstein became marginalised post hoc.

 

The irony, however, was that Einstein turned out to be correct, indeed! For, in 1985, Bell himself admitted in a BBC radio interview that his no-go theorems were crucially based on the tacit assumption that there exists in nature something like a free will. Without a free will, however, his theorems fade to nothing.

 

Bell called the corrected result, which replaces his no-go theorems, an “absolute determinism” or, shorter, a “superdeterminism”. According to that superdeterminism, everything is uniquely predestined and unchangeable for all times: there must be some general consistency condition embracing the entire world without any exception.

 

This amounted to an open declaration of war on our western civilisation as it has grown over thousands of years. Just think of our jurisprudence and its sanctions against crime. If everything is predetermined already, the accused would be innocent: the culprit would be the superdeterministic combination of events our forefathers once declared to be a crime! This objection, however, overlooks that those sanctions are irrevocably part of that superdeterminism as well.

 

As a result, nobody took Bell’s 1985 insight seriously. Instead, his outdated no-go theorems have flourished up to the present day. This is supported by the purely technical fact that Bell’s BBC interview is hardly suited for citation in an official journal. In our present world, remarkably, a subjective rumour once hastily fixed by the official opinion leaders outweighs any later objective correction. Hence, up to now, almost nobody has risked the loss of face of applying superdeterminism to particle physics or to cosmology.

 

Theoretical physics is defined as the mapping of (parts of) nature into mathematics. We only perceive what our senses tell us. But they might tell us nonsense as well. Serious physicists, hence, only accept what can be reproduced unambiguously. The main trait of physics is the reproducibility of its results. This distinguishes it from religion, which works with irreproducible “miracles”.

 

Thus, it is amusing that the existence of something like a “free will” could be kept upright for so long, although its implications clearly are not reproducible. For this reason alone, it is surprising that the hypothesis of a “free will” could ever have formed in physics at all, let alone have been preserved for so long.

 

Finiteness and Atomism

Another feature of physics is its atomistic nature, detected by Planck while working on his black-body radiation law in 1900. Long before, ancient Greek philosophers had already speculated about it. This atomism should be evident to every layman, indeed: nobody can count up to infinity. In physics, hence, everything must stay finite in order that we can keep track of it and describe it in a unique way. Without a unique description, however, reproducibility can hardly be checked!

 

Finiteness, when extended to systems of real numbers, teaches us that fundamental physics only admits rational numbers; irrational numbers would need an infinite number of (non-repeating) decimal digits. A finite set of elements, by contrast, can be separated and counted. This yields the above atomistic structure in terms of “quanta”.
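
As a toy illustration of that finiteness argument (my own sketch, using Python’s exact fractions): a rational number has a finite, unique description, while any machine value of an irrational number is merely a rational stand-in:

```python
# Rational numbers need only a finite description (numerator, denominator),
# whereas an irrational number like sqrt(2) has no finite exact encoding.
from fractions import Fraction
import math

third = Fraction(1, 3)            # exactly 1/3: two integers suffice
print(third + third + third)      # 1 -- exact, reproducible arithmetic

# Any float "value" of sqrt(2) is itself only a rational approximation:
approx = Fraction(math.sqrt(2)).limit_denominator(10**6)
print(approx, float(approx)**2)   # close to 2, but never exactly 2
```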

 

Classical physics denies that atomism. Classical physics is assumed to be continuous. For continuous systems, the infinitesimal calculus had been invented. The mechanistic view of our world has been thriving on it for centuries. And people are trying to keep it upright still today. Schrödinger’s continuous wave equation exemplifies the resistance with which disciples of that mechanistic world view still confront Planck’s discrete quantum view.

 

Now, a continuous description might also be interpreted as the limiting case of a superposition of discrete features. This is the wave aspect of statistics. But do not turn a blind eye to the fact that smoothed statistics are the result of a limiting process, which tacitly includes the extrapolation towards an infinite number of elements! That extrapolation indirectly includes additional elements into consideration which were not present from the beginning.
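
The following toy sketch (an illustration of the law of large numbers, not the book’s formalism) shows how the distribution of a sum of discrete steps stays discrete for every finite number of elements; the smooth curve only appears in the infinite extrapolation:

```python
# A sum of N discrete +/-1 steps has a jagged, discrete distribution;
# the smooth Gaussian only appears as the N -> infinity extrapolation.
import random, collections

def histogram(N, trials=20000):
    return collections.Counter(
        sum(random.choice((-1, 1)) for _ in range(N)) for _ in range(trials)
    )

for N in (4, 16, 64):
    h = histogram(N)
    print(f"N={N:3d}: {len(h)} distinct discrete outcomes, "
          f"range {max(h) - min(h)}")
# The outcomes stay discrete for every finite N; "continuity" is only
# the idealised limit of this superposition.
```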

 

Those “hidden variables”, of course, are unphysical, ambiguous, pure fantasy. You could choose them however you like. They are what Bell’s no-go theorems exclude when combining causality with entanglement. Their inclusion, however, is the source of a macroscopic extension of a basically microscopic world.

 

Let us note: a macroscopic description contains more parameters than are experimentally measured! We are all familiar with this from quantum mechanics. There are components of classical vectors which in quantum mechanics are no longer simultaneously measurable. The traditionally quoted example is the 3-vector of spin, whose components along the x- and y-axes are no longer unique once somebody has measured its value along the z-axis. Classically, however, in the macrocosm, all 3 components are still measurable simultaneously.
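
A minimal numpy sketch of that standard example (textbook spin-1/2 algebra, not specific to this book): the spin components along z and x do not commute, so no state assigns sharp values to both:

```python
# Spin-1/2 components: Sz and Sx do not commute, hence they are not
# simultaneously measurable -- the standard quantum-mechanical example.
import numpy as np

sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
sy = 0.5 * np.array([[0, -1j], [1j, 0]])
sz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

commutator = sx @ sz - sz @ sx
print(np.allclose(commutator, -1j * sy))   # True: [Sx, Sz] = -i*Sy != 0

# After measuring Sz (state |up>), Sx has no sharp value; its mean is 0:
up = np.array([1, 0], dtype=complex)
print((up.conj() @ sx @ up).real)          # 0.0
```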

 

What happened there? Well, a quantum theory just describes the microscopic situation, where there is just 1 state and 1 (“diagonal”) measurable direction. The macroscopic view, however, is less exact: the result of measuring some state A, say at a position z, might be some state B, say at a position z’. Provided the difference z’−z is negligibly small with respect to the absolute value of z, its measured result could approximately be equal to z, and B, then, could “approximately” equal A.

 

According to Bell’s words, those microscopic deviations are “hidden” with respect to the macroscopic world, disguising the fact that the final state B is not exactly equal to the initial state A. Thus, the macroscopic world works with approximations, which manifest themselves as superpositions of a great number of almost coinciding states.

 

(The fact that we could switch over from a diagonal z-direction to, say, a diagonal x-direction is quite a different issue, based on transformations. It does not touch Bell’s argumentation.)

This new definition of what is “macroscopic” according to Bell’s superdeterminism is the most important feature of New Physics. It cannot be overestimated! Planck’s summation of a finite number of discrete quanta, replacing their continuous integration, is the key to that problem. By his method, the singularities inherent in classical potentials are made finite and simple. In their discrete forms, Yukawa or even Coulomb potentials result in a non-singular way.
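
As a numerical illustration of Planck’s step (my own sketch, in units where kT = 1; not the book’s notation): replacing the continuous equipartition value kT per mode by a discrete sum over quanta n·hν removes the high-frequency divergence:

```python
# Planck's discrete sum over quanta n*h*nu replaces the classical
# continuous equipartition value kT per mode. For high frequencies the
# discrete average energy dies off exponentially instead of staying
# constant -- the ultraviolet divergence disappears.
import math

kT = 1.0   # units with Boltzmann energy kT = 1

def planck_mean_energy(hnu, terms=2000):
    """<E> = sum_n n*hnu*exp(-n*hnu/kT) / sum_n exp(-n*hnu/kT)."""
    weights = [math.exp(-n * hnu / kT) for n in range(terms)]
    return sum(n * hnu * w for n, w in enumerate(weights)) / sum(weights)

for hnu in (0.01, 0.1, 1.0, 10.0):
    closed_form = hnu / math.expm1(hnu / kT)
    print(f"h*nu={hnu:5.2f}: <E>_discrete={planck_mean_energy(hnu):.4f} "
          f"(closed form {closed_form:.4f}), classical kT={kT}")
# For h*nu << kT the discrete result reproduces the classical value kT;
# for h*nu >> kT it is exponentially suppressed.
```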

 

Faster than Light

According to Pythagoras, a (squared) distance is measured by adding the squares of its components. In 2 dimensions, that distance defines the radius of a circle (or, when stretched, the principal axes of an ellipse); in more dimensions, a sphere (or an ellipsoid).

 

Now, Einstein demonstrated that, in order to describe Maxwell’s electrodynamics correctly, the time direction has to be multiplied by the imaginary unit, which is not present in the 3 space directions. When inserted into Pythagoras’ formula, this squared imaginary unit effectively switches the positive sign of the squared time component over to a negative sign. The result is Special Relativity.

 

This sign switch in the time direction transforms Pythagoras’ circle (or sphere) into a hyperbola (or hyperboloid, respectively). Contrary to a circle or an ellipse, however, a hyperbola has 2 separate branches which do not touch each other. This transformation from a compact circle or ellipse to a non-compact hyperbola, triggered by that sign switch, thus, is

  1. ripping the original circle or ellipse into 2 pieces,

  2. stretching and squeezing the rest.

When we formally start with an equal distribution of points on the surface of a circle or sphere, the transformed points will end up on the hyperbola or hyperboloid in a considerably distorted form. In physics, the (negative) gradient of a point distribution is defined to be a “force”. On the hyperboloid, hence, there act (geometrical) forces which are absent on the original sphere! And those forces become extreme (velocity of light) at those sections where our original figure is torn to pieces.
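
Here is a small numpy sketch of that distortion (my own construction under the picture above): points at equal parameter steps sit equally spaced on the circle, but their images on the hyperbola branch spread apart without bound:

```python
# Sign switch x^2 + y^2 = 1  ->  t^2 - x^2 = 1 (one hyperbola branch).
# Equal parameter steps give equal spacings on the circle but ever
# larger separations on the hyperbola: the uniform distribution gets
# distorted, i.e. a density gradient ("geometrical force") appears.
import numpy as np

eta = np.linspace(0.0, 3.0, 13)                             # equal steps
circle = np.stack([np.cos(eta), np.sin(eta)], axis=1)       # compact
hyperbola = np.stack([np.cosh(eta), np.sinh(eta)], axis=1)  # non-compact

def step_lengths(points):
    return np.linalg.norm(np.diff(points, axis=0), axis=1)

print("circle    steps:", np.round(step_lengths(circle), 3))     # all equal
print("hyperbola steps:", np.round(step_lengths(hyperbola), 3))  # growing
```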

 

Both figures, however, as distinct as they look, are related to each other point by point. Thus, it is a mere exercise for a mathematician to write down the exact formula of that “expansion” (of one coordinate system in terms of the other one).

 

When keeping some of the coordinates fixed while letting the remaining ones run, the total set of points, depending on those fixed values, is sliced into sections which are orthogonal to each other. But each of those coordinate systems slices the complete set of points in a different way (cf. the red vs. the green way):

[Sketch: the same set of points, sliced once into red, vertical sections and once into green, horizontal sections; a point x lies on one slice of each family.]

 

In classical physics, both slicing schemes would be related by an r-number formula (giving the same quantized real “Lie algebra”, whatever this might be); in Quantum Gravity, however, the relation is given by a c-number formula (giving the same “complex Lie algebra”). The point, hence, is: QG distinguishes the compact, “closed” representation by the “reaction channel” from its (formally) non-compact, i.e., “open” representation by the “dynamic channel”.
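
A toy version of that r-number vs. c-number distinction (my own sketch; the matrices J, K and the basis change S are illustrative assumptions, not the book’s notation): the rotation generator of the compact algebra so(2) and the boost generator of the non-compact algebra so(1,1) are not related by any real change of basis, but a complex one does the job, so both generate the same complex Lie algebra:

```python
# J generates rotations (compact: exp(theta*J) is periodic),
# K generates boosts (non-compact: exp(eta*K) grows without bound),
# yet over the complex numbers both span the same Lie algebra.
import numpy as np
from scipy.linalg import expm

J = np.array([[0., -1.], [1., 0.]])   # so(2) rotation generator
K = np.array([[0., 1.], [1., 0.]])    # so(1,1) boost generator

print(np.round(expm(2 * np.pi * J), 6))  # identity: closed, compact orbit
print(np.round(expm(3.0 * K), 3))        # cosh(3), sinh(3) entries: open orbit

# Complex ("c-number") change of basis relating the two generators:
S = np.diag([1.0, 1.0j])
print(np.allclose(-1j * (S @ J @ np.linalg.inv(S)), K))   # True
```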

 

In both cases, the “points” represented by these “channels” may be called “generators”, because the Quantum Gravity we are treating here, contrary to classical physics, is a thoroughly quantized model; quantum mechanics and Einstein’s General Relativity are just limiting models reflecting classical physics.

 

As both channels describe the same set of points, the finiteness constraint of our closed reaction channel is automatically transferred to the dynamic channel as well. Its infinite, asymptotic nature thus proves to be cut off somewhere. In physics this means: that “pseudo-open” dynamic channel will have to be represented by finite-dimensional representations as well!

 

Classical physics works with infinite representations instead. All its technical problems arise from those unphysical, infinite singularities, which are neither needed nor observable. Quantum Gravity avoids them from the start. Clearly, it is hard work trying to convince the older generation to abandon technical prejudices cultivated and expanded for so long. It is their international lobby which prevents QG from gaining a proper foothold in science, in spite of its breath-taking experimental success.

 

Dynamics, hence, is bounded. Contrary to the case of classical physics, there are no singularities. This does not mean that our universe is bounded by some rim we could knock at. A better picture would be that our point distribution thins out more and more towards some imaginary limit. And somewhere we pass its last point without noticing that there is no further point behind it.

 

Probability amplitudes sum up according to Pythagoras. That “conservation of probability” postulated by physicists, hence, is a property of the reaction channel. Likewise, entanglement works in the reaction channel. On the other hand, dynamics is a manifestation of the dynamic channel, and causality is a property defined by dynamics. In classical physics, both channels are identified with each other. Bell’s no-go theorem is based on this identification.
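
A minimal sketch of that “Pythagorean” probability conservation (standard quantum mechanics in numpy; the random unitary is just an illustrative stand-in for any reaction-channel step):

```python
# "Pythagorean" probability: the squared moduli of the amplitudes sum
# to 1, and any unitary step (here a random one) conserves that sum.
import numpy as np

rng = np.random.default_rng(0)
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)             # normalise: sum |psi_i|^2 = 1

# Build a random unitary from a QR decomposition.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)

print(np.sum(np.abs(psi) ** 2))        # 1.0
print(np.sum(np.abs(U @ psi) ** 2))    # still 1.0: probability conserved
```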

 

In the above sketch, however, the point x shown there cannot simultaneously march in the red, vertical direction and in the green, horizontal direction. Although the full red domain equals its full green counterpart, the individual slices, denoting probability conservation in the reaction channel and lack of dynamical motion in the dynamic channel, are not identical, contrary to what classical physics tacitly assumes. This contradiction is precisely the source of Bell’s no-go theorems.

 

In QG, both channels can be expanded into each other! Thus, Bell’s contradiction disappears: causality and entanglement both are true side by side. Only, the two channels are not simultaneously measurable with each other! (Just compare this with the spin components.)

 

Historical Background

Before 1900, physics had still been a hotchpotch of individual disciplines, all more or less independent of each other. Thereafter, in the course of the 20th century, a merging process started. Even chemistry turned out to be just a combination of quantum mechanics with thermodynamics. Biology and medicine, however, still resisted any unification.

 

Physics now came to be regarded as an outcome of the variational principle, which is intimately related to the Lagrangian formalism. Both had been developed in the 18th century. The variational principle had become the highlight of treating mechanical problems. Mathematically, the Lagrangian formalism is based on the non-discrete, continuous “functional analysis of many variables”. Crucial is their property of subordinating everything that might happen to just one single parameter.

 

In physics, this one parameter is usually chosen to be time. Even those notorious “string models” admit just one “time-like” dimension, with all remaining dimensions demanded to be “space-like”. We shall observe, however, that this restriction turns out to be too stringent for physics.
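
As a miniature of that one-parameter scheme (a sympy sketch of the standard Euler-Lagrange machinery; the harmonic oscillator is merely an illustrative choice, not the book’s model):

```python
# Variational principle in miniature: every dynamical statement is
# subordinated to the single parameter t. The Euler-Lagrange equation
# d/dt (dL/dqdot) - dL/dq = 0 follows from extremising the action.
import sympy as sp
from sympy.calculus.euler import euler_equations

t = sp.symbols('t')
m, k = sp.symbols('m k', positive=True)
q = sp.Function('q')

L = m / 2 * sp.diff(q(t), t) ** 2 - k / 2 * q(t) ** 2  # harmonic oscillator

eom = euler_equations(L, q(t), t)[0]
print(eom)   # -k*q(t) - m*Derivative(q(t), (t, 2)) = 0, i.e. m*q'' = -k*q
```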

 

The 20th century started spectacularly with Planck’s introduction of discrete “quanta” replacing continuous structures, followed by Einstein’s relativity theories. Both models, that of relativity and that of the quantum theories, rapidly developed into complete “field theories”. Even the quantum theories still used that giant machinery of continuous functional analysis for dealing with the discrete problems of quanta, cf. Schrödinger’s method.

 

Thus, the world of physicists proved not yet mature for QG: the requirements of the variational principle and of the Lagrangian formalism prevented the unification of quantum theory with Einstein’s General Relativity. The main obstacle had been that not yet understood duality between the dynamic and the reaction channel.

 

The preceding chapter has been dedicated to stripping that unjustified restriction off fundamental physics by redefining physics to be based on what mathematicians would denote as a model of generators. (Their “complex Lie algebra” is the common denominator of both channels. And a generator is represented by a square matrix.)

 

In n dimensions, the n diagonal entries of a matrix are simultaneously measurable. This is the microscopic view of physics. In the macroscopic view, however, by an approximation using the law of large numbers and resorting to superpositions, all n×n entries of a matrix representing a generator can be made approximately measurable simultaneously by applying appropriate statistics.
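
A short numpy illustration (my own sketch) of why the diagonal entries coexist while full matrices do not:

```python
# Diagonal matrices commute: their n diagonal entries are simultaneously
# well-defined. Generic Hermitian matrices do not commute.
import numpy as np

rng = np.random.default_rng(1)

D1, D2 = np.diag(rng.normal(size=3)), np.diag(rng.normal(size=3))
print(np.allclose(D1 @ D2 - D2 @ D1, 0))   # True: diagonal entries coexist

A = rng.normal(size=(3, 3)); A = A + A.T   # random symmetric (Hermitian)
B = rng.normal(size=(3, 3)); B = B + B.T
print(np.allclose(A @ B - B @ A, 0))       # False in general
```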

 

All these results derive from the mathematical discipline of “group theory”. Spin is a notion of group theory, too. Einstein did not like group theory. Hence, his General Relativity does not incorporate spin; for GR, spin is an “alien”. In group theory, however, spin is one of the fundamental properties. This might be another obstacle which has prevented both theories from being united successfully so far.

 

The main notion in group theory, however, is “irreducibility”, telling us which combination of quanta belongs together in order to build up, e.g., a particle, and which does not. Like spin, this notion, so important to group theory, has not been used by Einstein in his GR.

 

On the other hand, that “irreducibility” is the basic notion which allows one to write down the “world formula” Einstein, hence, never could find. For the invariants of group theory are defined by exactly that irreducibility; there, they are called “Casimir operators”. The “world formula” Einstein never found, hence, must read, for every existing Casimir:

Casimir operator = constant

(A “Casimir operator of degree n” is a homogeneous polynomial of degree n in the generators. [1] [15])
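
As a concrete instance of “Casimir operator = constant” (a numpy sketch using the standard spin-1 representation of su(2); this is textbook angular-momentum algebra, not the book’s full world formula):

```python
# The quadratic Casimir J^2 = Jx^2 + Jy^2 + Jz^2 of su(2) on the spin-1
# irreducible representation: it equals j(j+1) = 2 times the unit matrix,
# i.e. "Casimir operator = constant" on every irreducible block.
import numpy as np

s = np.sqrt(2.0)
Jp = np.array([[0, s, 0], [0, 0, s], [0, 0, 0]], dtype=complex)  # raising
Jm = Jp.conj().T                                                 # lowering
Jz = np.diag([1.0, 0.0, -1.0]).astype(complex)

Jx = (Jp + Jm) / 2
Jy = (Jp - Jm) / (2j)

casimir = Jx @ Jx + Jy @ Jy + Jz @ Jz
print(np.round(casimir.real, 10))   # 2 * identity, since j(j+1) = 2
```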

 

Quantum Gravity

Let me, thus, briefly collect what we have already found to be of importance for Quantum Gravity:

  • Reproducibility needs Bell’s superdeterminism. This prevents a free will.

  • Finiteness yields an atomistic world with no (non-recoverable) singularities.

  • A complex Lie-algebra yields the duality between the 2 channels providing the coexistence of causality and entanglement.

  • Geometrical forces are the result of relating the secondary dynamic channel to the primary reaction channel.

  • Statistics, by their “law of large numbers”, add the macroscopic view of physics to its primary microscopic view.

Further implications are:

 

 

  • Triggered by the gradient of probability, motion then means hopping from one fixed-time slice to the next.

Compare this with Wheeler and DeWitt’s oversimplified attempt not to untie the Gordian knot of our 2 channels not being simultaneously measurable, but to cut it by brute force: by completely banning any time-dependence from the theory. For they had correctly found that an exact measurement of time would trivially have prevented time from varying, i.e., our slicing scheme with respect to time.

 

By their tacit identification of both channels, however, their model was doomed to end in disaster: functional analysis is no good guide for handling the discrete quantum structure of physics! What they did was throwing the baby out with the bath water.

 

Statistics provide probability, in addition. A normalisable probability (the number of accepted cases divided by the number of all cases) needs a division operation. Now, number theory teaches us that a number system admitting division has a dimension of 2 to the power of n, where n is not greater than 3. (For r-numbers n=0, for c-numbers n=1, for quaternions n=2, for octonions n=3; see the sketch after the following item.) From experiment we learn that all 2**3 = 8 dimensions are needed, indeed. Hence:

  • The basic dimension of Quantum Gravity is fixed to be 8.
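
To back up that counting (a sketch of the standard division-algebra fact, not specific to this book; the helper names qmul and qnorm are ad hoc): the quaternions at n=2 are already non-commutative, yet their multiplicative norm still permits division:

```python
# The normed division algebras have dimensions 2**n with n = 0..3:
# reals (1), complex numbers (2), quaternions (4), octonions (8).
# Sketch for the quaternions: multiplication is non-commutative, but the
# norm is multiplicative, |p*q| = |p|*|q|, so every nonzero element divides.
import math

def qmul(p, q):
    """Hamilton product of quaternions given as tuples (w, x, y, z)."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw)

def qnorm(p):
    return math.sqrt(sum(c * c for c in p))

p, q = (1.0, 2.0, 3.0, 4.0), (0.5, -1.0, 2.0, 0.0)
print(qmul(p, q) == qmul(q, p))                              # False
print(math.isclose(qnorm(qmul(p, q)), qnorm(p) * qnorm(q)))  # True
```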

In order to unify Planck’s world of quanta with Einstein’s General Relativity, the objective of Quantum Gravity is to describe elementary particles and our cosmos by the same set of equations; just the values of their numerical constants should differ. Thus, a Quantum Gravity will necessarily contain additional external parameters which it cannot itself predict in advance. Our universe, hence, turns out to be some subordinate partial subsystem which, howsoever, will be embedded

Imprint

Publisher: BookRix GmbH & Co. KG

Text: © 2019. All rights reserved. This is the English translation of the German e-book “Diracs Vermächtnis, Einsteins Dilemma, Zweikampf der Giganten”, BookRix, Munich (2019), ISBN 978-3-7438-9215-6.
Date of publication: 02.01.2019
ISBN: 978-3-7438-9216-3

All rights reserved
