\documentclass[12pt]{article}
\addtolength{\topmargin}{-30pt}
\textwidth 430pt %380pt
\textheight 610pt %535pt
\renewcommand{\baselinestretch}{1.2}
\begin{document}
% Titlepage
\title{The Standard Model and Beyond:\\ Interrelation between Theory and Reality}
\author{Vladimir Slobodenyuk \\{E-mail: vslob@yandex.ru}}
\date{September 2002}
\maketitle
\begin{abstract}
Numerous attempts by theoretical physicists to create an absolute theory,
a Theory of Everything, realized in the GUT, SUSY, supergravity, superstring,
and superbrane models, are subjected here to critical analysis. Attention
is paid to the fact that no theory, however exact, is able to give an
exhaustive picture of the phenomenon it describes. Any theory gives only an
approximate model of its object, reflecting a greater or smaller number of
its features. Some peculiarities always remain that are not taken into
consideration in the model, because an absolutely exact and complete picture
of nature is impossible in principle. The roughness of the tools available
to humans for describing the physical world shows itself, in particular,
in essential defects of the theories describing nature at the most
fundamental level, the models of quantum field theory (divergences,
renormalizations, etc.). Since the models pretending to the role of a TOE
are not founded on any experimental data contradicting the Standard Model,
they cannot really pretend even to the role of a generalization of the
Standard Model.
\end{abstract}
\bigskip
\noindent {\bf Keywords}: fundamentality, quantum field theory, renormalization, Standard Model, TOE, Theory of Everything.
\newpage
\section{Introduction.}
There have been moments in the history of physics when it seemed that only
a few last, decisive steps remained before the organization of nature would
be understood, after which a complete and absolutely exact physical picture
of the world would be created, in which all physical processes would receive
an exhaustive description and explanation. In the nineteenth century such
hopes were connected with the classical theories: mechanics, electrodynamics,
and thermodynamics. At present, in the opinion of many theoretical
physicists, some of the quantum field models generalizing the Standard Model
may pretend to the role of such a theory. Over the last decades a number of
directions of such generalization have been created: Grand Unification
Theories (GUT), supersymmetric models (SUSY), supergravity, and, finally,
superstrings and superbranes. Adherents of one or another approach in this
sequence believe that within its framework one can create a model describing
the physics of fundamental interactions absolutely exactly and absolutely
completely. Such a model is called the unified theory of fundamental
interactions, or the Theory of Everything (there even exists the special
abbreviation TOE). The latter term testifies to the level of pretension of
such a theory.

To illustrate the typical attitude of physicists toward unified theories,
we reproduce a statement by Davies (1985) about one of the candidates for
this role, supergravity. He says that until now physical theories have been
considered as models that describe reality approximately, but some
physicists now assert that supergravity {\em is} reality itself, that this
model corresponds ideally to the real world. He then states that, like many
tempting images, the unified theory may prove to be a mirage, but that for
the first time in the whole history of science our notions of a
{\em complete scientific theory of everything} are taking shape.

The idea of constructing a TOE is so popular that for the last few decades
the scientific interests of the majority of theoretical physicists
investigating the problems of fundamental interactions have been connected
in one way or another with working out the unified theory. One should note,
nevertheless, that despite the increasing level of abstraction and
complication of the theoretical apparatus with the appearance of every new
approach, no success has yet been achieved in understanding the construction
of nature more deeply than the Standard Model does. The very fact that the
efforts of leading theoretical physicists over such a long time have not led
to any distinguishable result makes us think about the causes of this
failure.

In the present paper an attempt is undertaken to clarify the interrelation
between a theory and the natural objects it describes, as applied to the
quantum field models, and on this basis to mark the boundaries of what these
models can achieve in understanding the phenomena of nature and,
correspondingly, to put restrictions on their pretensions. It should be
noted from the very start that the article is addressed primarily to
theoretical physicists. Its main purpose is to convey to them some ideas
that are rather simple for philosophers but are ignored by physicists in
their research.

Before starting to discuss the problem outlined, let us consider a clear and
useful example. Assume we wish to describe the geometrical shape of the
surface of the Earth. If we turn to history, we see a sequence of models,
each more precise than the previous one. At first the Earth was imagined as
flat (on a global scale, without taking into consideration the relief of a
given place). From the modern viewpoint we can say that this picture may be
accepted as a zeroth approximation for describing a small domain of the
surface, but on a global scale it becomes perfectly incorrect.

Later it became clear that the Earth has the shape of a ball. That model was
essentially more exact but, as it turned out, even it did not describe the
shape of our planet sufficiently well. More exactly, the shape of the Earth
may be represented as an ellipsoid of rotation, because it is flattened
along the rotation axis. But this is not all. It turns out that the Earth
is not symmetric relative to the equator: it has an egg-like shape. And one
cannot find any simple equation describing such a shape even on a global
scale, contrary to the cases of a ball and an ellipsoid, for which the
equations are well known and have a simple form. But let us assume that we
have found some equation that describes the shape of the Earth quite
satisfactorily. Can one hope that the shape of the real Earth's surface will
not deviate from this equation, even if we take into account only distances
of some large scale and do not consider details of smaller sizes? Naturally,
one cannot. Even at large scales the surface has so irregular a shape that
one is not able to describe it exactly by any equation, however complicated.
Note that we do not mean here the construction of simple level lines
directly from experimental measurements, as is usually done on geographic
maps; we mean the creation of some mathematical model of the surface shape
permitting these lines to be calculated.

Besides, there is one more cause that does not allow the Earth's shape to
be described exactly. Every description supposes the presence of some scale,
such that structures with sizes smaller than this scale are simply ignored.
If we diminish this scale, then we must consider some of those structures,
while other, still smaller ones are ignored as before. But one can diminish
the scale once more and bring new structures into consideration, and so on.
This process may be continued indefinitely, until at some stage we reach
microscopic scales. At these scales the usual macroscopic notions lose their
sense altogether; at scales of the order of atomic size one can no longer
speak of a static picture and a geometric surface.

If we wish to obtain a model of the Earth's shape taking into account the
local features of the relief, then we must give up the idea of describing
the surface by a single, perhaps very complicated, equation. We have to use
instead interpolations for separate domains of the surface, ``sewing
together" the interpolation functions at the boundaries of the domains. Let
the domain $A$ be described by the function $f_A$, the domain $B$ by the
function $f_B$, and so on. Note that the function $f_A$ itself is defined
not only on the domain $A$ but on the neighboring domains too, particularly
on the domains $B$, $C$, and so on; the same holds for the other functions.
But their values on the other domains do not carry any information about
the shape of the surface. So the structure of the model is such that it
includes many functions describing the surface, but to obtain a prediction
for a definite domain one must use only one of these functions.

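A toy sketch of this construction may make it concrete (the domains, the profile, and the fitting functions below are all invented for illustration; this is not a model of the real geoid):

```python
# One-dimensional toy version of the "sewing together" of local models.
# Domain A = [0, 1], domain B = [1, 2]; f_A and f_B are local fits, each
# carrying information only about its own domain.

def f_A(x):
    return 1.0 + 0.5 * x            # meaningful only on domain A

def f_B(x):
    return 1.5 + 0.1 * (x - 1.0)    # meaningful only on domain B

def h_model(x):
    """Full model of the profile: on each domain only its 'own' function
    is consulted, although each function is defined everywhere."""
    return f_A(x) if x < 1.0 else f_B(x)

# "Sewing" condition: the two pieces agree at the boundary x = 1.
assert abs(f_A(1.0) - f_B(1.0)) < 1e-12

# f_A also exists on domain B as a mathematical object, but its value
# there carries no information about the profile; the model ignores it.
print(f_A(2.0), h_model(2.0))
```

The value of $f_A$ on domain $B$ is present in the formalism of the model yet describes nothing, which is exactly the point of the example.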
The example discussed shows us that it is not possible to create an
absolutely exact model of the Earth's surface. We can build even a very good
model, but it will be correct only to a definite accuracy and down to
definite scales. It is obvious that one can draw similar conclusions with
respect to any other object of nature; the geometric shape of the Earth has
been taken here quite arbitrarily, only for illustration.

We can build various models of the objects or phenomena investigated. These
models may be more or less rough. In the first case they allow us to obtain
predictions with higher accuracy and/or for a wider range of phenomena or
parameter values. Even so, these predictions will not be absolutely exact,
because every model, however complicated and refined, is only some
approximation to an infinitely varied and complicated world, which cannot
be understood by humans completely and finally.

After this affirmation it becomes clear that there is no sense in trying to
create an absolute theory, even if this theory should describe the
construction of nature at the most fundamental level. In any case the model
will be only a more or less rough approximation to the real world. This
statement is rather trivial from the viewpoint of a philosopher. The problem
is connected with the following circumstance: probably most theoretical
physicists do not realize this fact and consider the models created by them
as absolute models exactly reflecting the organization of nature. Moreover,
they often seriously think that nature is organized according to certain
equations, and that the task is only to guess these correct equations and,
correspondingly, to throw away all incorrect ones. But in reality the
equations are only our human way of describing nature. So in principle they
cannot describe nature exactly. They only allow us to obtain approximate
estimates of some parameters (maybe many of them) related to the phenomenon
considered. But there is always something more, which is inaccessible to
human understanding at present and is not described by any model.

\section{Quantum field models.}
Let us now apply the ideas discussed above to quantum field models.
\subsection{Divergences and renormalizations.}
In creating quantum field theory (see, e.g., Bogolubov (1980)), physicists
ran into the problem of divergence of the integrals through which various
observable quantities were calculated.

Nevertheless, it turned out that for some models one could carry out the
program of renormalization, which consists in the following. Instead of the
functions leading to divergences, one introduces regularized functions whose
integrals are convergent. These functions depend on some additional
parameters; when these parameters tend to definite values, the regularized
functions tend to their initial nonregularized forms (removal of the
regularization). After calculating the integrals with the regularized
functions, one carries out the limit transition in the regularization
parameters, i.e., removes the regularization. But during this transition
infinite terms appear again. To eliminate them one performs renormalization,
i.e., subtracts the infinite terms so that the result becomes finite. Such
an operation may be realized only for models in which, at all orders of
perturbation theory, only a finite number of different divergent structures
arises. If at higher orders of perturbation theory ever new types of
divergent structures appear, then renormalization cannot be carried out, and
such models are nonrenormalizable. Modern methods of quantum field theory do
not allow one to obtain any sensible predictions with the help of
nonrenormalizable theories when the coupling constant is not small.
Renormalizable theories, on the contrary, allow one to calculate various
quantities within the framework of perturbation theory, and what is more,
the predictions for observable physical quantities practically do not depend
on the choice of regularization and renormalization scheme.

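Schematically (a generic textbook illustration, not a formula taken from any particular model discussed here), a logarithmically divergent loop integral is first regularized by a cutoff $\Lambda$ and then renormalized by subtracting the divergent part at some scale $\mu$:

```latex
\[
I(\Lambda)=\int^{\Lambda}\frac{d^4k}{(2\pi)^4}\,
\frac{1}{(k^2-m^2)^2}
\;\sim\;\ln\frac{\Lambda^2}{m^2}+\mathrm{finite},
\qquad
I_{\mathrm{ren}}=\lim_{\Lambda\to\infty}
\Bigl[\,I(\Lambda)-\ln\frac{\Lambda^2}{\mu^2}\Bigr].
\]
```

The renormalized result depends on the subtraction scale $\mu$, but the predictions for observable quantities do not, which is the scheme independence just mentioned.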
So, although renormalizations make the theory less attractive aesthetically
and complicate the technique of operating with it, they do not make the
theory unacceptable (provided, naturally, that it is renormalizable). The
infinite contributions and counterterms (terms that must be subtracted to
eliminate the divergences) are in some respects similar to the functions
$f_B$, $f_C$, etc., on the domain $A$ in the example with the Earth's
surface. They have no physical sense, but the formalism of the theory is
such that one cannot do without them. If one could define the shape of the
Earth by a unified function, determined over the whole surface by a single
analytical expression, then one would not have these excess functions. But
because one cannot do this, one has to accept the presence of these
senseless objects in the model. The situation in quantum field theory is
the same.

We do not know how to construct the formalism of the theory so that it
would not include divergences from the very beginning. So we construct the
theory with divergences and then eliminate them according to a definite
recipe. Because this procedure allows us to obtain predictions that
correspond quite well with the experimental data, we have a model describing
a definite region of natural phenomena. If we do not demand of the theory
that it be absolute and give an exhaustive description of the object
investigated, then we may consider our purpose achieved.

Nevertheless, one should keep in mind that this model, perhaps, has
essentially restricted possibilities for extrapolation. It describes a
definite class of phenomena under definite conditions, and there are no
grounds to think that this model will work satisfactorily under conditions
essentially different from those for which it was constructed. E.g., the
Standard Model is valid at the energies achieved at modern accelerators.
However, at higher energies some objects and phenomena that should not
exist according to current notions may really appear. Then it will be
necessary, for their description, to create a new theory generalizing the
Standard Model.

One may note here that the ``inelegance" of quantum field theory connected
with renormalizations is due, in the main, to the restrictions of our means
for describing nature. When simpler phenomena were studied, e.g., the
subjects of classical mechanics, classical electrodynamics, and
thermodynamics, one was able to create quite elegant theories (which,
nevertheless, have some defects too). When one came to a more detailed
description, taking quantum effects into account, one was no longer able to
construct a theory free of unnecessary structures not connected with any
features of the phenomena described. This is similar to the situation we
met in the example with the description of the Earth's shape: for a global
description of the shape one can take a single equation (a ``beautiful"
theory), but for a description of the local structure one must already
introduce many functions, each of which has physical sense only within its
``own" domain of the surface (a theory with ``unnecessary" structures).

\subsection{Standard Model.}
At present the Standard Model is the generally accepted and quite well
experimentally tested (in the domain where it permits predictions to be
obtained) model of the interactions of elementary particles. It includes
quantum chromodynamics (QCD), describing the strong interactions of quarks,
and the Weinberg -- Salam model, describing the unified weak and
electromagnetic interactions. Reviews of the Standard Model and of the
gauge theories lying at its foundation may be found in Halzen (1984) and
Huang (1982).

Let us pay special attention to the electroweak model. Before the
Weinberg -- Salam model, electromagnetic and weak interactions were
described by separate theories: quantum electrodynamics (QED) and the Fermi
model with four-fermion interaction. Electrodynamics described (and
describes now) electromagnetic processes quite satisfactorily. It is a
renormalizable theory with a sufficiently small coupling constant, which
allows one to operate successfully within the framework of perturbation
theory. The Fermi model, on the contrary, is not renormalizable because of
the presence of four-fermion vertexes in it. It permits calculations only
at first order in the weak coupling constant $G_F$ (the Fermi constant).
Due to the weakness of this interaction, such calculations gave predictions
corresponding quite well with the experimental data available. However,
there was no theoretical justification for such an approach: standard
quantum field theory demands that the contributions of higher orders of
perturbation theory be taken into account, but in their calculation
divergences appear that cannot be removed.

So it was necessary to construct a renormalizable model of the weak
interaction. It turned out that it is not possible to make a separate model
of the weak interaction renormalizable, but it is possible to construct a
unified model of the weak and electromagnetic interactions satisfying the
demand of renormalizability. In this theory the carriers of the weak
interaction, the intermediate vector bosons ($W$- and $Z$-bosons), are
introduced. They are massive particles (the presence of their mass is
connected with the finite radius of the interaction). Due to these bosons,
the four-fermion vertexes of the old theory are separated into two
three-particle vertexes. The Weinberg -- Salam model belongs to the class
of gauge theories, all of which have a common property: renormalizability.
But the carriers of interactions in gauge theories should be massless,
while the vector bosons have mass, and moreover their masses are about 100
times larger than the mass of the proton.

To overcome this difficulty, the mechanism of spontaneous symmetry breaking
(the Higgs mechanism) was suggested. It permits one to start from an exactly
symmetric Lagrangian containing massless gauge fields and to obtain
afterwards, as a result of spontaneous symmetry breaking, a Lagrangian with
massive gauge fields. Because the theory with the initial Lagrangian is
renormalizable, it remains renormalizable after the symmetry breaking as
well. The payment for renormalizability obtained in such a way is the
appearance in the theory of new hypothetical particles, the Higgs scalar
bosons. It is precisely due to the Higgs fields, with their unusual
self-interaction Lagrangian, that the possibility of spontaneous symmetry
breaking appears, together with the renormalizability of the gauge theory.
However, serious problems arise instead, connected with the problematic
character of the existence of the Higgs bosons in nature.

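The unusual self-interaction mentioned here is the standard Higgs potential; in textbook notation (quoted only for illustration),

```latex
\[
V(\phi)=-\mu^2\,\phi^\dagger\phi+\lambda\,(\phi^\dagger\phi)^2,
\qquad
\langle\phi\rangle=\frac{v}{\sqrt{2}},\quad
v=\frac{\mu}{\sqrt{\lambda}},
\]
```

the ``wrong-sign" mass term forces a nonzero vacuum expectation value $v$, which gives masses to the gauge bosons (e.g., $m_W=gv/2$) while preserving renormalizability.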
Because of their Lagrangian, nonstandard in comparison with the fields of
really existing particles, the Higgs fields seem to be rather artificial
innovations. There are serious reasons to doubt that such an intricate
mechanism as the Higgs one is really embodied in nature, as if to make it
possible for physicists to construct a renormalizable theory. Besides, the
mass of the Higgs particles is not fixed in the model. On the one hand,
this makes their experimental search difficult; on the other hand, it
allows one to consider this mass so large that these particles cannot be
produced at modern accelerators. This closes the possibility of refuting
the hypothesis of the existence of Higgs particles in the foreseeable
future (i.e., the model is not falsifiable, and this is an essential defect
in itself). Experimental searches for Higgs particles have not given
positive results. This, naturally, has been explained by the large mass of
these hypothetical particles. However, the most probable cause of these
results is the absence of such particles in nature.

In the example of the Higgs mechanism in the Standard Model we have run
into a situation where, for the description of real natural phenomena, a
formalism has been created that gives a sufficiently satisfactory
description in one sector but does not correspond to anything in another.
Namely, the Standard Model describes quite well the sectors of real
particles: quarks, leptons, photons, $W$- and $Z$-bosons, and gluons; but
everything related to the Higgs bosons is only an internal structure of the
theory and does not correspond to any real phenomena. Turning to the
example with the Earth's shape, one can say that the sector of real
particles in the Standard Model is analogous to the function $f_A$ on the
domain $A$, and the Higgs sector to the function $f_A$ on the domain $B$.
This function exists as a mathematical object on the domain $B$ too, but
it does not carry any information about the structure of the surface there.

In principle, one could take only the part of the Lagrangian of the
Standard Model with broken symmetry in which the Higgs fields are absent
and consider it the Lagrangian of the true theory. But such a theory would
be nonrenormalizable, so one would not be able to operate with it within
the framework of perturbation theory; and one cannot effectively work with
quantum field theories in any other way. This is why one has to introduce
the Higgs fields into the theory. But one should not be mistaken about
their real status, and should not hope that nature is constructed in
correspondence with the restricted possibilities of our mathematical
apparatus.

Besides, there are other difficulties in the Standard Model. It is known
that the series of perturbation theory are asymptotic, i.e., they are
divergent, but for a small coupling constant the first few terms of the
series give a sufficiently good approximation to the quantities calculated.
If we take into consideration a large number of terms of the series, the
sum begins to deviate more and more strongly from the true value of the
function represented by the series. In the Weinberg -- Salam model the
coupling constants are really small, so in this model one can obtain quite
exact predictions with the help of perturbation theory.

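The asymptotic (divergent) character of such series can be seen in a standard toy example, not tied to any model of this paper: the Euler series $\sum_n (-1)^n n!\,g^n$ is the asymptotic expansion of $\int_0^\infty e^{-t}/(1+gt)\,dt$. Its partial sums first approach the exact value and then deviate from it without bound:

```python
import math

G = 0.1  # an invented small "coupling constant", chosen for illustration

def partial_sum(N):
    # partial sum of the divergent asymptotic series sum_n (-1)^n n! G^n
    return sum((-1) ** n * math.factorial(n) * G ** n for n in range(N + 1))

def exact_value(T=60.0, steps=600_000):
    # trapezoidal integration of exp(-t)/(1 + G t); the tail beyond T is negligible
    h = T / steps
    s = 0.5 * (1.0 + math.exp(-T) / (1.0 + G * T))
    for i in range(1, steps):
        t = i * h
        s += math.exp(-t) / (1.0 + G * t)
    return h * s

E = exact_value()
# Near the optimal truncation order (about 1/G = 10 terms) the error is tiny...
err_best = abs(partial_sum(10) - E)
# ...but taking many more terms makes the "approximation" meaningless.
err_many = abs(partial_sum(40) - E)
print(err_best, err_many)
```

Truncating near the order $1/g$ gives an error of a few times $10^{-4}$ here, while the 40-term partial sum misses the exact value by many orders of magnitude.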
In QCD (the other constituent of the Standard Model) the coupling constant
is not sufficiently small, and perturbation theory can be applied only to
a very limited class of processes. The notion of the running coupling
constant has been introduced: its value depends on the energy transfer in
the process. When a large energy is transferred, the coupling constant
becomes small enough for calculations with the help of the perturbation
series.

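At leading (one-loop) order the standard expression for the running coupling, quoted here only as an illustration of this energy dependence ($n_f$ is the number of active quark flavours, $\Lambda_{\rm QCD}$ the characteristic QCD scale), is

```latex
\[
\alpha_s(Q^2)=\frac{12\pi}{(33-2n_f)\,
\ln\!\bigl(Q^2/\Lambda_{\rm QCD}^2\bigr)},
\]
```

so that $\alpha_s$ becomes small at large momentum transfer $Q$ (asymptotic freedom) and grows without bound as $Q^2$ approaches $\Lambda_{\rm QCD}^2$.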
For processes with low energies the coupling constant is large and
perturbation theory is not applicable. There is no regular theory founded
on the first principles of quantum field theory in this region. Even in
such models as, it is asserted, are founded on the first principles of QCD
(e.g., lattice QCD, QCD sum rules), some additional suppositions are in
fact used. So they cannot be treated as simply other methods of working
with the same theory. In reality they are {\em other} models, different
from perturbative QCD, and they cannot pretend to the same generality as
QCD does.

So QCD in its modern state does not permit one to obtain predictions for
most of the processes it is supposed to describe, namely those connected
with so-called nonperturbative effects. In particular, the process of
creation of hadrons from quarks is not described theoretically.

Physicists usually represent this situation as follows: there is a theory,
QCD, which in principle gives a complete description of the strong
interactions; we cannot make predictions for low-energy processes with its
help not because the theory does not contain this information, but because
we do not yet know how to work with the theory in the proper manner. Such
a viewpoint is not irreproachable, because if regular ways of extracting
information about nonperturbative effects were suddenly found, there would
be no warranty that their predictions would agree with the ones now being
obtained in the framework of the perturbative formalism. In principle, it
may occur that a nonperturbative theory founded on the same Lagrangian as
the perturbative one will have essentially different properties and will
not be adequate to the existing experimental data.

This may be caused by the following fact. The formalism of perturbative
quantum field theory is adapted exclusively for operating in the framework
of perturbation theory. But the properties of the theory are due not only
to its Lagrangian but to the features of its formalism too. It will
probably be necessary to change the formalism itself for a nonperturbative
approach, and this will lead, in fact, to the construction of a new model,
although one founded on the former Lagrangian.

So the Standard Model appears to be a quite satisfactory approximation for
a sufficiently wide class of natural phenomena, but nothing more. The
defects of its formalism (the impossibility of operating beyond the
framework of perturbation theory, renormalization, the divergence of the
perturbation series, the necessity of introducing an unbroken gauge
symmetry when it is not exact in nature, the appearance of Higgs bosons in
the theory when they are absent in the real world) do not allow us to hope
that beyond the range of phenomena for whose description this theory was
constructed it may give any reliable predictions.

\section{Beyond the Standard Model.}
The outstanding success of the Standard Model in unifying the weak and
electromagnetic interactions has inspired physicists to search further for
possible unifications of the fundamental interactions. Investigations
connected with looking for a higher symmetry group of gauge theories, which
would include as a subgroup the symmetry group of the Standard Model,
$SU(3) \times SU(2) \times U(1)$, began to develop intensively. This higher
symmetry is treated as exact; at some very high energy it is broken, by
means of the Higgs mechanism, down to the symmetry of the Standard Model.
Such models are called Grand Unification Theories. A review may be found
in Ross (1985).

A number of models of this group were proposed. All of them had defects of
one kind or another. But the point is that all these models, by their
construction, have all the defects characteristic of their prototype, the
Standard Model. This circumstance alone does not allow these models to have
a higher status than the Standard Model has. However, in the times of the
boom of research related to this approach, many physicists did hope that
one of these models might prove to be the correct model of the world,
describing it at the fundamental level and, besides, absolutely exactly.

Later, new directions in the search for a TOE arose. These are the
supersymmetric theories: supergravity (see, e.g., Ferrara (1983)),
superstrings (see, e.g., Green (1987)), and different types of branes. The
key idea of all of them is supersymmetry, i.e., symmetry between bosons and
fermions. It is supposed that there exist hypothetical supersymmetric
partners: a fermion for every really existing boson, and a boson for every
fermion. E.g., the photino is the partner of the photon, with spin 1/2
(the photon has spin 1); the selectron is the partner of the electron, with
spin 0; the squark is the partner of the quark, with spin 0 (the electron
and quark have spin $1/2$); and so on. Thus, according to these models,
there should exist in the world at least twice as many fundamental
particles as we know. Where is the second half of the world? As no
supersymmetric partner has been observed up to now, it has been supposed
that their masses are very large and that the power of modern accelerators
is not enough to produce these heavy particles. I.e., here the same trick
is used that was used earlier to justify the absence of experimental data
on the existence of the Higgs particles in the Standard Model.

The most refined approaches related to supersymmetry are superstrings and,
moreover, their generalization to objects with more than one spatial
dimension, branes. The fundamental objects in the theory of superstrings
are, instead of point-like particles, one-dimensional extended objects,
superstrings, possessing fermionic degrees of freedom besides the usual
bosonic ones. It is considered that the fundamental frequencies of these
strings' oscillations correspond to definite elementary particles.

Here is the characterization of the place of this theory in modern physics
given by I. Aref'eva and I. Volovich (1990): ``Superstring theory is the
modern version of the unified theory of fundamental interactions; sometimes
it is called the Theory of Everything. The aspiration to create such a
theory penetrates the whole history of science --- from the Pythagoreans to
Newton and Leibniz, and further to Riemann, Weyl, Einstein, and Heisenberg.
It is not obvious that any success on the way to this purpose is possible
at all. So much the more striking do the remarkable achievements of
superstring theory seem. These are the calculation of the dimension of
space-time$^1$), the fixing of a definite gauge group, the inclusion of the
theory of gravitation in the unified scheme, and, probably, the absence of
divergences. Superstring theory touches the deepest problems of the
universe and is the most advanced current approach to the questions about
the nature of the fundamental interactions. However, in spite of the great
interest in the theory and its remarkable achievements, one must say that
the basic problems remain open here. From the physical viewpoint the main
problem is the absence of experimental predictions. We may hope to enclose
all information about elementary particles in superstring theory, but
superstring theory itself has not given any experimental predictions. From
the mathematical viewpoint there remains the problem of proving the absence
of divergences even in the framework of perturbation theory."

We have cited so long a quotation because the position characteristic of
most physicists studying superstrings is expressed here in concentrated
form. We see that they treat superstring theory as a real candidate for the
role of the absolute theory of everything (the possibility of constructing
an absolute theory is implied by them as going without saying). However,
this theory does not give any new experimental predictions (in comparison
with the Standard Model); correspondingly, the experimental data available
do not demand for their interpretation any theory more general than the
Standard Model, and do not give any grounds for choosing one or another
way of generalization.

What is the basis for the belief of theoretical physicists that precisely
the supersymmetric theories may give an exact picture of the world? Perhaps
the two most important points are these: the hope to create a theory
unifying all four fundamental interactions, and the hope to get rid of the
divergences usual in standard quantum field theory. It is known that there
are no divergences in the one-loop approximation in superstring theory, and
there are serious grounds to hope that divergences do not appear in higher
orders of perturbation theory either (but this has not been proved up to
now).

So these reasons have a quite speculative character and are not connected
with experiment in any way. Here researchers attempt to create a general
theory with a very cumbersome and complicated formalism, which in the range
experimentally accessible today does not give any predictions different
from those given by the partial theory, the Standard Model. Usually hopes
are connected with the launch of new accelerators of elementary particles
reaching higher energies and the discovery of new physics with their help.
It is then, as the adherents of this approach consider, that the
predictions of the supersymmetric models will be tested! However, the
existence of such hopes is due to an overestimate of the human ability to
cognize nature without any experimental premises.

To make the situation clearer, let us turn once more to the example with
the description of the shape of the Earth's surface. Assume that we have
chosen the accuracy with which we desire to describe the surface (let us
characterize it by the minimal distinguished length $d$), and have
obtained, on the basis of experimental measurements, the set of functions
$f_A$, $f_B$, etc., describing the domains $A$, $B$, etc., correspondingly.
Then we have set our mind on getting a description with a higher precision
$d'$ ($d'