DOC. 27  DISCUSSION OF DOC. 26
attributes to the fact that in the overwhelming majority of cases a state Za is succeeded by a more probable state Zb. From among all the states Zb, Zb', Zb", etc., to which Za can pass in the very short time t, the state Zb will practically always appear, because it possesses an enormously greater probability than the state Za and all of the other states Zb', Zb", etc. Thus, the apparently unidirectional succession of states actually consists in states of ever greater probability following successively upon each other. But such an argument gains some measure of persuasive power only when one has made clear what is to be understood by the "probability" of a state.
If a system left to itself passes in an endless succession through the states Z1 ... Zl (in the most varied sequences), each state will possess a definite temporal frequency. There will be a fraction z1 of a very long time T during which the system will be in the state Z1; if, for large T, the ratio z1/T tends toward a limiting value, then we call this the probability W1 of the first state, etc. Thus, the probability W of a state is conceived as the latter's temporal frequency in a system left to itself an infinitely long time.
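Written out as a formula, this definition reads as follows (an editorial restatement in modern limit notation; the index ν running over the states is not in the original text):

\[
W_{\nu} = \lim_{T \to \infty} \frac{z_{\nu}}{T}, \qquad \nu = 1, \dots, l,
\]

where zν denotes the total time, within the long interval T, that the system spends in the state Zν.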
From this point of view, it is noteworthy that in the overwhelming majority of cases, if one starts out from a specific initial state, there will exist a neighboring state which the system, if left to itself an infinitely long time, assumes more often than it does the other states.
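As a purely editorial illustration of probability understood as temporal frequency, the sketch below follows a hypothetical three-state system with arbitrarily chosen transition probabilities and records the fraction of time spent in each state; nothing in it, from the transition matrix to the function name, comes from the original discussion.

    import random

    # Hypothetical three-state system Z1, Z2, Z3; the transition
    # probabilities below are invented for illustration only.
    P = [
        [0.90, 0.08, 0.02],   # from Z1
        [0.05, 0.80, 0.15],   # from Z2
        [0.01, 0.04, 0.95],   # from Z3
    ]

    def time_fractions(steps, start=0, seed=0):
        """Follow the system for `steps` steps and return the fraction
        of time spent in each state (its temporal frequency)."""
        rng = random.Random(seed)
        counts = [0, 0, 0]
        state = start
        for _ in range(steps):
            counts[state] += 1
            # choose the next state according to the current row of P
            state = rng.choices(range(3), weights=P[state])[0]
        return [c / steps for c in counts]

    for T in (10_000, 100_000, 1_000_000):
        print(T, [round(w, 4) for w in time_fractions(T)])

For long runs the printed fractions settle toward fixed values that no longer depend on the starting state, which is the sense in which the temporal frequency can be regarded as a property of the state itself.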
But if we forgo such a physical definition of W, the statement that in the overwhelming majority of cases the system passes from one state to a state of greater probability is a statement devoid of meaning, or, if one has set W equal to an arbitrarily chosen mathematical expression, an arbitrary assertion.
If W is defined in the manner indicated, then it follows from the very definition that a system left to itself in an arbitrary state (and isolated from without) must assume, in the majority of cases, successive states of ever greater probabilities, and from this it follows that W and the entropy S are connected by Boltzmann's equation S = k log W + const. This follows from the circumstance that the probability W, insofar as the character of a unidirectional flow of events is maintained at all, must always grow with time, and that there cannot be a function independent of S that has this property at the same time as S. That the connection between S and W is exactly the one given in Boltzmann's equation follows from the relations Stotal = Σ S and Wtotal = Π W, which hold for the entropy and probability of states of systems composed of a number of subsystems.
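The step from these two composition laws to the logarithmic form can be made explicit as follows (an editorial sketch, not part of the original argument). If the entropy of each system is assumed to depend on its probability alone, S = f(W), with the same function f for the composite system and its parts, then the two relations require

\[
f(W_1 W_2) = f(W_1) + f(W_2)
\]

for all admissible W1 and W2, and the continuous solutions of this functional equation are exactly f(W) = k log W. The additive constant in Boltzmann's equation corresponds to the arbitrary normalization of W, since multiplying W by a fixed factor merely shifts S by a constant.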
If one defines W in the manner indicated, as temporal frequency, then Boltzmann's equation immediately contains a physical statement. The equation contains a relation between quantities that are observable in principle, i.e., the equation is either correct or