Does purpose arise from 'mindless math'? Humans are self-aware and aware of their surroundings,
thus conscious. The Darwinian Credo holds that consciousness emerges from increasing complexity.
The alternative is an inherently conscious, purposeful universe. How does one decide this issue?
The basis of physics is experience, so we analyze mind from this perspective.
Is the best path to understanding nature the continued explosive growth of physics,
or the careful pruning of false premises from over-specified models of reality?
I remove false premises from relativity
and discuss removing them from quantum mechanics.
A theory's equations are designed to model physical behavior that reflects the nature of physical reality.
Einstein's nonlinear gravity equation is 'linearized' in the 'weak field limit' by ignoring nonlinear terms.
This can be misinterpreted as affecting the nature of the field.
Linearization is a mathematical artifice that makes the equations easier to solve;
it has no effect on the physical nature of the field itself.
Thus it is false to say that the weak gravitational field is not self-interacting.
Nor is the weak gravitational field based on mass; the field equation is based on mass density.
These aspects of gravity are investigated by replacing curved space-time with mass density in flat space.
A novel quantum gravity relation is derived and related to quantum mechanics.
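The linearization step discussed above can be sketched in the standard textbook form (this is the conventional weak-field expansion, not the paper's own derivation): the metric is written as flat space plus a small perturbation, and terms of second order in h are dropped:

```latex
g_{\mu\nu} = \eta_{\mu\nu} + h_{\mu\nu}, \qquad |h_{\mu\nu}| \ll 1
% In Lorenz gauge, with the trace-reversed perturbation
% \bar{h}_{\mu\nu} = h_{\mu\nu} - \tfrac{1}{2}\eta_{\mu\nu} h :
\Box\, \bar{h}_{\mu\nu} = -\frac{16\pi G}{c^{4}}\, T_{\mu\nu}
```

Dropping the O(h²) self-interaction terms is a computational convenience; as the abstract argues, it does not by itself change the physical character of the field.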
The near-century-old Stern-Gerlach experiment played an important role in the philosophy of quantum mechanics.
Fifty years ago Bell drew drastic conclusions about the nature of reality based on a model of Stern-Gerlach,
yet most of the detailed analysis of spin in nonuniform magnetic fields has occurred post-Bell.
The recent focus on work in non-equilibrium thermodynamics has potential significance
for quantum mechanics, so we develop a spin-dynamical analysis of work in an inhomogeneous field.
A small-angle approximation analysis is performed.
We derive a novel Stern-Gerlach gradient-threshold relation and
a decay rate for precession in a nonuniform field.
The theory is compared to a quantum analysis and an experiment to test this theory is proposed.
Viewing Math and Physics as Korzybski's 'map' and 'territory',
we analyze their trust-worthiness. Maps derived from observations
of the real world bring eigenvalue-based measurement into question.
But what to do when the map logic conflicts with our physical intuition?
This is often resolved in favor of the non-intuitive, whether the relativity
of simultaneity or the non-locality of Bell's theorem.
The subtle nature of Bell's hidden constraints erasing the hidden variable
information is the basis of Bell's lack of trust in his intuition.
Bell oversimplified his model based on confusing a provisional precession
eigenvalue equation with Dirac's fundamental helicity eigenvalue equation.
I derive a local classical model based on energy-exchange physics that Bell
intentionally suppressed and I show that Bell's constraints determine whether
the model is local or non-local. The physical theory upon which the model is
based can be tested experimentally; if valid, Bell's claims of non-locality
will be proved wrong.
In "Quantum Spin and Local Reality" (QSLR) I show that Bell suppressed key
physical phenomena to arrive at his inequality. As a result Bell's conclusions
are incorrect — his model fails to match reality. Bell's defense is based on
quantum mechanical eigenvalue equations with reference to Dirac. I briefly
review some issues in the history of spin, analyze the non-relativistic
Stern-Gerlach eigenvalue equation and the relativistic Dirac equation, and
show their relevance to Bell.
Almost a century ago Stern-Gerlach laid important foundations for
quantum mechanics. Based on these, Bell formulated a model of local
hidden variables, which is supposed to describe "all possible ways"
in which classical systems can generate results, but Bell did not
consider one possibility in which classical behavior leads to quantum
results. Bell buried the key fact needed to challenge his logic: the
θ-dependence of two energy modes, rotation and deflection. An
Energy-Exchange theorem is presented and proved: if dθ/dt is
not equal to zero, the implied time-evolution will affect expectation
values, and the essentially classical mechanism yields the quantum
correlation -a·b. Analysis of the spin-component measurement
brings Bell's counterfactual logic into question. I show that Watson's
formal linking of time-evolution operator to measurement operation
addresses Bell's stated concerns about measurement in quantum
mechanics and produces the -a·b correlation. Our results,
though restricted to particle spin, have wider implications, including
relevance to the ontic versus epistemic issues currently debated in
the literature. The suggested formalism extends beyond Stern-Gerlach
to other quantum mechanical processes characterized by a 'jump' or
'collapse of the wave function'.
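For reference, the -a·b correlation cited above is the standard quantum prediction for spin measurements on a singlet pair along directions a and b. It can be checked numerically with a short sketch (NumPy, illustrative only; this is not the paper's formalism):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_along(n):
    """Spin operator sigma·n for a unit vector n."""
    return n[0] * sx + n[1] * sy + n[2] * sz

# Singlet state (|01> - |10>) / sqrt(2) in the computational basis
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def correlation(a, b):
    """Quantum expectation <singlet| (sigma·a) ⊗ (sigma·b) |singlet>."""
    op = np.kron(spin_along(a), spin_along(b))
    return np.real(singlet.conj() @ op @ singlet)

a = np.array([0.0, 0.0, 1.0])
b = np.array([np.sin(0.7), 0.0, np.cos(0.7)])
print(correlation(a, b), -np.dot(a, b))  # the two values agree: correlation is -a·b
```

For any pair of unit vectors the computed expectation equals -a·b, the correlation that the abstract claims its classical energy-exchange mechanism reproduces.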
History has shown that humanity works best when freedom is maximized.
A topic like "How Should Humanity Steer the Future" requires extreme
idealization, and something resembling a statistical mechanics approach,
leading to a thermodynamic model. Thermodynamics does work only when
a system has free energy. We link these different concepts of freedom
in this essay. Statistical mechanics treats large numbers of elements,
N, and total energy, E. The energy of labor, if not completely controlled
by force, is controlled by money, so we will measure energy in dollars.
This applies generally to the electrical energy one buys from a power
company or to physical work one does on a day-to-day basis. "Steer" is
a control concept implying a goal, therefore we formulate two idealized
goals and analyze their implications. To be relevant to reality, we address
two real goals, argued every day in the world, but simplified to allow analysis.
The question 'It from Bit' or vice versa is the question of what is real.
The answer is a matter of belief, so I analyze why physicists believe theories,
including QED and QCD, and follow with the simplest possible theory of the real world.
I focus on the fact that gravity is real, and discuss a new approach to non-linearity.
Because Wheeler's 'It from Bit' is tied to his Participatory Universe I explore that
topic and a theory of information based on gravity.
Which of our basic physical assumptions are wrong? Superposition of quantum
states and collapse of the wave function are significant assumptions.
We address the physics of the wave function, the wave function as probability,
the extent of the wave function, quantum correlations, Bell's theorem,
spaces in which wave functions are formulated, and discuss recent experiments
that support our interpretation.
Is Reality Analog or Digital? Analog and digital mathematical treatments can be
shown to be equivalent, so the answer does not lie in math but in physics. At
root is the nature of particles and fields. The simplest
possible physical model, one field, will be analyzed and physical
experiments proposed to show an analog reality with digital consequences. There
are implications for the view of reality currently associated with entanglement
and violation of Bell's inequality.
Because every physical theory
assumes something, that basic assumption will determine what is
ultimately possible in that physics. The assumed thing itself will likely
be unexplained. This essay will assume one thing, a primordial field,
to explain current physics and its many current mysteries. The derivation of
physics from this entity is surprisingly straightforward and amazingly broad in scope.
FPGAs and microprocessors are more similar than you may think.
Here's a primer on how to program an FPGA and some reasons why you'd want to.
Small processors are by far the largest-selling class of computers
and form the basis of many embedded systems. The
first single-chip microprocessors contained approximately
10,000 gates of logic and 10,000 bits of memory. Today, field
programmable gate arrays (FPGAs) provide single chips
approaching 10 million gates of logic and 10 million bits of memory...
FPGAs enable everyone to be a chip designer.
This installment shows how to design the bus interface for a generic peripheral chip.
When designing with an embedded microprocessor, you always
have to take into account, if not begin with, the actual pinout
of the device. Each pin on a given microprocessor is uniquely
defined by the manufacturer and must be used in a specific
manner to achieve a specific function. Part of learning to
design with embedded processors is learning the pin definitions.
In contrast, field programmable gate array (FPGA) devices come to the
design with pins completely undefined (except for power and ground). You have
to define the FPGA's pins yourself. This gives you incredible flexibility but also
forces you to think through the use of each pin...
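As an illustration of the kind of decision the article walks through, a peripheral's bus interface largely reduces to address decoding: assert a chip-select when the address bus falls inside the peripheral's assigned window. A minimal behavioral sketch in Python (the base address and window size are hypothetical, not from the article):

```python
PERIPH_BASE = 0x4000  # hypothetical base address of the peripheral's register window
PERIPH_SIZE = 0x0100  # hypothetical 256-byte window

def chip_select(addr):
    """Address decoder: drive CS low (active) when addr is inside the window."""
    active = PERIPH_BASE <= addr < PERIPH_BASE + PERIPH_SIZE
    return 0 if active else 1  # active-low chip select, as on most peripheral buses

print(chip_select(0x4010))  # 0: inside the window, peripheral selected
print(chip_select(0x8000))  # 1: outside the window, peripheral idle
```

In an FPGA this comparison becomes a few gates of combinational logic on the address pins you defined yourself, which is exactly the flexibility (and obligation) described above.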
After decades of experimental validation, Bell's Theorem has changed the
ontological status of local realism in physics. But recent theoretical
and experimental results present a new challenge to Bell's analysis.
A geometric-algebraic challenge claims that Bell makes a topological mistake,
while 'weak measurement' results challenge the Copenhagen Interpretation.
We review these results and analyze the physics of Bell's Theorem,
embedding Bell's inequality in a truth statement and showing it to be falsified.
At the root of reality is the nature of particles and fields.
The simplest possible physical model, one field, the gravito-magnetic field,
self-interacts to produce mass and charge and hence the electromagnetic field.
This paper focuses on the interaction of this gravito-magnetic field
with the electromagnetic field.