The Universe is flat!

Covering the basics
At the beginning of it all, there are the first principles and axioms.
These are used to define the basic equations subsequently verified by experimental data.
In some cases, experimental data and first principles go hand in hand with the reality we observe and experiment upon.
When this happens, we get things that work, like radio communication, power plants, heavier-than-air machines that fly, satellites orbiting the Earth or sent to other planetary bodies in our solar system, and so on and so forth.

Of course it wasn't always like that. Not too long ago we believed the Earth was flat, or that everything revolved around it, since God had placed us, Mankind, at the center of the Universe.
These were the axioms back then, and sure enough we were able to predict the trajectories of the other planets in our solar system by means of extremely complicated equations (except for the Moon and the Sun, of course).

Eventually new axioms took over, placing the Sun at the center and every other planet (other than the Moon) in orbit around it. This is when the equations of planetary positions became dramatically simpler than under the previous calculation methods.

The transition between the old ways and the new ones wasn’t exactly a walk in the park for Galileo and other heretics of that time. A lot of opposition and commotion ensued but ultimately we got rid of the old ideas and embraced the new axioms.

Occam’s razor
We eventually moved on because of Occam's razor, which is yet another way of saying that humans are lazy, and above all their minds are, so they are always on the lookout for ways to make complicated things easier for themselves, in the domain of math and physics as much as anywhere else.

Now let's be clear: there is no such thing as an ultimate physical truth out there, but there are things and models that are convenient and workable enough for us to build machines and improve the overall quality of life, hence they are repeated and studied over and over, a prized and valuable possession of human culture.

In defense of the old ways, we are not 100% sure whether the Earth is round, or flat, or a cube for that matter, but we do know that to put and keep communication satellites in orbit we'd rather use spherical gravitational equations and Newtonian and Galilean First Principle formulations.

Surely we could derive and use flat-Earth equations to achieve the same feats, and I can assure you the complexity of said equations would be challenging enough, and so prone to errors and mishaps, as to make rocket science and space exploration impractical by all means.

In the end, science progresses because of the mental laziness of scientists and mathematicians, and also because the progress built upon previous science eventually hits a wall of discrepancies and inconsistencies: the predictability of experimental data goes off the chart, and we are no longer able to innovate or build newer and smarter machines.

First Principles and Second Principles
As an example, there is a big difference between stating that the Earth orbits the Sun with a period of 365.25 days (an empirical second principle) and stating that the Earth moves in accordance with the law of gravity, which subjects it to a centripetal force equal to F = G * m_e * m_s / (r_es)^2, where m_e and m_s are the masses of the Earth and the Sun and r_es is the distance between them (a First Principle).

The second formulation is not just a DATUM (a matter-of-fact observation); it opens up a whole new dimension of possibilities in terms of ballistics, placing satellites in orbit, sending men to the Moon, etc., all sorts of things that the original plain (empirical DATUM) observation could never have achieved.
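
To make the difference concrete, here is a minimal sketch in Python (the constants are the standard published values; the 400 km altitude is just an illustrative choice) of the kind of calculation the First Principle formulation unlocks: the speed and period of a low Earth orbit fall straight out of F = G * m_e * m_s / (r_es)^2, something the bare empirical DATUM could never provide.

```python
import math

# Minimal sketch: circular-orbit speed and period from Newton's force law.
G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24         # mass of the Earth, kg
R_ORBIT = 6.371e6 + 400e3  # Earth's radius plus a 400 km altitude, m

# Setting gravity equal to the centripetal force, G*M*m/r^2 = m*v^2/r,
# gives the circular orbital speed and, from it, the period.
v = math.sqrt(G * M_EARTH / R_ORBIT)   # orbital speed, m/s
T = 2 * math.pi * R_ORBIT / v          # orbital period, s

print(f"Orbital speed at 400 km: {v / 1000:.2f} km/s")  # ~7.67 km/s
print(f"Orbital period: {T / 60:.1f} minutes")          # ~92.4 minutes
```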

Even though the gravitational law is a big improvement over medieval astronomy, it still has lots of arbitrary parameters, such as the constant (?) G and the underlying assumption that gravitational and inertial mass are one and the same at all times, but it still helps unlock new technological possibilities. Eventually scientists break the formulas down even further and reduce the number of DATUM parameters used to fit observations, until only one or two survive.

In this regard, the cosmological standard model and the nuclear scattering models are superb examples of models based completely on arbitrary data-fit parameters that do match the observations but cannot predict any new experiment or propose technological applications.

Moving on with new theories
We have established above the basic hierarchy of the mind mastering the underlying fabric of reality through First Principles and subsequently imposing its will over matter through technology, making reality do what the human Mind wants.

What happens when the experimental data disagrees with the theory?
In this case a lot of funny things happen to the ingenious mind of physicists.
On one side, older minds (we can call them "the old guard") will try to salvage and prop up whatever formulation or principle can be salvaged, through adjustment factors conjured ad hoc into the original First Principle equations, in order to data-fit the results and reconcile them with the altered formulas.

This is, again, the efficient, lazy mind developing processes to avoid having to learn a new physics, assuming a new Grand Unifying Theory is at hand at all.
It is also a matter of pride, because physicists do not necessarily like to admit they were wrong, or that they have squandered billions of dollars on research and experiments that have failed to produce any significant application or advance in living standards for decades.

When your original First Principles have reached a technological limit to advances, and the Old Guard does not want to look out at novel formulations and vistas, you start getting all kinds of madness comparable to flat-Earth beliefs.
Some well-informed observers claim these scientific scams will come to be referred to as the Flat Universe beliefs, in the sense that we have indeed ascertained the roundness of the Earth, but our current mainstream academia is anything but adequate to provide a solid understanding of the true working principles of both our micro and macro cosmos.

Eventually, when one day the New Guard comes about and a deeper and simpler understanding of New First Principles is established, all the science of our last 60 to 70 years will indeed look as though it was trying to teach us that "the Universe is flat"!
What follows is a list of the most blatant scientific scams that might eventually go down in scientific history as madness.

Imaginary particles part 1: Quarks
Firstly we must recognize that quarks provide a simple and effective formulation and organization of the particle zoo we observe, a sort of Mendeleev table for the subatomic world. They do have a utility in explaining and organizing ideas and in teaching students.

Aside from that, the sustained belief that quarks are the fundamental constituents of matter is somewhat of an embarrassment.
First, they have never been spotted in the wild, not even in the wildest and highest-energy experiments conceived thus far.

Also embarrassing is the fact that, in theory, they should be common enough to be observed all the time: they are crammed in triplets inside protons and neutrons, and as we know, particles have a certain uncertainty associated with their position within a potential barrier, never mind staying in the same spot in a triplet configuration for prolonged periods of time, or forever, given that to date we have never observed a proton naturally decaying into its basic quark constituents.

Protons and neutrons are made out of different quark triplets, one shape-shifting into the other by the addition or subtraction of an electron.

Nevertheless, a great amount of literature and mathematical brain power, the conjuring of gluons, etc., is consumed in the idle task of explaining why the decay probability of stable hadrons is zero, or otherwise incredibly low, in blatant defiance of Heisenberg's indeterminacy principle, and the entire discussion closely resembles the medieval debate over how many angels can dance on the head of a pin.
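
To put a number on the uncertainty argument invoked above, here is a back-of-the-envelope sketch in Python of the Heisenberg estimate: confining any particle within a proton-sized region forces a minimum momentum uncertainty on it (the proton charge radius used below is the standard measured value).

```python
# Back-of-the-envelope Heisenberg estimate for a particle confined
# inside a proton-sized region.
HBAR_C = 197.327        # hbar * c, in MeV * fm
PROTON_RADIUS = 0.84    # proton charge radius, in fm

# Heisenberg: delta_x * delta_p >= hbar / 2, so with delta_x set to the
# proton radius the minimum momentum uncertainty (times c, in MeV) is:
delta_p_c = HBAR_C / (2 * PROTON_RADIUS)
print(f"Minimum momentum uncertainty: ~{delta_p_c:.0f} MeV/c")  # ~117 MeV/c
```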

It is safe for us to state that to date (AD 2024) no useful or otherwise impactful application has been derived from the study of quarks, other than the pocket-lining of the professors and academics who derive their living from this line of research; and yet academia has been so deeply invested in this theory that it would be an embarrassment to rewrite the schoolbooks and Wikipedia pages to explain that the conjuring of quarks into physical reality was just a fad.

Imaginary particles part 2: Neutrinos
These particles are of exceptional interest for explaining the mental process through which correction factors, created to prop up inadequate theories, eventually become New Particles or Forces of Nature.
Back in engineering school they taught us to rack up and group together all the secondary microscopic processes and inefficiencies causing deviations from predicted behaviors under the umbrella of a general efficiency or correction factor, called by the teacher an "ignorance factor" or, less ceremoniously among the students, a "fudge factor".

The issue with elementary particles is that they are perfect and leave no room for fudge factors and approximations, so whenever a particle physics experiment shows without a doubt that a small amount of mass is amiss, or that the magnetic moment of the by-products doesn't add up, this is taken to mean not that our particle theory is incomplete, but that the error factor of the experiment can be grouped under a generic fudge factor, which eventually was named the neutrino.

In fact this wonder particle is a miracle worker of mainstream academia, as it comes in all sorts of shapes, sizes, energies, colors, masses and strangenesses, it can change its properties in flight, and so on and so forth, in such a way that a broad swath of experimental results can be corrected and the data fitted by the "neutrino" messing with the nuclear process being observed.

Since academia has established unanimously (?) that neutrinos are no fudge factors but real and true particles, the challenge is on to properly detect them.
To be fair, this particle has never quite been detected in the wild; or, more precisely, its detections have been so rare, and so potentially due to other secondary factors in the detection process, that many in academia itself have serious doubts about its existence in our physical reality.

One objection to neutrino detection is the possibility that what is detected are secondary nuclear decays in the sensor materials, like a deuteron decaying into monoatomic hydrogen or the like; other theories call for micro-anisotropies in the aetheric substratum, as will be discussed later in this article.

Still, a great many tens of thousands of pages and much mental power have been invested in the neutrino conjecture, to convert the error factor of our theory into a particle that data-fits the experimental results, so the older theory can be saved without too many revisions.

Admitting that there is something more fundamentally wrong with our theory would force many of the Old Guard into early retirement; many textbooks would have to be completely re-written, and many Nobel Prizes would have to be recalled, or, more importantly, the prestige of the Nobel Prize itself would be questioned, since other more valid theories were around while the Old Guard was still bagging these prizes.

It is not polite, however, to dismantle a theory without proposing something new, so to maintain a constructive debate I shall relaunch the theories discussed in other posts of ours, which have been hiding in plain sight of the scientific community for a few decades.

Nuclear fusion for dummies
We will start with the simplest type of fusion, that of an electron and a proton coming together to make a neutron. This is a pretty common process, it happens all the time inside stars, so when you bask in the summer Sun you should be thankful for it!

In the conventional, mainstream physics community, the electron and proton simply magic themselves out of this physical reality, and in their place a new particle, the neutron, is conjured into existence.
This byproduct particle has a total mass higher than the sum of the two starting masses, it has a deviant magnetic moment compared to the sum of its original constituents, and the fusion process also leaves some pocket-change thermal/kinetic energy that makes the Sun hot, which is what scientists want to use to power steam turbines here on Earth through controlled nuclear fusion.

Unless this neutron is bound within a stable nucleus, sooner or later it is bound to magic itself back into an electron and a proton, plus the excess mass it gained seemingly out of nowhere during its formation, released as kinetic energy of the expelled electron, plus a "neutrino", which is needed to balance out the missing magnetic moment from the byproduct particles generated in the neutron break-up.

This is the current mainstream representation of a fundamental physical process, and it suffers from some hopeless flaws.
Firstly, the positive binding energy between the electron and proton making up the neutron mass difference of 0.782 MeV is a big unresolved issue, since current equations cannot account for it, not just for the neutron but also for higher fusion processes with heavier nuclei at large.
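
The 0.782 MeV figure is straightforward bookkeeping with the measured rest masses; a minimal sketch in Python:

```python
# Neutron mass excess over its proposed constituents, using the
# measured rest masses in MeV/c^2.
M_NEUTRON = 939.565    # MeV/c^2
M_PROTON = 938.272     # MeV/c^2
M_ELECTRON = 0.511     # MeV/c^2

excess = M_NEUTRON - (M_PROTON + M_ELECTRON)
print(f"Neutron mass excess over p + e: {excess:.3f} MeV")  # ~0.782 MeV
```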

Secondly, the magnetic moment does not add up to begin with, nor does it when the neutron goes back into its fundamental constituents (electron and proton); but on the way back at least we have the "neutrino" showing up in the equations to save the day and fit the observed experiments, even though it has barely been observed in the experiments designed to detect it.

Thirdly, the scientific community itself more or less openly debates whether the neutron is in fact a fundamental particle of its own (and yet it decays?) or is in fact the sum of two TRUE elementary particles, the electron and the proton.
Those in favor of the neutron being a particle of its own will find solace in this assumption, as it helps save the current model: we can just say that sometimes the sum of two things can get you more than what you bargained for. Just accept that and be thankful, because you get to tan in the summer!

Those who believe the neutron is a composite particle made of an electron and a proton are stuck trying to explain the excess bits and bobs missing from the equations but present in reality.

Even then, as of today (AD 2024), stars are pouring tons of energy into existence out of nothing all the time, whether as thermal energy or as the excess mass of the fused nuclei. These big free-energy fountains called stars are a bit of an embarrassment that no one in the Old Guard can explain out of First Principle equations.

Explaining the Neutron for real
For all the earthlings who do not know this, the neutron is a prized particle in a number of scientific experiments and pretty useful in nuclear fission reactors, but it is also fairly expensive to produce, requiring things like heavily radioactive materials, thick steel containers with military escorts to the site of use and super-expensive disposal procedures, or sometimes even multimillion-dollar particle accelerators booking out their produced neutrons and experiments to the highest bidder, with waiting lists as long as Santa's beard…

The explanation of all neutron properties, including the excess mass and the magnetic anomaly, was worked out decades ago, courtesy of hadronic mechanics, and this eventually led to some true machines with true impact on human science and technological welfare, such as the possibility of synthesizing neutrons at the flick of a switch from some plain hydrogen gas at a few psig, thus dramatically reducing production costs and safety concerns, because the neutrons are stored in the form of harmless hydrogen gas and only produced when the operator needs them.

In this picture we show how the electron and proton first align themselves along the lines of a strong magnetic field produced locally by nearby charged particles inside the plasma gas of a star or of a suitably designed hadronic reactor.
This makes the two particles approach along their polar axis rather than on their magnetic equatorial plane, which is what normally prevents the two from coming too close and fusing spontaneously.

Once the electron and proton trip into each other's magnetic poles, they fuse into a state of partial mutual penetration; but since the electron is much lighter than the proton, the electron will also orbit around the proton's axis, a bit like the Moon does around the Earth, and this additional orbital moment and charge movement accounts for the missing magnetic moment of the neutron.

This theory is revolutionary because it explains the neutron's properties purely out of First Principle axioms, and not through adjustment data-fit parameters bolted on to fix the formulas and chase experimental results.
Even more revolutionary is the depiction of particles as spheres with a volume and a surface, with the possibility of undergoing DEFORMATIONS, as opposed to the immutable point-like entities of the Old Guard, for whom this is blasphemy.

Here the total magnetic moment of the neutron is the sum of that of the proton, that of the electron, and the orbital moment of the electron spinning around the proton in a state of partial mutual penetration, plus a small component given by the fact that a proton and electron in mutual penetration undergo a small deformation from their sphericity (blasphemy according to the Old Guard!), which induces a small deviation from their otherwise ground-state spin in perfect vacuum.
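
To make that bookkeeping explicit, here is a minimal sketch in Python of how large the electron's orbital contribution would have to be for this decomposition to close, using the measured moments expressed in nuclear magnetons; the small deformation term discussed above is neglected, so the orbital term is computed simply as the balancing residual.

```python
# Magnetic-moment bookkeeping for the decomposition described above:
# mu_neutron = mu_proton + mu_electron + mu_orbital (+ small deformation
# term, neglected here). All values in nuclear magnetons (mu_N).
MU_NEUTRON = -1.913     # measured neutron moment
MU_PROTON = +2.793      # measured proton moment
MU_ELECTRON = -1838.28  # electron moment (~ -1.00116 Bohr magnetons)

# The orbital contribution required to balance the books:
mu_orbital = MU_NEUTRON - (MU_PROTON + MU_ELECTRON)
print(f"Required orbital contribution: {mu_orbital:+.2f} mu_N")  # ~ +1833.57 mu_N
```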

Another interesting feature of the theory is that if on one side we have the Universe, or Aether, pouring mass and pulse energy into this physical reality, then somewhere else the Universe must be pouring out a similar amount of anti-mass and negative energy, to balance a total universe state of ZERO total mass and ZERO total energy, thus providing completely novel cosmological vistas and explanations.

Imaginary stuff part 3: The case for dark matter, dark energy and the lack thereof
One of the most troubling developments of mainstream academia is the conjuring of Dark Matter and Dark Energy to explain the senseless astronomical observations of today’s astrophysics.

Since the light we see from faraway stars is redshifted the more distant a star or galaxy is from Earth, AND the same must be true for any other observer elsewhere in the Universe (unless Earth is a statistical oddity sitting right at the center of creation, like in the medieval ages!), AND the redshift can only be explained by the emitting stars speeding away from the observer at ever greater speeds with distance, THEN it is implied that every galaxy is exponentially expanding away from every other point of the Universe, with the subsequent breakdown of basic symmetries and geometries, because the velocity vectors simply cannot add up or make any sense at all.
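
For reference, the mainstream bookkeeping behind that chain of reasoning is Hubble's law; here is a minimal sketch in Python, assuming the commonly quoted H0 of about 70 km/s per megaparsec and the low-redshift approximation z ≈ v/c:

```python
# Hubble's law, v = H0 * d, with the low-redshift approximation z ~ v/c.
H0 = 70.0           # Hubble constant, km/s per megaparsec (commonly quoted)
C = 299_792.458     # speed of light, km/s

for d_mpc in (10, 100, 1000):   # sample distances, megaparsecs
    v = H0 * d_mpc              # recession velocity, km/s
    z = v / C                   # approximate redshift
    print(f"d = {d_mpc:4d} Mpc -> v = {v:8.0f} km/s, z ~ {z:.4f}")
```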

In order to reconcile this bizarre fact, we subsequently conjure the existence of something we cannot observe or experiment upon here on Earth (oddly similar to the neutrino story above), and we say that all is OK and space is stretching away, by means of mathematical virtuosities called Dark Matter and Dark Energy.

The Earth sits at the center of this pin-cushion artifact, and the pins are galaxies speeding away from us. But every pin is also an observer and should see the same thing: the other galaxies on its sphere of observation speeding away from it, radially and exponentially.
Then everything is speeding away from everything else, everywhere, in every direction; but the velocity vectors cannot add up like this and still make geometrical sense. Something fundamental is missing in our equations and observations.
Alternatively, the Earth is just a statistical oddity truly sitting at the center of creation like in the medieval ages, since God himself placed it there in the middle of Everything, and for no Einsteinian reason at all gravity just so happens to reverse its sign above certain distances because of… uhm… Dark Matter or whatever.

At the beginning even Einstein was baffled by this, and he had to introduce the Cosmological Constant out of thin air, as no first-principle theory or experiment could explain the why of this apparent drift of galaxies away from everything, everywhere. But then, it was a simple term, and a constant, so astrophysicists could limp along with a force that works backward compared to gravity and no other experiment with which to understand it.

Unfortunately, more recent measurements, like the differential redshifts of light coming from the edges and centers of galaxies, or from quasars, suggest the presence of even more complex clusters of dark energy and matter surrounding galaxies in all kinds of kinky ways, with even greater mathematical challenges in reconciling these bizarre results.

This has some deep implications within the scientific community because, well, it throws Einstein and his wobbly cosmological constant under the bus: we now have not just a force of gravity that works completely backward, but one that also likes to cluster in ever more complex ways and play tricks that are far from constant.

So once again physicists have painted themselves into a corner where they cannot explain the observations, and to data-fit the experiments they now need to demolish about a century of tradition and the Nobel Prizes associated with it, with no creative new experiments to back the new formulas up.
The patch is worse than the tear.

The tired light comes to the rescue
It is not constructive to point at the problem without offering a solution.
This simple experiment, and the subsequent paper by the usual suspects, was carried out a while back and is available to true truth seekers out there.
It throws away all the dark matter and dark energy nonsense without impacting Einstein's relativities too much; or, better said, it salvages the core stuff that works while discarding some of the things that won't work in the New Physics paradigm shift.

The redshift (or even blueshift) of light can be caused by the relative speed of the emitting source, but that is not the only way light can shift its frequency and energy.
Back in the day there was the tired light theory, more recently quantified and explained by means of First Principle axioms, with the prediction that photons can "shed" and trade some of their energy whenever they are not travelling in full vacuum.
Depending on the density and temperature of the medium and the distance travelled, a photon can lose some of its energy (an isoredshift) to the nearby atoms of the medium it travels through, or gain some energy from them (an isoblueshift).

Indeed, the stars and galaxies out there do have some relative speed compared to us, which does impinge a small amount of red and blue shift on the light we detect; but the biggest amount of redshift comes from photons shedding some of their energy to the rarefied gasses of the interstellar medium, and they do so more as the distance increases; hence the further away the star, the more the light gets tired, having lost energy to interstellar gasses during the trip.
In this picture the cosmological constant would be nothing but the integral of the average interstellar gas density and temperature, causing the light to tire at a certain rate per light year travelled.
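
A minimal sketch in Python of that integral picture, assuming, purely for illustration, a constant fractional energy-loss rate per light year; the rate constant below is hypothetical, chosen only so the output lands near Hubble-like redshift values:

```python
import math

# Tired-light bookkeeping: a constant fractional loss rate kappa gives
# E(d) = E0 * exp(-kappa * d), and the apparent redshift is then
# z = E0 / E(d) - 1 = exp(kappa * d) - 1.
KAPPA = 7.2e-11   # hypothetical loss rate per light year, illustration only

for d_ly in (1e6, 1e8, 1e9):            # sample distances, light years
    z = math.exp(KAPPA * d_ly) - 1.0    # apparent redshift from energy loss
    print(f"d = {d_ly:.0e} ly -> z ~ {z:.5f}")
```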

You don’t believe it?
Then take a 50 ft straight pipe and put a sight glass on each end.
On one side you put a few-watt diode beaming its light through the first sight glass and on to the other end of the pipe.
On the other side you put a spectrograph to measure the frequency of the incoming light emitted by the diode at the opposite end of the pipe.

Now if you change the air density (pressure) and temperature inside this pipe, a fun thing will happen: the light from the diode will redshift or blueshift.
If this experiment is indeed true, then it is safe to assume that Dark Matter and Dark Energy were just figments of the Old Guard's imagination, since we now have a more comprehensive theory capable of explaining the red and blue shifts measured out there, in a very predictable and measurable First Principle way and in line with what is being observed.

Imaginary forces part 4: The valence bond
As we saw before in the case of the bound state between proton and electron, there is another well-kept secret in quantum chemistry, namely the fact that two electrons, which electrically repel each other quite strongly, can nevertheless stick together even more strongly within the valence bond, with few First Principle axioms to quantify this fact.
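
To give a feel for how strong that repulsion is, here is a back-of-the-envelope sketch in Python of the Coulomb energy between two electrons held at a typical bond length (the 0.74 Å value is the H2 bond length):

```python
# Coulomb repulsion between two electrons at a typical bond length.
COULOMB_CONST = 14.40   # e^2 / (4 * pi * eps0), in eV * angstrom
BOND_LENGTH = 0.74      # H2 bond length, in angstroms

repulsion = COULOMB_CONST / BOND_LENGTH
print(f"Electron-electron repulsion at 0.74 A: ~{repulsion:.1f} eV")
# ~19 eV, several times the ~4.5 eV binding energy of the H2 molecule
```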

Once again we need to embrace something new and see these two electrons as two spheres in a state of partial mutual penetration. This so-called "surface tension" between spherical particles is what gives rise to a whole new array of non-Hamiltonian (non-conservative) forces and energies, as briefly explained before for the neutron synthesis.

In general, this figure representing two electrons in a valence bond will cause mainstream physicists to throw a serious fit, because to them electrons are points and not spheres.

Politics in academia
The notion that particles should be points and not spheres is quite a fundamental axiom among the Old Guard, since admitting otherwise would require quite a few books to be rewritten and a lot of old, stale theories to be discarded.

Theories suggesting otherwise, AKA the New Guard, are currently relegated to the purgatory of fringe science; but even then, experiments do occasionally stumble upon apparently inexplicable things that call for a deep revision of our current axioms.

A most interesting case is that of the Quebec-Berkeley-Bonn experiment back in the 1980s, which managed to alter the otherwise immutable magnetic moment of a neutron by impinging a deformation on its otherwise perfect point-like sphericity.
The group of researchers submitted their experimental results and paper to the APS journals for publication, and indeed it was published, about 1.5 years after its submission, but with a note at the end saying that the measurement had been disproved by another, more recent paper from the Los Alamos group (Hardekopf, Keaton, Lisowski, Vesser, Phys. Rev. C 25, 1090 (1982)).

Such a long publication time is quite suspect in general; even more suspect are the fact that the disproving paper was published in the same issue, and the fact that the journal editor automatically assumed and judged that the former paper must be wrong and the latter right.

In a nutshell:

1) We have one experiment with a quite astonishing result that highlights some fundamental issues with our current physics.

2) It is not published immediately but with a long delay.

3) Meanwhile, thank God, another group of scientists (?) rushes to repeat the experiment and disprove the results of the first one, just in time for the publication of the first paper.
What perfect timing!

4) The APS journal publishes both papers, the one that was quarantined for 1.5 years and the more recent one disproving it. The editor then elevates himself to referee and establishes that the first one must be wrong, because it was disproved by someone else.

5) So nothing to see here folks, move along, all is well with our current physics.

In an ideal world, the dispute would have been resolved by repeating the experiments, with the side conditions carefully examined by both groups so as to converge toward the Truth of the matter, which is what the QBB group proposed to the Los Alamos scientists.

Regrettably, the Los Alamos group had more pressing matters to attend to, and for them the issue was settled, as their results were in line with current theories.
Nothing to see here, no need for deeper investigations on such a fundamental matter…

The QBB group eventually had their paper, and subsequent experiments re-confirming the original results, published in Europe and other countries.
So depending on where you live, the neutron could be a point or the neutron could be a sphere; the Earth is flat, but in certain regions of the world it could be round, like a neutron.

The Grand Unifying Theory
The more fundamental problem of current physics is that its two pillars, quantum mechanics and Einsteinian relativity, have profoundly incompatible mathematics, which means the current experimental and theoretical landscape is a patchwork of theories and formulas of limited general applicability.

Current physics, made of a patchwork of limited-applicability formulas for limited experimental results, is missing the greater picture of Reality.

As a result, the experimental results lead with the data while the formulas lag in predictability; they can no longer see or understand what exactly happens in the real world, and therefore no new machines or applications can be derived from these unfruitful lines of research.

The pattern goes a bit like this:

1) Science makes some useful observations about the world and Nature out there.

2) New machines and technologies are built on these scientific observations and First Principle formulations.

3) Eventually science and technology reach maturity and there are new phenomena and effects that are reported but unexplained by current theories.

4) The Old Guard clings to the old theory and tries to salvage what it can, by censoring new lines of research or by data-fitting the old formulas with fudge factors that then become new, non-experimentable particles and forces of nature.

5) The human mind, being lazy at its core, eventually does develop a new physics that produces new machines, bringing simplicity, predictability and repeatability to the unorthodox scientific observations.

6) Eventually the Old Guard dies out and space becomes available for the New Guard to take over, and with it a new era of advances in science and technology…

…until the cycle repeats itself once again!