
The Fiction Called: Cause and Effect.


genep


The greatest geniuses in Science tell us with absolute certainty, mostly on the strength of their Gravity, that at least 95% of the Universe is missing in the Universe's Antimatter, Dark Matter and Black Holes. They have to tell us this because they don't know that literally everything besides Gravity keeps screaming at us that much more than 99% of the Universe is scientifically Hidden, Really.

 

Even if Science's 10% of the Universe is Unhidden ... this means that Science's "Cause and Effect" in this 10% has to be fiction: because scientists assume that the 90+% of the Universe that is Hidden does not influence the <10% of the Universe that appears Unhidden. ... it is like saying that the operating system of a computer has nothing to do with what appears on the computer's monitor... it is like saying that the 90% of the iceberg under water has nothing to do with the 10% of the iceberg above water... it is like saying that Hidden (invisible) gravity has nothing to do with birds flying, planes falling and Universes expanding... it is like saying the waves on the ocean have nothing to do with the ocean....

 

Cause and Effect is especially fiction because on all its limitless levels and fathomless depths literally Nothing in the Universe is what it appears to be, be it Particle, Photon, Wave or probability-cloud... and even far more especially because Physics tells us, with the utmost certainty of its absolutely certain Uncertainty Principle, that on all the limitless levels and fathomless depths of the Universe the Observer -- and NOT the observed -- determines the Observation, be it Cause or Effect.

 

These same geniuses in Science however insist that their Cause and Effect in the <1% of the Universe that is Unhidden is more or less unaffected by the >99% of the Universe that is Hidden ... which makes them a little wiser than God and all his prophets, who have yet to discover that >99% of the Universe is REALLY missing, which leaves <1% for God to play with... so much less than 1% that it has to be exactly like god: Nothing but thoughts.

 

-- Wreally Reality


First of all, theories are not claims of "absolute certainty". Science is amenable to change.

 

Secondly, theories such as dark matter, dark energy, etc. come about from an unknown force acting upon visible matter. These things have a basis in observation.

 

It's important to make the distinction between the content of the universe and our knowledge of it. The oft-quoted figure that only about 4% of the universe is ordinary matter does not mean that we only have 4% knowledge of the universe. It simply means that this is the amount of EMR (electromagnetic radiation) information that we are able to process and gather at this point in time.

 

Overall though, I agree that our understanding of the universe is in its infancy. What a time to be alive! :cup:


Casimir effect, Lamb shift, Rabi vacuum oscillation, electron anomalous g-factor - explain each observed effect within your paradigm. Galaxy rotation curves with increasing radius and gravitational lensing - explain each observed effect within your paradigm. You cannot. You are whining hot air while others accurately model empirical observation with mathematics... mathematics that is accurately predictive of new observations.
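On the rotation-curve point specifically, here is a minimal back-of-the-envelope sketch in Python of why roughly flat curves at large radius point to unseen mass. The visible-mass figure and the 200 km/s flat speed are round numbers assumed for illustration, not fitted data.

import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 1.0e41  # assumed visible mass of a modest galaxy, kg
KPC = 3.086e19      # one kiloparsec in metres

def v_keplerian(r_m, m_kg=M_VISIBLE):
    """Circular speed if essentially all mass m_kg lies inside radius r_m."""
    return math.sqrt(G * m_kg / r_m)

def mass_for_flat_curve(r_m, v_flat=2.0e5):
    """Enclosed mass needed to sustain a flat curve at v_flat (m/s) out to r_m."""
    return v_flat ** 2 * r_m / G

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * KPC
    print(f"r = {r_kpc:>2} kpc: Keplerian v = {v_keplerian(r) / 1e3:5.0f} km/s, "
          f"mass needed for a flat 200 km/s curve = {mass_for_flat_curve(r):.1e} kg")

# The Keplerian speed falls as 1/sqrt(r), but observed curves stay roughly
# flat, which requires the enclosed mass to keep growing with radius -- the
# discrepancy that dark-matter (and alternative-gravity) models address.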

 

LAMBDA - Wilkinson Microwave Anisotropy Probe

 

"It doesn't matter how beautiful your theory is, it doesn't matter how smart you are. If it doesn't agree with experiment, it's wrong." Richard Feynman.


because scientists assume that the 90+% of the Universe that is Hidden does not influence the <10% of the Universe that appears Unhidden.
They do not assume this; they assume that the only influence is gravitational. There would be no point in assuming what you say, because it would remove the very purpose of assuming anything at all: a consequence that could be the cause of what is observed.

Genep does make a valid point. If we give credibility to all this mystery matter, then we have to include its impact. If we include its impact, then the cause and effect of existing relationships needs to be modified with a currently unknown variable, causing the rational to become irrational.

 

Let me give an analogy. We enter a child's room to see his toys spread out over the floor. We rationally conclude he has been playing. If we add the variable of the boogeyman in the closet, then who put what where? It is not clear-cut whether the child has been playing, or to what extent. To make things easier, we will leave the boogeyman in the closet for now. Even if he is coming out, we will ignore this until we understand his nature and can better predict his behavior. Then and only then will we tell everyone the child never was playing every day with all his toys. To cut to the chase, this implies that what we teach is going to become obsolete someday due to the boogeyman, yet we will pretend it reflects reality until we decide to change.


(homage paid to the Sergio Leone film)

"It doesn't matter how beautiful your theory is, it doesn't matter how smart you are. If it doesn't agree with experiment, it's wrong." Richard Feynman.
Or your assumptions that your experiments were based upon are wrong.
Scientific theories must make well-defined predictions that can be experimentally tested. These well-defined predictions must include the assumptions upon which these experiments are based. So, if these assumptions are wrong, the theory is wrong.

 

The non-scientific use of the term “theory” often lacks this requirement. However, a candidate scientific theory that fails to make well-defined predictions is not a scientific theory at all. Such pseudo-theories are described by another famous quotation of a well-known physicist whom UncleAl is fond of quoting:

"Das ist nicht nur nicht richtig, es ist nicht einmal falsch!"

(creative English translation: "That is not even good enough to be wrong")

- Wolfgang Pauli

In my experience, scientists and science enthusiasts have ample sympathy, and even affection, for beautiful yet wrong theories. Mathematicians often find them compelling and absorbing. “Not even good enough to be wrong” pretensions to theory, however, are in a profound sense ugly, and little loved by any of these folk.

 

If there aren’t testable predictions, it’s not science.


Scientific theories must make well-defined predictions that can be experimentally tested. These well-defined predictions must include the assumptions upon which these experiments are based. So, if these assumptions are wrong, the theory is wrong.

 

While this may be so from Newton's time on, the human race couldn't have gotten to that stage without developments and methodologies going back to the ancient Greeks. While science has been a continuous process of evaluation and re-evaluation of repeatable experiments (thought experiments as well as physical ones, for the philosophical roots), has anybody calculated an accuracy rate for science?

 

While this calculation would have to occur 'after the event' and be limited by the level of knowledge 'after' compared with 'before', there would have to be certain rules on how past knowledge would be quantified, as certain proportions of past knowledge will become fallacious, while other once-'fallacious' knowledge (sun around earth vs earth around sun, etc.) becomes real knowledge.

 

If we used a simplistic factor (as a start, disregarding incomplete/destroyed/hidden knowledge) of now-known false science versus historic scientific truths and calculated the ratio, can we expect this ratio to stay around the same as our 'scientific knowledge' expands?
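As a toy illustration of that 'simplistic factor' only: the counts below are hypothetical placeholders (there is no agreed way to enumerate or weight past scientific claims), so this Python sketch shows the arithmetic, not a real measurement.

def accuracy_ratio(still_accepted, now_known_false):
    """Fraction of past 'scientific truths' that still stand today."""
    total = still_accepted + now_known_false
    return still_accepted / total if total else float("nan")

# Hypothetical tallies of claims from some earlier period, judged with hindsight.
print(accuracy_ratio(still_accepted=170, now_known_false=30))  # 0.85, made-up numbers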

 

How would this ratio pan out over the past 100 years, 200 years or even 2,500 years? How would this ratio have come out for people 200 years ago compared with 500 years ago, etc.?

 

Considering that we've done such a good job ruining our planet (just one example) recently, couldn't this ratio also be adapted to add the 'bad' science onto the negative side of the ledger and the 'good' science onto the positive, to see if the human race gets above zero?

 

The answer is probably an imaginary number.


Scientific theories must make well-defined predictions that can be experimentally tested. ... If there aren’t testable predictions, it’s not science.

 

 

 

However, unless you actually work as a scientist, your personal explorations and learnings need not adhere to such rigorous experimental standards. In systems theory the approach is much broader, in that it provides connections across multidisciplinary fields of study.

 

This type of personal study need not be encumbered by the usual divisions in science; therefore the priority shifts to understanding many scientific areas at once. The intent is for an individual to develop a personal lens to see the big picture using accumulated scientific knowledge. You may find the best way to develop your personal informational models is the "bootstrap approach".

 

This approach is only possible, however, because science has been so successful in the reductionist experiments of the past 400 years, and has accumulated a critical amount of ordered information about so many fields.

 

When a certain amount of accurate information is amassed personally across these many related fields, it will tend to follow its own rules of engagement; patterns will begin to emerge as information, at a critical threshold, reaches out from one area to connect with another. This is true for the nature of the mind, as well as for any developing system in nature. This viewpoint can only be attained, however, when an individual has been studying many related sciences for many years as an intellectual pursuit.


Bad assumptions produce experiments that do not agree with observation.

Show us where our experiments fail to agree with observation.

 

It happens all the time; that's why scientists do experiments. It's more often than not a process of elimination rather than directly proving a hypothesis and elevating it to a theory. But since you're asking the question... there is an excellent example of where observation and experiment do not agree, and it has changed the fundamental way we think about reality in general.

 

 

Wave-Particle Duality

Here is a quick summary of quantum physics history: In 1900, Planck proposed that energy is emitted in amounts that are quantized, not continuously variable. In 1905, Einstein proposed that light (which we previously had considered to be a wave) is composed of photons that have characteristics of both a wave and a particle. In 1923, de Broglie generalized this logic and proposed that electrons (which we previously had considered to be particles) also have a dual nature, with both wave and particle characteristics. In 1926, Schrödinger published the wave equation for an electron. Within a few years, scientists were using quantum physics for a wide variety of physical phenomena, including the details of atomic spectra, the formation of molecules from atoms, the structure of the periodic table, and more.

 

The dual wave-and-particle nature of photons and electrons (and protons, neutrons,...) is unfamiliar and seems very strange, but it has been confirmed by many experiments. And all experimental observations have been consistent with predictions based on the principles and equations of quantum physics.

To cope with the weirdness of quantum physics: First you must recognize that, based on the way reality is described by quantum physics, "yes, things really are strange." Then you must use critical thinking for proper balance, to recognize that "no, things are not as strange as some people claim."

 

 

1B. The Uncertainty Principle

One result of wave-particle duality is a limit on the precision of measurements. In a standard illustration of the Uncertainty Principle, we shine light photons on a moving electron to determine the electron's location, but the interaction between photon and electron changes the electron's momentum (which is mass x velocity). Due to this change, there is a natural limit on how precisely we can measure the combination of location-and-momentum for the electron. The more precisely we know the location, the less precisely we can know the momentum, and vice versa. { note: The uncertainty principle also applies in other situations and for other combinations of attributes. }
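For reference, the standard position-momentum form of the limit being described, written in LaTeX (this is textbook quantum mechanics, not something specific to this thread):

\[
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi} \approx 1.05 \times 10^{-34}\ \mathrm{J\,s}.
\]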

This limitation is caused by the interaction between photon and electron, which will produce changes of momentum (for electron and photon) whether or not these changes are "observed" by a human and thus become a part of human knowledge. It is the interaction, not human consciousness, that is important in a cause-effect analysis based on quantum physics.

 

But wave/particle duality, and the associated Uncertainty Principle, are always essential characteristics of nature, even when we're "not looking." For example, without its wavelike nature a negatively charged electron would cling tightly to a positively charged proton, forming a tiny negative-positive clump. If this happened, our universe would be boring and lifeless. But this doesn't happen, because clinging would confine the electron to a very small space: it would have a very precisely determined location, and thus (as described by the uncertainty principle) a very large momentum uncertainty, and thus a large typical velocity, which is incompatible with it clinging to the proton. Instead, the electron gets "close to a proton, but not too close" in a simple hydrogen atom.
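A rough numerical check of that confinement argument, sketched in Python; the Bohr-radius confinement scale and the constants are standard round values, and this is the usual order-of-magnitude estimate rather than a full solution of the hydrogen atom.

import math

HBAR = 1.055e-34   # reduced Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg

def min_speed_when_confined(dx_m):
    """Speed scale forced by confining an electron to a region of size dx_m."""
    dp = HBAR / (2 * dx_m)   # minimum momentum uncertainty from dx*dp >= hbar/2
    return dp / M_E

print(f"atom-sized box  (5.3e-11 m): v ~ {min_speed_when_confined(5.3e-11):.1e} m/s")
print(f"nucleus-sized box (1e-15 m): v ~ {min_speed_when_confined(1e-15):.1e} m/s")
# ~1e6 m/s for atomic confinement is modest; for nucleus-sized confinement the
# naive non-relativistic estimate blows up past the speed of light, i.e. the
# kinetic energy would dwarf the electrostatic attraction, which is why the
# "tiny negative-positive clump" never forms.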

Small-scale strangeness produces large-scale normality. Yes, the normal behavior that we see in our everyday level of experience is produced by strange behavior at the quantum level. Without wave/particle duality you would not be reading this web-page, because you would not be alive.

 

 

1C. A Two-Slit Experiment

Picture a simplified experimental setup in which electrons pass through two slits in a thin barrier and then hit a wall.

 

If the wall is a detector that records where each electron hits, we find that when a large number of electrons have hit the wall their hitting-locations form an interference pattern that is characteristic of waves. This pattern occurs due to the wavelike nature of electrons.

Although the equations of quantum physics do not predict where an individual electron will hit the wall, they do predict the probability of an electron hitting at each location on the wall, and thereby predict the pattern that will form when a large number of electrons have hit. The pattern predicted by quantum physics is the pattern that is observed.
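A minimal numerical sketch of that prediction in Python: add the complex amplitudes for the two paths, then square the magnitude. The electron wavelength, slit spacing and screen distance are illustrative round numbers assumed here, not values from any particular experiment.

import cmath, math

WAVELENGTH = 5e-11   # assumed electron de Broglie wavelength, m (~50 pm)
D = 5e-9             # assumed slit separation, m
L = 0.1              # assumed slit-to-wall distance, m
K = 2 * math.pi / WAVELENGTH

def probability(x):
    """Relative probability of an electron arriving at position x on the wall."""
    r1 = math.hypot(L, x - D / 2)                     # path length via slit 1
    r2 = math.hypot(L, x + D / 2)                     # path length via slit 2
    amplitude = cmath.exp(1j * K * r1) + cmath.exp(1j * K * r2)
    return abs(amplitude) ** 2                        # 0 (dark) up to 4 (bright)

for i in range(-10, 11):
    x = i * 2.5e-4                                    # sample points across the wall, m
    print(f"x = {x:+.2e} m  " + "#" * int(round(5 * probability(x))))
# Bright and dark fringes alternate: individual hits are whole electrons,
# but their arrival statistics follow this wave-interference pattern.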

electron as a wave: When an electron is traveling through the barrier-slits and toward the wall, it behaves like a wave. This lets a single electron, somewhat amazingly, "go through both slits" simultaneously, and these two parts of the electron will interact with each other to produce the wave-interference pattern predicted by quantum physics. But this self-interference (due to the electron's wavelike character) is not accompanied by the self-repulsion (due to the electron's charge) that we would expect if the electron was actually "smeared out" as it "goes through both slits" at the same time. Yes, it's very strange.

electron as a particle: When an electron hits the wall — when it interacts with atoms in the wall — it behaves like a particle, and what hits the wall is always a whole electron. By contrast, when an electron "goes through both slits" as a wave, it seems to be "split apart" although it isn't really split (in the way that we would visualize this) since there is no self-repulsion.

Obviously, our concepts of waves and particles — which are useful for describing familiar large-scale behaviors at our everyday level — are not sufficient for describing the unfamiliar small-scale behaviors of wave/particles at the quantum level.


When a certain amount of accurate information is amassed personally across these many related fields ... patterns will begin to emerge as information, at a critical threshold, reaches out from one area to connect with another.

 

YES!

 

The more frames of reference we acquire, the better we get at connecting the dots.

 

Old wives' tales, science, the rantings of a madman... hell, I don't care; if it will get me from A to B then the job is done. That's when I worry about the details of why it works: once I have it working.

 

It took me 2 days' study of related fields and one day on the tools to build an aquaponic system, before there was any knowledge of how to do this on the web. I had never kept fish or built anything requiring plumbing before this.

 

So - 3 days to build it, with no prior knowledge whatsoever of the technology, including no fishkeeping, plumbing or hydroponic experience.

 

Recycled parts, nothing spent.

 

A wide array of knowledge, a concept, and the suspension of disbelief and questioning (whereby the action stops), till the job is done. Then the questions.

 

If you read enough, most of the answers are there, swimming in the ether in your head. To science I too am very grateful for the work done to date: the massive databank I get to use. I just couldn't do it myself; too slow.

