Articles

Particle Accelerator – 2012 – 7 Reasons Why The World Ends On December 21, 2012

Scientists around the globe are unanimously predicting that “life on Earth as we know it” could end on or around December 21st, 2012. Among them, several believe it will be brought about by humans, but the majority is convinced that nature, on a universal scale, will cause the dramatic changes. Several religious leaders worldwide, however, state that it will be an act of God and that Judgment Day will come upon us at that time. Whichever theory you tend to follow, it is remarkable that for once all visions are on the same wavelength as to the impact on Mother Earth. These are seven reasons why scientists and non-fiction predictors are convinced that the world as we know it will end on December 21st, 2012:

Reason #1 – The End of the Mayan Calendar:

The oldest and most researched source for this end date is the Mayan calendar, which indicates that time ends on this day. The Mayans are considered a highly intelligent civilization who must have possessed advanced technologies that, to date, nobody alive can even begin to comprehend. They were also a bloodthirsty people, and some of their key pursuits involved building highly accurate astronomical instruments out of stone and sacrificing virgins.

Not only did they live in an era immediately before the previous major global event, about 11,500 years ago, they were also able to calculate the length of the lunar month as 29.53020 days, only 34 seconds off.

The Mayan calendar sets the end of the Earth at December 21, 2012. The argument goes that their prediction is to be taken seriously: having calculated the lunar month so accurately back then, the likelihood of this calculation being accurate as well is extremely high.
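As a quick check on the “34 seconds” claim, here is a short sketch (ours, not the article's) comparing the quoted figure of 29.53020 days with the modern mean synodic month of roughly 29.530588 days:

```python
# Check of the "34 seconds off" claim for the Mayan lunar month.
MAYAN_SYNODIC_MONTH_DAYS = 29.53020     # value quoted in the article
MODERN_SYNODIC_MONTH_DAYS = 29.530588   # modern mean synodic month (approximate)

difference_seconds = (MODERN_SYNODIC_MONTH_DAYS - MAYAN_SYNODIC_MONTH_DAYS) * 86400
print(f"{difference_seconds:.1f} seconds")   # roughly 34 seconds
```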

Reason #2 – Sun Storms:

The media in general do not like to cover topics that are hard to explain, except for some of the supernatural channels that are rarely visited by self-proclaimed “realistic” viewers. Scientists, however, are on the ball when it comes to observing the sun and its activity with the relatively high-tech equipment available today, and they have made a mesmerizing discovery about its recent behavior. A few years ago, the sun dramatically increased its radiated energy to levels much higher than registered in a long time. Its behavior is definitely cyclic, with a clear recurring cycle of just over 11 years. Now, a couple of years after the last serious increase in activity, 2012 marks the peak of that cycle, promising much higher activity than has already been measured in recent years.

Since recent storms have already knocked out power grids and destroyed satellites, there is a very clear indication that the next round in 2012 will be of previously unseen magnitude, impacting our planet and the technology in orbit around it. Calculations suggest it will reach its deadly peak sometime late in 2012.

Reason #3 – The European Atom Smasher:

European scientists are finalizing the construction of the world’s largest particle accelerator ever conceived. In simplified terms, it consists of a 27-kilometer tunnel designed to smash atoms together in the anticipation of finding out what makes the Universe tick.

With reason, this mega-project has already caused serious concern, with several renowned scientists expressing deep worries about flipping the ON switch. Among their predictions of all kinds of lethal events, mini black holes are likely to be created. Chances are that the first mega-experiment with this latest invention may turn the globe into a lump of matter no bigger than a basketball.

Oh… that first experiment is scheduled for… 2012.

Reason #4 – According to Religion

Several religions give us similar indications and predictions based on, among other sources, interpretations of the Bible, the I Ching (also known as the Chinese Book of Changes), and various sections of the Hindu teachings. All of them indicate the 21st of December 2012 as the end of the world. The Bible reveals this as the date of Armageddon, the final battle between good and evil, as well as Judgment Day.

It is expected that, as we approach D-Day, more religious figures will speak up and lift the secrecy long kept by institutions such as the Vatican.

Reason #5 – Most Powerful Volcano on Earth

The largest volcano on Earth lies underneath Yellowstone National Park in the United States, which is famous for its thermal springs and the Old Faithful geyser. This fact has made geological experts very nervous and has been the reason for sleepless nights ever since it was linked to the upcoming events of 2012. The Yellowstone volcano has a pattern of erupting approximately every 650,000 years. We are currently many years past the due date for a new major eruption, which in itself is rather bizarre, since the planet’s cyclic events are usually pretty accurate and thus predictable. A new eruption would fill the atmosphere with ash, blocking the sun and plunging the Earth into a frozen winter that could last up to 15,000 years. Such a new Ice Age is due when looking at Earth’s usual cycles. Most alarming of all, the pressure under Yellowstone is building steadily, and geologists have set 2012 as a likely date for the big bang.

Reason #6 – The Physicists

Physicists at the University of California, Berkeley, who use supercomputers for their highly complex formulas and calculations, have been crunching the numbers, and they too have concluded that the Earth is well overdue for a major catastrophic event. Their calculations prove beyond any doubt that if and when this catastrophe happens, it will impact every living being on Earth and basically wipe away entire species as we know them today. What is really scary about their calculations is that they are pretty much convinced, with a certainty of 99%, that all of this will come down very soon and that the best-guess date is… 2012.

Reason #7 – Polar Shift

The magnetic field surrounding Earth shields us from most of the sun’s radiation. This is generally known; what is less well known is that the magnetic poles we call north and south have a habit of swapping places approximately every 750,000 years. Right now the cycle is overdue by about 30,000 years.

Furthermore, scientific research has shown that the poles are drifting by roughly 20 to 30 kilometers every year, much faster than ever before, which points to a pole shift being very near. During a pole shift, the magnetic field is disrupted until it disappears, sometimes for up to 100 years. As a result, there would be enough UV radiation getting through to crisp your skin in seconds, killing everything it touches.

About The Author
Learn much more about the significance and severity of the events of 2012 and how you can be prepared to save yourself and your loved ones. In addition, we’ll give you a free report on the “Labyrinth of Egypt” when you visit http://2012pro.com/ today!

Wednesday, March 17th, 2010 | Articles

Particle Beams – Particle Beams Sending Specks of Dust Back in Time

Can a particle beam, under the right conditions, send specks of dust back in time? The answer is yes, this could be possible under the right conditions. Even more far out is the idea of sending dust to a set of coordinates in a region of four dimensions, or to a point in the atmosphere to start a cloud formation or make rain, or as a huge dust cloud to shield top-secret weaponry during prototype tests in the middle of a war. If you can send particles through space to a certain coordinate, into a different time, or onto another planet, you could actually terraform Mars or a distant world before you got there by sending the seeds of life to get things started.

Think of it this way: we are working on sending frequencies from one area to other areas by bouncing them off rock or harder substances. And we have learned we can create weather, both at long range and years into the future, with ELF. If you could send into the plague periods of the past a biological pathogen that fed upon the plague but did not hurt humans, then you could stop the plague from killing so many people. Would that mean those people would immediately start to exist everywhere? Maybe; wouldn’t that be cool? Or maybe… no, you would have created a separate possible future. The terraforming aspects are quite interesting and relevant:

[http://www.spacetoday.org/SolSys/Mars/MarsSolSysExplore.html]

Also, if there was a problem that you knew of in the past, you could send in the cure and achieve wonderful results. We are already aware of the possibility of sending frequencies in time, hence the movie Frequency, which did an okay job of discussing the theory. But maybe the theory is not too far off from our present abilities to unlock the process. If you are sending a particle beam which uses a light wave to transport the beam containing the particles, then you can slow the beam, bend the beam, or send the beam anywhere you wish. The problem is the power and wavelength needed, and many other very complicated mathematical equations, but it would seem not as difficult as once thought given our current technology.

http://members.tripod.com/~MikeNight/time.html

Now suppose we could send back in time the things we know, or receive the things we know from those sending thoughts back in time to those able to receive them: the open-minded, those whose brains operate at a similar frequency and can pick up the ELF signals, or those who have purposely gone out of their way to receive such data. Then we as the human race could continually receive needed information, because our brains would have adapted to do so. A brain in deep thought at 1-4 hertz, and ungrounded, such as one in space, in the air, or soaring in a balloon, would pick up thoughts and ELF better; since the antenna (the brain) is not grounded, it would work better for sending signals and collecting certain data transmissions sent by quark or anti-quark from future time periods. Such are the possibilities of certain wavelengths and frequencies in various realms of the spectrum of light, and of other soon-to-be-recognized scientific data we now call paranormal. Is it paranormal? Probably not, and it also might not be as weird as once thought. Such signals sent out before could pass to us in the future, or we could intercept them occasionally by plugging in, listening, or attempting to receive such data.

Should we do this? Yes. It appears that the linear time we seem to so easily understand is a more difficult concept to comprehend than the idea that all time is one time, and that we are living and thinking in all time periods simultaneously, having only the foresight to see and touch what we believe to be the present. Time is a very cool concept. When people ask whether you are living in the past, present or future, your answer should be YES. Yes, we are. Because if we are all living in all time, then we are actually all one, since we are made up of DNA parts of the whole and are all related to the original of the past and the accumulation of the future. If we, or any individual among us, takes a certain action, it affects all time, yes, the past too, since past, present and future are all one. Which is actually a much simpler concept, and easier to comprehend, than the questions so many wish to ask and so many religions attempt to answer without leaving loose ends where they might actually have to prove something (a good book to read is the “God Gene”). Here are those questions:

“Where did I come from?”

“How did the world start?”

“What will happen to me when I die?” and

“What will happen to the world when it ends?”

Why do all religions move to answer these questions? Whether we understand the answers or not, and whether we choose to recognize it or not, we seem to innately need more answers. So then, for peace of mind: Is it proper to communicate with the past and future? Should we tempt fate, and will we believe what we find? Well, in a way we already communicate through time. After all, we live, have lived, or will live in the present, past and future, and whereas that does not span all that much in a linear sense, it is still living in and through time. So we may as well learn more and help ourselves move forward as one. This is way off on a tangent from the original mind exercise about the theory of moving dust through time, or is it? The fact is we have questions, we seek answers, and we have the technology to move into other dimensions now. So should we move to the next step and seek the answers? Do we dare? Does our society so depend on certain truths that we don’t dare? Are we confident enough in our beliefs to prove ourselves right? Wrong? Then what? Does society break down? Or do we as a species grow?

If you can move dust particles, maybe you can move nano-particles, and thus move organic building blocks through time to a place of your choosing. Move DNA, RNA or nano-devices to another place and another time to start life with the same components you wish to promote. Ah ha, so are we someone else’s experiment through time, terraforming, a part of another? Sending instructions on how to form life to a place where life will flourish, one of Saturn’s moons with water ice, or under the surface of Mars where water exists, does not require the capability of sending through time, only to a nearby planet. If we can do this, should we try for a greater distance or a different time? Or is it “all relative” anyway, and are we trapped here in linear time only because we have not broken the code yet? Will supercomputers and quantum mechanics allow us the next step?

Perhaps local solar-system moons with volcanic heat coming from the moon’s or planet’s interior, or from a sun near a planet, are a good place to start practicing this concept, since many exist in a similar fashion to Earth, with similar ratios of elements. They may not contain the exact components that make up our atmosphere, but certainly enough to support some sort of life similar to our current DNA, or DNA we can find here on Earth. After all, human DNA and wheat DNA are quite similar. It would be life as we know it, and we could send such life anywhere we wanted, if we chose to and if we felt it appropriate from an ethics standpoint. If people could throw away their notions of ugly and pretty, based as they are on how their eyesight and brain interpret beauty or substance, then it might be easier; with proper understanding it is possible. Sure, we love blue rivers, blue sky and green forests, yet is that so important that it negates the possibility of other suitable worlds? We are missing the best part of reality when we deny ourselves the chance to study and open our minds to the data before us. As for all the conspiracy theories about this:

[http://phoenix.akasha.de/~aton/TG-ELF.html]

This is all fun and games, but the reality of this technology could really give the human mind a much-needed boost of input, and allow us harmony with all species and an understanding of who we are and why we are here. Open your mind and vote for more research spending on these things. Think of a system set up to allow you to receive an email from a family member in the future, or being able, in the future, to send and receive emails from the future, describing it and warning about it. Pretty cool. Think of this in the realm of 5,000 years in the future, and of arranging meetings to visit it. This will be possible within the next 100 years or less at the current rate of new technology discoveries.

What are the possibilities in the here and now for such incredible data and knowledge? Think: in the future you could meet the past twenty generations of your family, ask them questions, and they could ask you. Wouldn’t they be proud of you, and wouldn’t it make you want not to let them down? Perhaps this might be the future of the village it takes to raise a child correctly. This has been a twilight-zone thought of the day. What say you?

Wednesday, March 17th, 2010 | Articles

The Voltaire Lecture 2010 by Professor Brian Cox

Professor Brian Cox speaks on “The value of Big Science: CERN, the LHC and the exploration of the Universe”

6th April 2010

Chaired by Polly Toynbee, President of the British Humanist Association.

Professor Brian Cox holds a chair in particle physics at the University of Manchester and works on the ATLAS experiment at the Large Hadron Collider at CERN, near Geneva.

Full details and tickets available here


Friday, March 12th, 2010 | Articles

LHC – Quarks, Big Bang and Large Hadron Collider (LHC)

People are curious. Curiosity is one of the key elements that drives humanity towards answers about our existence and our future. Since ancient times people have asked themselves about the origins of everything, from the universe to the basic elements of matter. Some people, many thousands of years ago, assumed that if you keep dividing a piece of matter, the division must come to an end. This process should end with the basic, indivisible elements that constitute matter: atoms.

In the last centuries many experiments have confirmed that matter indeed consists of small particles. The scientific approach has contributed to the discovery of various natural and synthetic substances, molecules, chemical elements and atoms. Atoms, once believed to be indivisible, were also found to have a hard nucleus with electrons orbiting around it. Then it was discovered that the atomic nucleus consists of protons and neutrons. So atoms are divisible. This fact had many consequences, the most notable being the fission nuclear bomb. However, the division story didn’t end there: protons and neutrons were also found to contain smaller particles, quarks.

Currently the list of all elementary particles is pretty long. This list is part of the Standard Model, a model of how everything exists and interacts. It is believed that this model is not the final picture of the universe; there are still unanswered questions. On the other hand, the universe itself is a subject of investigation. One of the key discoveries was that the universe is expanding. From this fact we can conclude that in the past the universe was smaller. The further we go into the past, the smaller it was. Sooner or later we come to the moment in time when the universe was infinitely small. This is called the Big Bang, the moment when the universe started to develop as we know it today, some 13.7 billion years ago. This is now the leading theory of the evolution of the universe. It is still unknown what banged, how, and why.
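As a rough illustration of where such an age estimate comes from (not a calculation from the article itself), the naive “Hubble time” 1/H0, with an assumed Hubble constant of about 70 km/s/Mpc, already lands near the quoted 13.7 billion years; a proper figure requires the full expansion history.

```python
# Naive age estimate: the Hubble time t = 1/H0 (ignores the expansion history).
H0_KM_PER_S_PER_MPC = 70.0      # assumed Hubble constant
KM_PER_MPC = 3.0857e19          # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

hubble_time_years = KM_PER_MPC / H0_KM_PER_S_PER_MPC / SECONDS_PER_YEAR
print(f"{hubble_time_years / 1e9:.1f} billion years")   # about 14
```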

The latest project to find some of the missing answers is the Large Hadron Collider (LHC) at CERN, near Geneva. It is a giant ring 100 meters underground where two beams of particles traveling close to the speed of light will collide. Each collision will produce an enormous number of other particles. Analysis of this debris will hopefully answer some questions about the nature of particles, or even raise new ones. Because of the enormous collision energy (about 14 TeV), the circumstances will be close to the situation immediately after the Big Bang. The LHC is currently the largest and most expensive scientific project.
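To get a feel for “close to the speed of light”: each beam carries about 7 TeV per proton (hence roughly 14 TeV per collision), while a proton’s rest energy is only about 0.938 GeV. Here is a minimal sketch of the resulting Lorentz factor, using assumed textbook constants rather than anything from the article:

```python
import math

# Lorentz factor of a 7 TeV proton in one of the LHC beams.
BEAM_ENERGY_GEV = 7000.0            # energy per proton
PROTON_REST_ENERGY_GEV = 0.938272   # m_p * c^2

gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV
beta = math.sqrt(1.0 - 1.0 / gamma**2)   # speed as a fraction of c

print(f"gamma = {gamma:.0f}")    # about 7460
print(f"v/c   = {beta:.9f}")     # about 0.999999991
```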

Answering questions about the micro and macro world will not only satisfy our curiosity but will also help us to understand the world. If we understand the world, then we can make it better. And a better world is everybody’s dream.

Article Source: http://EzineArticles.com/?expert=Jan_Pascal


Thursday, March 11th, 2010 | Articles

High Energy Physics – Pure Derivation of the Exact Fine-Structure Constant and As a Ratio of Two Inexact Metric Constants

Theorists at the Strings Conference in July of 2000 were asked what mysteries remain to be revealed in the 21st century. Participants were invited to help formulate the ten most important unsolved problems in fundamental physics, which were finally selected and ranked by a distinguished panel of David Gross, Edward Witten and Michael Duff. No questions were more worthy than the first two problems respectively posed by Gross and Witten: #1: Are all the (measurable) dimensionless parameters that characterize the physical universe calculable in principle or are some merely determined by historical or quantum mechanical accident and incalculable? #2: How can quantum gravity help explain the origin of the universe?

A newspaper article about these millennial mysteries expressed some interesting comments about the #1 question. Perhaps Einstein indeed “put it more crisply: Did God have a choice in creating the universe?” – which summarizes quandary #2 as well. While certainly the Eternal One ‘may’ have had a ‘choice’ in Creation, the following arguments will conclude that the reply to Einstein’s question is an emphatic “No.” For even more certainly a full spectrum of unprecedented, precise fundamental physical parameters are demonstrably calculable within a single dimensionless Universal system that naturally comprises a literal “Monolith.”

Likewise the article went on to ask if the speed of light, Planck’s constant and electric charge are indiscriminately determined – “or do the values have to be what they are because of some deep, hidden logic. These kinds of questions come to a point with a conundrum involving a mysterious number called alpha. If you square the charge of the electron and then divide it by the speed of light times Planck’s (‘reduced’) constant (multiplied by 4π times the vacuum permittivity), all the (metric) dimensions (of mass, time and distance) cancel out, yielding a so-called “pure number” – alpha, which is just over 1/137. But why is it not precisely 1/137 or some other value entirely? Physicists and even mystics have tried in vain to explain why.”

Which is to say that while constants such as a fundamental particle mass can be expressed as a dimensionless relationship relative to the Planck scale, or as a ratio to a somewhat more precisely known or available unit of mass, the inverse of the electromagnetic coupling constant alpha is uniquely dimensionless as a pure ‘fine-structure number’ a ~ 137.036. On the other hand, assuming a unique, invariantly discrete or exact fine-structure numeric exists as a “literal constant,” the value must still be empirically confirmed as a ratio of two inexactly determinable ‘metric constants,’ h-bar and the electric charge e (light speed c having been exactly defined as an integer number of meters per second in the 1983 adoption of the SI convention).
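For reference, the dimensionless combination described above can be checked directly from tabulated constants. The sketch below uses the CODATA values bundled with SciPy (a later vintage than the 1998-2006 figures discussed further down), so the final digits differ slightly from the table that follows:

```python
import math
from scipy import constants

# Fine-structure constant from the combination quoted above:
# alpha = e^2 / (4 * pi * epsilon_0 * hbar * c)
alpha = constants.e**2 / (4 * math.pi * constants.epsilon_0 * constants.hbar * constants.c)

print(f"1/alpha from e, hbar, c, eps0: {1 / alpha:.9f}")
print(f"1/alpha as tabulated by SciPy: {1 / constants.fine_structure:.9f}")
```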

So though this conundrum has been deeply puzzling almost from its inception, my impression upon reading this article in a morning paper was utter amazement that a numerological issue of invariance merited such distinction by eminent modern authorities. For I’d been obliquely obsessed with the fs-number in the context of my colleague A. J. Meyer’s model for a number of years, but had come to accept its experimental determination in practice, pondering the dimensionless issue periodically to no avail. Gross’s question thus served as a catalyst out of my complacency; I recognized a unique position as the only fellow who could provide a categorically complete and consistent answer in the context of Meyer’s main fundamental parameter. Still, my pretentious instincts led to two months of inane intellectual posturing until I sanely repeated a simple procedure explored a few years earlier. I merely looked at the result using the 98-00 CODATA value of a, and the following solution immediately struck with full heuristic force.

For the fine-structure ratio effectively quantizes (via h-bar) the electromagnetic coupling between a discrete unit of electric charge (e) and a photon of light, in the same sense that an integer is discretely ‘quantized’ compared to the ‘fractional continuum’ between it and 240 or 242. One can easily see what this means by considering another integer, 203: subtract from it the base-2 logarithm of the square of 2pi (the power to which 2 must be raised to give (2π)²), add the inverse of 241 to the result, and multiply by the natural log of 2. This pure calculation of the fine-structure number equals 137.0359996502301… – which here is given to 13 decimal places but is calculable to any number of decimal places.
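The original prose recipe is ambiguous as written; one reading that does reproduce the quoted digits takes the base-2 logarithm of (2π)², as stated above. The sketch below encodes that reading purely as an interpretation of the author’s recipe, not an endorsement of it:

```python
import math

# One reading of the recipe: ln(2) * (203 - log2((2*pi)**2) + 1/241)
value = math.log(2) * (203 - math.log2((2 * math.pi) ** 2) + 1 / 241)

print(f"{value:.10f}")   # 137.0359996502..., the figure quoted above
```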

By comparison, given the experimental uncertainty in h-bar and e, the NIST evaluation varies up or down around the mid 6 of ‘965’ in the invariant sequence defined above. The following table accordingly gives the values of h-bar and e, their calculated ratio as a, and the actual NIST choice for a in each year of their archives, as well as the 1973 CODATA, where the standard two-digit +/- experimental uncertainty is given in parentheses.

year    h-bar = Nh*10^-34 Js     e = Ne*10^-19 C         a from h-bar, e      NIST value ±(SD)
2006:   1.054 571 628(053)       1.602 176 487(040)      137.035 999 661      137.035 999 679(094)
2002:   1.054 571 680(18x)       1.602 176 53o(14o)      137.035 999 062      137.035 999 11o(46o)
1998:   1.054 571 596(082)       1.602 176 462(063)      137.035 999 779      137.035 999 76o(50o)
1986:   1.054 572 66x(63x)       1.602 177 33x(49x)      137.035 989 558      137.035 989 5xx(61xx)
1973:   1.054 588 7xx(57xx)      1.602 189 2xx(46xx)     137.036 043 335      137.036 04x(11x)
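The third column is simply a = 4*pi*eps0*hbar*c/e^2 evaluated from the first two columns (c and eps0 being exactly defined quantities under the pre-2019 SI). A quick sketch for the 2006 row, which should reproduce the tabulated ratio to within the rounding of the inputs:

```python
import math

# Recompute a = 4*pi*eps0*hbar*c / e^2 for the 2006 row of the table above.
HBAR = 1.054571628e-34       # J s, 2006 value from the table
E_CHARGE = 1.602176487e-19   # C,   2006 value from the table
C = 299792458.0              # m/s, exact
MU0 = 4 * math.pi * 1e-7     # vacuum permeability, exact under the pre-2019 SI
EPS0 = 1 / (MU0 * C**2)      # vacuum permittivity, exact under the pre-2019 SI

a = 4 * math.pi * EPS0 * HBAR * C / E_CHARGE**2
print(f"{a:.9f}")   # compare with the 137.035 999 661 entry above
```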

So it seems the NIST choice is roughly determined by the measured values for h and e alone. However, as explained at http://physics.nist.gov/cuu/Constants/alpha.html, by the 80’s interest shifted to a new approach that provides a direct determination of a by exploiting the quantum Hall effect, independently corroborated by both theory and experiment on the electron magnetic-moment anomaly, thus reducing the already finely tuned uncertainty. Yet it took 20 years before an improved measure of the magnetic-moment g/2 factor was published in mid 2006, where this group’s (led by Gabrielse at Harvard) first estimate for a was (A:) 137.035 999 710(096) – explaining the much reduced uncertainty in the new NIST list, as compared to that in h-bar and e. However, more recently a numeric error in the initial QED calculation (A:) was discovered (we’ll refer to the second paper as B:), which shifted the value of a to (B:) 137.035 999 070(098).

Though it reflects a nearly identically small uncertainty, this assessment is clearly outside the NIST value concordant with the estimates for h-bar and elementary charge, which are independently determined by various experiments. The NIST has three years to sort this out, but meantime faces an embarrassing irony in that at least the 06 choices for h-bar and e seem to be slightly skewed toward the expected fit for a! For example, adjusting the last three digits of the 06 data for h and e to accord with our pure fs-number yields an imperceptible adjustment to e alone, giving the ratio h628/e487.065. Had the QED error been corrected prior to the actual NIST publication in 2007, it rather easily could have been evenly adjusted to h626/e489, though that would call into question its coherence with the last 3 digits of a in the comparative 02 and 98 data. In any case, far vaster improvements in multiple experimental designs will be required for a comparable reduction in the error of h and e in order to settle this issue for good.

But again, even then, no matter how ‘precisely’ metric measure is maintained, it is still infinitely short of ‘literal exactitude,’ while our pure fs-number fits the present values of h628/e487 quite precisely. In the former regard, I recently discovered that a mathematician named James Gilson (see http://www.maths.qmul.ac.uk/%7Ejgg/page5.html) has also devised a pure numeric = 137.0359997867…, nearer the revised 98-01 standard. Gilson further contends he has calculated numerous parameters of the standard model, such as the dimensionless ratio between the masses of the Z and W weak gauge bosons. But I know he could never construct a single Proof employing equivalences capable of deriving Z and/or W masses per se from the then precisely confirmed masses of heavy quarks and Higgs fields (see the essay referenced in the resource box), which themselves result from a single over-riding dimensionless tautology. For the numeric discreteness of the fraction 1/241 allows one to construct physically meaningful dimensionless equations. If one instead took Gilson’s numerology, or the refined empirical value of Gabrielse et al., for the fs-number, it would destroy this discreteness, the precise self-consistency, and the ability to even write a meaningful dimensionless equation! By contrast, perhaps it is then not too surprising that after I literally ‘found’ the integer 241 and derived the exact fine-structure number from the resultant ‘Monolith Number,’ it took only about two weeks to calculate all six quark masses utilizing real dimensionless analysis and various fine-structured relations.

But as we are now not really talking about the fine-structure number per se any more than the integer 137, the result definitively answers Gross’s question. For those “dimensionless parameters that characterize the physical universe” (including alpha) are ratios between selected metric parameters, which otherwise lack a single unified dimensionless system of mapping from which metric parameters like particle masses could be calculated from set equations. The ‘standard model’ gives one a single system of parameters, but no means to calculate or predict any one and/or all of them within a single system – thus the experimental parameters are put in by hand, arbitrarily.

Final irony: I’m doomed to be demeaned as a ‘numerologist’ by ‘experimentalists’ who continually fail to recognize a hard empirical proof for quark, Higgs or hadron masses that may be used to exactly calculate the present standard for the most precisely known and heaviest mass in high-energy physics (the Z). So au contraire, foolish ghouls: empirical confirmation is just the final cherry the chef puts on top before he presents a “Pudding Proof” no sentient being could resist just because he didn’t assemble it himself, so instead makes a mimicked mess the real deal doesn’t resemble. For the base of this pudding is made from melons I call Mumbers, which are really just numbers, pure and simple!


Thursday, March 11th, 2010 | Articles

The LHC 2010 – 2011

An overview of the Large Hadron Collider for the next 18 to 24 months, with the start of physics at 7 TeV, by Rolf Heuer, Director General of CERN.

read the full article here


Wednesday, March 10th, 2010 | Articles

Bottom Quark – The Z-Boson Mass And Its Formula As Multiple Proofs In One Yummy Bowl Of Pudding

Though its origin is disputed, the phrase “the proof of the pudding is in the eating” is popularly attributed to Cervantes’ 1615 comic novel Don Quixote. And while one can talk about a pudding’s ingredients all one wants, the saying’s meaning stays intact when shortened to “the proof is in the pudding” – because that is where you will ultimately find it, if you bother to at least taste it, as it is the results that count.

Which is unlike a ‘mathematical proof’ obtained by logic alone, since one’s palate will sometimes disagree with what one thinks is a delicious recipe. In this sense, the implied dichotomy is akin to Kepler’s contribution to elliptic geometry, which per se is independent of experience, in the sense that elliptic theorems can be constructed and proven without appeal to any physical phenomena. But in practice Kepler refined Copernicus’s resurrected heliocentric heresy of planetary orbits in a manner that just as clearly is non-abstractly physical and empirically testable. Which ultimately is a key characteristic of the scientific method or ‘revolution,’ soon cemented by Newton’s and Galileo’s discoveries expressing physical laws through experimental confirmation of their mathematical formulation.

This report accordingly will pare the phrase down further, to a “Pudding Proof” that employs a number of verifications of what a physical formula represents: not only being theoretically correct in multiple senses, but confirmed to be correct by a clear correspondence with the most precisely measured empirical value in high-energy particle physics, specifically the neutral weak or Z-boson mass. For its present measured value of 91187.6 (2.1) MeV is what truly represents the operative meaning of this term with respect to being the ultimate result, as ‘physical proof,’ of the following equation for the precise mass of the Z-boson: Z = 91187.633 MeV = 9u1/8 + ms – mb, even though one then doesn’t really need to know the mass m of the strange and bottom quarks, or the Higgs vacuum minimum u1.
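Since the article deliberately withholds the values of u1, ms and mb, nothing here can be evaluated numerically; the sketch below merely rearranges the stated relation symbolically (the variable names are ours) to show what the claimed equation ties together:

```python
from sympy import Rational, solve, symbols

# The article's claimed relation: Z = 9*u1/8 + ms - mb  (all masses in MeV).
Z, u1, ms, mb = symbols("Z u1 ms mb", positive=True)

relation = Rational(9, 8) * u1 + ms - mb - Z   # equals zero when the relation holds

# Rearranged for the Higgs vacuum minimum u1; the article gives no numerical values.
u1_expr = solve(relation, u1)[0]
print(u1_expr)   # equivalent to 8*(Z - ms + mb)/9
```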

Likewise, how we obtained these other, presently ‘unknown,’ values isn’t at issue either, though obviously it was not achieved by empirical measure, nor is it related to this equation. Which isn’t meant to squelch natural curiosity, of course, as anyone interested in the history of these discoveries is directed to a recent essay (available from any report directory) summarizing the dimensionless scaling system of physics that generates the gamut of such fundamental physical constants. In any case, assuming I’m not lying (which is just as provable – any bets?), these ‘ill-measured masses’ contribute to this equation to give the above Z-mass, which corresponds precisely with its measured mass average. So this ‘pudding proof’ refers less to the measurable Z-mass; more importantly, it empirically implies that these three non-given fundamental masses are just as precisely determined and confirmed as proven mass values as the Z itself.

Though this empirical pudding proof is unprecedented with regard to the convincing precision of a parameter such as a strange or bottom quark mass (which can’t be directly measured and confirmed anyway), it certainly remains an outstanding example of the validity of physical measure as the bedrock of the scientific method. For the ultimate strength of the underlying numerical scaling system, which sets it apart from other modern ‘theoretical models,’ is evident from the raft of confirmable predictions it makes – predictions that are largely accessible in well-tested standard contexts (such as the Z) requiring no further experimentally contrived studies to ‘test’ whether some ‘theoretical interpretation’ is ‘correct.’

Indeed, the equation for the Z-mass itself represents multiple theoretical proofs that strengthen the outstanding empirical correspondence with the pudding of its measured mass. The first matter in this regard straddles both spheres, in that the predominant observed or theoretical decay products of a weak neutral boson are admixtures of bottom with strange and/or down quarks in heavy mesons, and it is practically the only known particle that can directly decay to a strange Bs-meson. Which, according to our equation, consists of subtracting a (like) -e/3-charged b-quark and adding a (like) +e/3-charged strange antiquark – which thus assures the charge neutrality of a Bs-meson. Then, over and above these theoretical considerations with respect to the equation’s quarks, there looms the fundamental observation of Peter Higgs concerning the origin of mass in general, and specifically with respect to the electroweak symmetry breaking by which the weak Z and W gauge bosons acquire mass from some mechanism while leaving photons massless. The above equation employs the Higgs field best termed the vacuum minimum u1, which is generically associated with the 3rd-‘generation’ bottom of the -1e/3 ‘down quark family,’ in the same sense that the heavier ‘Higgs vacuum doublet’ u2 represents a neutral pair of tops of the +2e/3 ‘up quark family.’ (Incidentally, cognoscenti, they saw evidence of the ‘light Higgs boson’ before CERN replaced the lepton collider with the Large Hadron over five years ago, which thankfully will generate the far more fundamental heavy Higgs scalar – when that pudding is ready to be taken out of their oven. [So though reaching the heavy Higgs energy will justify CERN’s efforts, its mass more importantly sets the scale for SUSY; but it should be a bigger deal still when they witness baryogenesis {the creation of nuclear matter over antimatter}! For that is not explained by any existing theory, and certainly not by any ‘standard model!’])

Actually the above equation is one of two expressions for the Z-mass; the other naturally involves its relation to the charged W-boson mass. The W itself is a predominant decay product of the heavier Higgs doublet, where convention has the +2e/3 top imparting its positive charge to the W in mediating a transformation to a -1e/3 bottom. So once again the Higgs fields impart their mass to quarks and gauge bosons, where each theoretical argument reinforces the others (there being further supporting pudding proofs involving equations for neutral and charged pairs of B-mesons that reinforce the basic equation for the Z-mass, for example). And each theoretical nuance is supported by the latest measures of equally subtle masses. But the mathematical form of these equations gives insights into theoretic and predictive empirical realms that are unavailable in any other scheme. Example: let’s give a hundred dollars (I’d make it more, but I care too much about going broke) to anyone who can find a reference containing the above equation for the Z-mass.

Having established that theoretically it’s a perfectly good equation, there should be some possibility that it is not unique. But I highly doubt it would ever have been published, especially without any knowledge of these other parameters, which I can safely assume are within my copyrights, if only because the strength of this Pudding Proof demands it. Which brings us back to the basic meaning of this old saying – the results are in the tasting and eating of the pudding. And the bottom-line test of this principle, after the above equation has been posted for six years on this web of the so-called information highway, is this: I have yet to find an individual who is capable of appreciating a pudding full of yummy plums and proofs, let alone anyone who wants to buy a small bowl, eat some, and taste the results for themselves. But real pudding isn’t intended for authorities who only speak with forked tongues; it’s made for the likes of you and me, who experience the joys of eating or speaking with one tongue – yum!


Wednesday, March 10th, 2010 | Articles

The Large Hadron Collider chosen

In December 1991, CERN’s Council delegates agreed unanimously that the Large Hadron Collider (LHC) was the right machine for further significant advances in the field of high-energy physics research and for the future of CERN.


Friday, March 5th, 2010 | Articles