Large Hadron Collider – The Large Hadron Collider in Geneva – Risks?

As the LHC should be ready to fire at full power at some point in the near future, I was wondering what the implications might be if something went wrong with the machine. I've heard theories ranging from the ultimate collapse of the space-time continuum and physics as we know it, down to nothing more than the quantum interference of a mosquito farting. Does anyone have an opinion on the LHC's risks and possible negative effects? Many thanks for your time.

Those theories aren't based on science. They have as much validity as the 2012 nonsense, and they've been replacing the 2012 questions here over the last few weeks. In fact, some trolls seem to have set up accounts specifically for the purpose of asking "Is the LHC gonna end the world?" a couple of times a day.

There aren't any risks. Collisions at these energies, and higher, occur routinely in the upper atmosphere every day as cosmic rays strike air molecules, and those natural processes have not created any black holes or ended the world so far as I know.

The largest realistic risk is that some paranoid conspiracy theorists try to destroy the LHC, causing incredible amounts of damage and hurting the people operating it. This would infuriate scientists and people who use logic around the globe, starting a war between people who use science and people who don't. Of course, the people who use science have created every weapon and communications system in the world, so they would win. Without conspiracy theorists around, there would be no impediments to scientific progress, and a wonderful new world would come about.

So that's the biggest risk to you conspiracy people.

Wednesday, March 17th, 2010 Questions Comments Off on Large Hadron Collider – The Large Hadron Collider in Geneva – Risks?

Particle Beams – Particle Beams Sending Specks of Dust Back in Time

Can a particle beam, under the right conditions, send specks of dust back in time? The answer is that, under the right conditions, this could be possible. Even more far out is the idea of sending dust to a set of coordinates in a region of four dimensions: to a point in the atmosphere to start cloud formation or make rain, or as a huge dust cloud to shield top-secret weaponry during prototype tests in the middle of a war. If you can send particles through space to a certain coordinate, into a different time, or onto another planet, you could actually terraform Mars or a distant world before you got there, by sending the seeds of life ahead to get started.

Think of it this way: we are working on sending frequencies from one area to other areas by bouncing them off rock or harder substances. And we have learned we can influence weather, both at long range and years into the future, with ELF. If you could send back into the plague periods of the past a biological pathogen that fed upon the plague but did not hurt humans, you could stop the plague from killing so many people. Would that mean people would immediately start to exist everywhere? Maybe; wouldn't that be cool? Or maybe not: you would have created a separate possible future. The terraforming aspects are quite interesting and relevant.


Also, if there was a problem that you knew of in the past, you could send in the cure to achieve wonderful results. We are already aware of the possibility of sending frequencies through time; thus the movie Frequency, which did an okay job discussing the theory. Maybe the theory is not too far beyond our present abilities to unlock the process. If you are sending a particle beam which uses a light wave to transport the particles, then you can slow the beam, bend the beam, or send the beam anywhere you wish. The problems are the power and wavelength needed, and many other very complicated mathematical equations, but it would seem not as difficult as once thought given our current technology.

Now consider if we could send back in time the things we know, or receive the things we know from those sending thoughts back in time to those able to receive such thoughts: through open-mindedness, a similar frequency in the brain to pick up the ELF signals, or those who have purposely gone out of their way to receive such data. Then we as the human race could continually receive needed information, because our brains would have adapted to do so. A brain in deep thought at 1-4 hertz, and ungrounded, as in space, in the air, or soaring in a balloon, would pick up thoughts and ELF better, since its antenna would not be grounded; it would work better for sending signals and collecting certain data transmissions sent by quark or anti-quark from future time periods. Such are the possibilities of certain wavelengths and frequencies in various realms of the spectrum of light, and of other soon-to-be-recognized scientific data we now call paranormal. Is it paranormal? Probably not, and it might not be as weird as once thought. Such signals sent out before could pass to us in the future, or we could intercept them occasionally by plugging in, listening, or attempting to receive such data.

Should we do this? Yes. It appears that the linear time we seem to so easily understand is a more difficult concept to comprehend than the idea that all time is one time, and that we are living and thinking in all time periods simultaneously, having only the foresight to see and touch what we believe to be present time. Time is a very cool concept. When people ask whether you are living in the past, present or future, your answer should be YES, we are. Because if we are all living in all time, then we are actually all one, since we are made up of DNA parts of the whole: we are all related to the originals of the past and the accumulation of the future. If we, or any individual among us, take a certain action, it affects all time, yes, the past too, since past, present and future are all one. That is actually a much simpler concept, and easier to comprehend, than the questions so many wish to ask and so many religions attempt to answer without leaving loose ends where they might actually have to prove something (a good book to read is "The God Gene"). Here are those questions:

“Where did I come from?”

“How did the world start?”

“What will happen to me when I die?” and

“What will happen to the world when it ends?”

Why do all religions move to answer these questions? Whether we understand the answers or not, and whether we choose to recognize it or not, we seem to innately need more answers. So then, for peace of mind: Is it proper to communicate with the past and future? Should we tempt fate? Will we believe what we find? Well, in a way we already communicate through time. After all, we live, have lived, or will live in the present, past and future, and while that does not span all that much in a linear sense, it is still living in and through time. So we may as well learn more and help ourselves move forward as one. This is way off on a tangent from the original mind exercise of the theory of moving dust through time, or is it? The fact is we have questions and we seek answers, and we have the technology to move into other dimensions now. So should we move to the next step and seek the answers? Do we dare? Does our society so depend on certain truths that we don't dare? Are we confident enough in our beliefs to prove ourselves right? Wrong? Then what? Does society break down? Or do we as a species grow?

If you can move dust particles, maybe you can move nanoparticles, and thus move organic building blocks through time to a place of your choosing. Move DNA, RNA, or nano-devices to another place and another time to start life with the components you wish to promote. Ah-ha: so are we someone else's experiment through time, a terraforming, a part of another? Sending instructions for how to form life to a place where life would flourish, one of Saturn's moons with water ice, or under the surface of Mars where water exists, does not require the capability of sending through time, only of sending to a nearby planet. If we can do this, should we try for a greater distance or a different time? Or is it "all relative" anyway, and are we trapped here in linear time only because we have not broken the code yet? Will supercomputers and quantum mechanics allow us the next step?

Perhaps local solar-system moons with volcanic heat, coming from the moon's or planet's interior or from a nearby sun, are a good place to start practicing this concept, since many exist in a fashion similar to Earth, with similar ratios of elements. They may not contain the exact components that make up our atmosphere, but certainly enough to support some sort of life similar to the DNA-based life we can find here on Earth. After all, human DNA and wheat DNA are quite similar. It is life as we know it, and we could send it anywhere we want, if we chose to and if we felt it appropriate from an ethics standpoint. If people could throw away their notions of ugly and pretty, based on their current perception of how their eyesight and brain interpret beauty or substance, then it might be easier, and with proper understanding it is possible. Sure, we love blue rivers, blue sky and green forests, yet is that so important that it negates the possibilities of other suitable worlds? We are missing the best part of reality when we deny ourselves the chance to study and open our minds to the data before us. Regarding all the conspiracy theories about this:


This is all fun and games, but the reality of this technology could really open the human mind to a much-needed boost of input, and allow us harmony with all species and understanding of who we are and why we are here. Open your mind and vote for more research spending on these things. Think of a system set up to allow you to receive an email from a family member in the future, or of being able, in the future, to send and receive emails from the future, describing it and warning about it. Pretty cool. Think of this in the realm of 5,000 years in the future, and arranging meetings to visit it. This could be possible within the next 100 years or less, at the rate of new technology discoveries.

What are the possibilities in the here and now for such incredible data and knowledge? Think: in the future you could meet the past twenty generations of your family, ask them questions, and they you. Wouldn't they be proud of you, and wouldn't it make you want not to let them down? Perhaps this might be the future of the village it takes to raise a child, and raise you, correctly. This has been a twilight-zone thought of the day. What say you?

Wednesday, March 17th, 2010 Articles Comments Off on Particle Beams – Particle Beams Sending Specks of Dust Back in Time

Protons Energy – What happens to the energy when 2 protons collide head-on and 100s of particles fly out in all directions?

If you are talking about particle accelerators, a lot happens to the energy. Since the protons have so much kinetic energy when they collide, they don’t simply stay as protons: they turn into hundreds of other particles, also moving quickly. Since E = mc^2, mass and energy are interchangeable, so neither is conserved separately; what is conserved is the total. In fact, if we added up the rest-mass energy (mass times c^2) and the kinetic energy of all these particles, we would recover the original two protons’ total energy.
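As a rough illustration of that bookkeeping, here is a minimal sketch assuming the LHC design energy of 7 TeV per beam and the standard proton rest energy of about 0.938 GeV:

```python
# Energy bookkeeping for a head-on proton-proton collision.
# Two 7 TeV beams give 14 TeV of total energy in the centre-of-mass frame.
PROTON_REST_ENERGY_GEV = 0.938272   # proton rest energy m*c^2, in GeV
BEAM_ENERGY_GEV = 7000.0            # LHC design energy per beam (7 TeV)

total_energy = 2 * BEAM_ENERGY_GEV  # 14 TeV available in the collision

# Since E = mc^2, that energy can materialise as new particles.
# Upper bound on how many proton rest masses the collision could pay for:
max_protons = total_energy / PROTON_REST_ENERGY_GEV

print(f"Total collision energy: {total_energy / 1000:.0f} TeV")
print(f"Equivalent proton rest masses: {max_protons:.0f}")
```

The collision of course actually produces far fewer, mostly lighter or faster particles; the figure is just the ceiling that E = mc^2 sets on how much mass the kinetic energy could be converted into.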

The resulting shower of particles then hits or bounces off other material, which the detectors in the accelerator are designed to register, since it heats the equipment up ever so slightly. The rest simply fly off until they decay into something stable.


Monday, March 15th, 2010 Questions Comments Off on Protons Energy – What happens to the energy when 2 protons collide head-on and 100s of particles fly out in all directions?

Big Bang Machine – Will the big bang machine or black hole machine destroy the world in 2010?

I mean, I've been going crazy about 2012, and this site has helped me a lot with what the calendar means, so, uh, well, thanks a lot guys. But I searched "black hole machine" and someone said they will launch next year, in September 2012? Please help me understand. I'm so confused and scared all the time.

You’re probably talking about the scare-stories some people are throwing around about the Large Hadron Collider, a new and very high energy accelerator.

That seems to be the latest thing every time a new higher-energy accelerator experiment is planned: "OMG OMG they're going to make a black hole and destroy the earth" or "OMG OMG they're going to re-create the Big Bang and destroy the earth". None of this has any actual basis in physics; it's just half-baked misunderstanding based on misreadings of press releases.

Then the experiment happens and the scare stories subside. Then there’s a new accelerator experiment planned and it starts all over again.

This is all just as unscientific and nonsensical as the 2012 stuff. The difference is that a couple of people with scientific backgrounds appear on these scare sites. However, their backgrounds tend to be in non-physics fields such as botany, and they don't know any more about the physics than the average layman.


Supplied by Yahoo Answers


Saturday, March 13th, 2010 Questions Comments Off on Big Bang Machine – Will the big bang machine or black hole machine destroy the world in 2010?

The Voltaire Lecture 2010 by Professor Brian Cox

Professor Brian Cox speaks on “The value of Big Science: CERN, the LHC and the exploration of the Universe”

6th April 2010

Chaired by Polly Toynbee, President of the British Humanist Association.

Professor Brian Cox holds a chair in particle physics at the University of Manchester and works on the ATLAS experiment at the Large Hadron Collider at CERN, near Geneva.

Full details and tickets available here


Friday, March 12th, 2010 Articles Comments Off on The Voltaire Lecture 2010 by Professor Brian Cox

LHC – Quarks, Big Bang and Large Hadron Collider (LHC)

People are curious. Curiosity is one of the key elements that drives humanity towards answers about our existence and our future. Since ancient times people have asked themselves about the origins of everything, from the universe down to the basic elements of matter. Some people many thousands of years ago assumed that if you keep dividing a piece of matter, the division must come to an end, terminating in the basic, indivisible elements that constitute matter: atoms.

Over the last few centuries, many experiments have confirmed that matter is indeed composed of small particles. The scientific approach has contributed to the discovery of various natural and synthetic substances, molecules, chemical elements and atoms. Atoms, once believed to be indivisible, were found to have a hard nucleus with electrons orbiting around it. Then it was discovered that the atomic nucleus consists of protons and neutrons. So atoms are divisible after all. This fact had many consequences; the one with the most notable effect is the fission nuclear bomb. However, the division story didn't end there: protons and neutrons were also found to contain smaller particles, quarks.

Currently the list of all elementary particles is pretty long. This list is part of the Standard Model, a model of how everything exists and interacts. It is believed that this model is not the final picture of the universe; there are still some unanswered questions. On the other hand, the universe itself is a subject of investigation. One of the key discoveries was that the universe is expanding. From this fact we can conclude that in the past the universe was smaller; the further we go into the past, the smaller it was. Sooner or later we come to a moment in time when the universe was infinitesimally small. This is called the Big Bang: the moment when the universe started to develop as we know it today, some 13.7 billion years ago. This is now the leading theory of the evolution of the universe. It is still unknown what banged, how, and why.
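That backwards extrapolation can be sketched with a one-line estimate: the inverse of the Hubble constant sets the characteristic timescale of the expansion. The sketch below assumes H0 ≈ 71 km/s/Mpc; the quoted 13.7-billion-year age actually comes from a full cosmological fit, which this crude estimate happens to land close to:

```python
# Rough age-of-universe estimate: the Hubble time t = 1/H0.
H0_KM_S_PER_MPC = 71.0     # assumed Hubble constant, km/s per megaparsec
KM_PER_MPC = 3.0857e19     # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7

h0_per_second = H0_KM_S_PER_MPC / KM_PER_MPC            # H0 in units of 1/s
hubble_time_years = 1.0 / h0_per_second / SECONDS_PER_YEAR

print(f"Hubble time: {hubble_time_years / 1e9:.1f} billion years")
```

This comes out near 13.8 billion years, which is why the Hubble constant is often described as (roughly) the inverse age of the universe.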

The latest project to find some of the missing answers is the Large Hadron Collider (LHC) at CERN, near Geneva. It is a giant ring 100 meters underground where two beams of particles travelling close to light speed collide. Each collision produces an enormous number of other particles, and analysis of this debris will hopefully answer some questions about the nature of particles, or even raise new ones. Because of the enormous collision energy (about 14 TeV), the conditions will be close to those immediately after the Big Bang. The LHC is currently the largest and most expensive scientific project.
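To get a feel for "close to light speed," one can compute the Lorentz factor of a single proton. This is a sketch using the 7 TeV design energy per beam (at the 2010 restart the machine actually ran at 3.5 TeV per beam):

```python
import math

PROTON_REST_ENERGY_GEV = 0.938272   # proton rest energy m*c^2, in GeV
BEAM_ENERGY_GEV = 7000.0            # 7 TeV per proton at design energy

# Lorentz factor: gamma = E / (m c^2)
gamma = BEAM_ENERGY_GEV / PROTON_REST_ENERGY_GEV

# Fraction of light speed: beta = sqrt(1 - 1/gamma^2)
beta = math.sqrt(1.0 - 1.0 / gamma**2)

print(f"gamma = {gamma:.0f}")
print(f"v/c   = {beta:.9f}")
```

A 7 TeV proton has a Lorentz factor of roughly 7,500, putting it within about a hundred-millionth of the speed of light.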

Answering questions about the micro and macro worlds will not only satisfy our curiosity but will also help us understand the world. If we understand the world, we can make it better. And a better world is a dream of everybody.



Thursday, March 11th, 2010 Articles Comments Off on LHC – Quarks, Big Bang and Large Hadron Collider (LHC)

High Energy Physics – Pure Derivation of the Exact Fine-Structure Constant As a Ratio of Two Inexact Metric Constants

Theorists at the Strings Conference in July of 2000 were asked what mysteries remain to be revealed in the 21st century. Participants were invited to help formulate the ten most important unsolved problems in fundamental physics, which were finally selected and ranked by a distinguished panel of David Gross, Edward Witten and Michael Duff. No questions were more worthy than the first two problems respectively posed by Gross and Witten: #1: Are all the (measurable) dimensionless parameters that characterize the physical universe calculable in principle or are some merely determined by historical or quantum mechanical accident and incalculable? #2: How can quantum gravity help explain the origin of the universe?

A newspaper article about these millennial mysteries expressed some interesting comments about the #1 question. Perhaps Einstein indeed “put it more crisply: Did God have a choice in creating the universe?” – which summarizes quandary #2 as well. While certainly the Eternal One ‘may’ have had a ‘choice’ in Creation, the following arguments will conclude that the reply to Einstein’s question is an emphatic “No.” For even more certainly a full spectrum of unprecedented, precise fundamental physical parameters are demonstrably calculable within a single dimensionless Universal system that naturally comprises a literal “Monolith.”

Likewise the article went on to ask if the speed of light, Planck’s constant and electric charge are indiscriminately determined – “or do the values have to be what they are because of some deep, hidden logic. These kinds of questions come to a point with a conundrum involving a mysterious number called alpha. If you square the charge of the electron and then divide it by the speed of light times Planck’s (‘reduced’) constant (multiplied by 4π times the vacuum permittivity), all the (metric) dimensions (of mass, time and distance) cancel out, yielding a so-called “pure number” – alpha, which is just over 1/137. But why is it not precisely 1/137 or some other value entirely? Physicists and even mystics have tried in vain to explain why.”

Which is to say that while constants such as a fundamental particle mass can be expressed as a dimensionless relationship relative to the Planck scale, or as a ratio to a somewhat more precisely known unit of mass, the inverse of the electromagnetic coupling constant alpha is uniquely dimensionless as a pure ‘fine-structure number’ α⁻¹ ≈ 137.036. On the other hand, assuming a unique, invariantly discrete or exact fine-structure numeric exists as a “literal constant,” the value must still be empirically confirmed as a ratio of two inexactly determinable ‘metric constants,’ ħ and the electric charge e (light speed c being exactly defined, in the 1983 adoption of the SI convention, as an integer number of meters per second).
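That defining ratio is easy to check numerically. Below is a minimal sketch using the 2006 CODATA values for ħ and e; under the pre-2019 SI definitions, ε0 follows exactly from μ0 = 4π × 10⁻⁷ and the defined c, so α comes out as a pure number limited only by the measured inputs:

```python
import math

# Pre-2019 SI: c and mu0 are exactly defined, so epsilon0 is exact too.
C = 299_792_458.0             # speed of light, m/s (exact by definition)
MU0 = 4.0 * math.pi * 1e-7    # vacuum permeability (exact, pre-2019 SI)
EPS0 = 1.0 / (MU0 * C**2)     # vacuum permittivity

# 2006 CODATA measured values, the inexact "metric constants" of the title:
HBAR = 1.054_571_628e-34      # reduced Planck constant, J*s
E = 1.602_176_487e-19         # elementary charge, C

# alpha = e^2 / (4*pi*eps0 * hbar * c), a dimensionless pure number
alpha = E**2 / (4.0 * math.pi * EPS0 * HBAR * C)
print(f"1/alpha = {1.0 / alpha:.9f}")
```

Running this reproduces an inverse of about 137.036, in line with the ħ- and e-derived NIST figures tabulated below; the point of the exercise is that all the metric units cancel, leaving only the measurement uncertainty of ħ and e.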

So though this conundrum has been deeply puzzling almost from its inception, my impression upon reading this article in a morning paper was utter amazement that a numerological issue of invariance merited such distinction by eminent modern authorities. For I had been obliquely obsessed with the fs-number in the context of my colleague A. J. Meyer’s model for a number of years, but had come to accept its experimental determination in practice, pondering the dimensionless issue periodically to no avail. Gross’s question thus served as a catalyst out of my complacency, as I recognized my unique position as the only fellow who could provide a categorically complete and consistent answer in the context of Meyer’s main fundamental parameter. Still, my pretentious instincts led to two months of inane intellectual posturing until I sanely repeated a simple procedure explored a few years earlier. I merely looked at the result using the 98-00 CODATA value of α, and the following solution immediately struck with full heuristic force.

For the fine-structure ratio effectively quantizes (via ħ) the electromagnetic coupling between a discrete unit of electric charge (e) and a photon of light, in the same sense that the integer 241 is discretely ‘quantized’ compared to the ‘fractional continuum’ between it and 240 or 242. One can see what this means by the following construction: take the integer 203 and subtract the 2-based exponential of the square of 2π; now add the inverse of 241 to the resultant number, multiplying the product by the natural log of 2. It follows that this pure calculation of the fine-structure number exactly equals 137.0359996502301… – which here (/100) is given to 15 decimal places, but is calculable to any number of decimal places.

By comparison, given the experimental uncertainty in ħ and e, the NIST evaluation varies up or down around the middle 6 of the ‘965’ in the invariant sequence defined above. The following table accordingly gives the values of ħ and e, their calculated ratio as α⁻¹, and the actual NIST choice for α⁻¹ in each year of their archives, as well as the 1973 CODATA, where the standard experimental uncertainty in the last digits is given in parentheses.

year     ħ = Nh × 10^-34 J·s    e = Ne × 10^-19 C     α⁻¹ from ħ and e     NIST value ±(SD)
2006:    1.054 571 628(53)      1.602 176 487(40)     137.035 999 661      137.035 999 679(94)
2002:    1.054 571 68(18)       1.602 176 53(14)      137.035 999 062      137.035 999 11(46)
1998:    1.054 571 596(82)      1.602 176 462(63)     137.035 999 779      137.035 999 76(50)
1986:    1.054 572 66(63)       1.602 177 33(49)      137.035 989 558      137.035 989 5(61)
1973:    1.054 588 7(57)        1.602 189 2(46)       –                    137.036 04(11)

So it seems the NIST choice is roughly determined by the measured values for ħ and e alone. However, as explained at , by the 1980s interest shifted to a new approach that provides a direct determination of α by exploiting the quantum Hall effect, independently corroborated with both theory and experiment on the electron magnetic-moment anomaly, thus reducing the already finely tuned uncertainty. Yet it took 20 years before an improved measurement of the magnetic-moment g/2 factor was published in mid-2006, where this group’s (led by Gabrielse) first estimate for α⁻¹ was (A:) 137.035 999 710(96), explaining the much-reduced uncertainty in the new NIST list compared to that in ħ and e. More recently, however, a numeric error in the initial QED calculation (A:) was discovered (we’ll refer to the second paper as B:), which shifted the value of α⁻¹ to (B:) 137.035 999 070(98).

Though it reflects a nearly identically small uncertainty, this assessment is clearly outside the NIST value concordant with the estimates for ħ and elementary charge, which are independently determined by various experiments. The NIST has three years to sort this out, but meanwhile faces an embarrassing irony in that at least the 2006 choices for ħ and e seem to be slightly skewed toward the expected fit for α! For example, adjusting the last three digits of the 2006 data for ħ and e to accord with our pure fs-number yields an imperceptible adjustment to e alone, into the ratio h628/e487.065. Had the QED error been corrected prior to the actual NIST publication in 2007, it rather easily could have been evenly adjusted to h626/e489, though that would call into question its coherency in the last three digits of α with respect to the comparative 2002 and 1998 data. In any case, far vaster improvements in multiple experimental designs will be required for a comparable reduction in error for ħ and e in order to settle this issue for good.

But again, even then, no matter how ‘precisely’ metric measure is maintained, it is still infinitely short of ‘literal exactitude,’ while our pure fs-number fits the present values of h628/e487 quite precisely. In the former regard, I recently discovered that a mathematician named James Gilson (see ) also devised a pure numeric = 137.0359997867…, nearer the revised 98-01 standard. Gilson further contends he has calculated numerous parameters of the standard model, such as the dimensionless ratio between the masses of the Z and W weak gauge bosons. But I know he could never construct a single proof employing equivalences capable of deriving the Z and/or W masses per se from the precisely confirmed masses of heavy quarks and Higgs fields (see the essay referenced in the resource box), which themselves result from a single over-riding dimensionless tautology. For the numeric discreteness of the fraction 1/241 allows one to construct physically meaningful dimensionless equations. If one instead took Gilson’s numerology, or the refined empirical value of Gabrielse et al., for the fs-number, it would destroy this discreteness, this precise self-consistency, and the ability to even write a meaningful dimensionless equation! By contrast, perhaps it is then not too surprising that after I literally ‘found’ the integer 241 and derived the exact fine-structure number from the resultant ‘Monolith Number,’ it took only about two weeks to calculate all six quark masses utilizing real dimensionless analysis and various fine-structured relations.

But as we now aren’t really talking about the fine-structure number per se, any more than the integer 137, the result definitively answers Gross’s question. For those “dimensionless parameters that characterize the physical universe” (including alpha) are ratios between selected metric parameters, which lack a single unified dimensionless system of mapping from which metric parameters like particle masses could be calculated from set equations. The ‘standard model’ gives one a single system of parameters, but no means to calculate or predict any one of them, let alone all of them, within a single system; thus the experimental parameters are put in by hand, arbitrarily.

Final irony: I am doomed to be demeaned as a ‘numerologist’ by ‘experimentalists’ who continually fail to recognize a hard empirical proof for quark, Higgs or hadron masses that may be used to exactly calculate the present standard for the most precisely known and heaviest mass in high-energy physics (the Z). So, au contraire, foolish ghouls: empirical confirmation is just the final cherry the chef puts on top before he presents a “pudding proof” no sentient being could resist just because he didn’t assemble it himself, and so instead makes a mimicked mess the real deal doesn’t resemble. For the base of this pudding is made from melons I call Mumbers, which are really just numbers, pure and simple!


Thursday, March 11th, 2010 Articles Comments Off on High Energy Physics – Pure Derivation of the Exact Fine-Structure Constant As a Ratio of Two Inexact Metric Constants

The LHC 2010 – 2011

An overview of the Large Hadron Collider for the next 18 to 24 months, with the start of physics at 7 TeV, by Rolf Heuer, Director General of CERN

read the full article here


Wednesday, March 10th, 2010 Articles Comments Off on The LHC 2010 – 2011