Nature, however, always has the last word. Is supersymmetry a property of the physical world, or just interesting mathematics? As yet there is no direct evidence for supersymmetry. It is attractive to theorists both for its elegance and because it makes certain features of the Standard Model occur more naturally.
At best it is imperfectly realized. Perfect symmetry requires equal mass pairings of particles and their superpartners. No such pairings are found among the known particles, and thus a whole family of superpartners must be postulated.
However, valid symmetries of the fundamental laws can be obscured by the existence of pervasive fields, called condensates, in the vacuum. Supersymmetry could be such a hidden symmetry. The superpartners of the known particles would then be as-yet-undiscovered massive particles, and many versions of supersymmetry, in particular those that account best for the merging of the three couplings, predict that these particles should be found at masses accessible with existing or planned accelerators.
Searches for these particles may soon reveal or exclude these versions of supersymmetry theory.
Because the superpartners are expected to be very massive, they have not yet been directly observed. General relativity incorporated several concepts quite new to physics, including the curvature of space-time and the bending of light, and led to the prediction of other completely new phenomena, including gravitational radiation, the expanding universe, and black holes.
General relativity was widely accepted and admired by physicists almost from the start. For many years, however, it was not very relevant to the rest of physics; it made few testable new predictions. Well into the twentieth century, textbooks spoke of only three tests of relativity: the advance of the perihelion of Mercury, the gravitational redshift of light emitted from the Sun and other massive bodies, and the bending of light by the Sun. In recent years the picture has changed completely, mainly because of revolutionary improvements in high-precision instrumentation and in the observational techniques of astronomy.
The accuracy of each of these three tests now exceeds a part in a thousand, and numerous new precise tests, unimagined by Einstein, have been successfully performed. Over a thousand neutron stars have been found in the form of pulsars; their strong gravitational fields can be adequately described only by using general relativity. A binary system containing a very remarkable pulsar was discovered by Russell Hulse and Joseph Taylor and then studied extensively by Taylor and other collaborators. The orbital motion of the pulsar was measured with great accuracy, thereby enabling precision tests of general relativity.
Most dramatic was the observation that, as a consequence of the emission of energy into gravitational radiation, the total energy of the orbital motion decreases with time at a rate predicted by general relativity. The agreement is better than one-third of 1 percent. Today, general relativistic corrections to the flow of time on orbiting satellites, as compared with the rate on Earth, are an essential part of the Global Positioning System (GPS), which allows commercial and military users to calculate a precise location on the surface of the Earth and to transfer accurate time readings using triangulation with satellite signals.
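The size of the GPS clock correction can be estimated with a back-of-the-envelope calculation. This is only an order-of-magnitude sketch, not the full GPS correction model; the orbital radius and physical constants below are rounded, assumed values.

```python
# Rough estimate of the daily relativistic clock offset for a GPS satellite.
# All constants are approximate; this is an order-of-magnitude sketch.
import math

GM = 3.986004e14      # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8      # speed of light, m/s
R_earth = 6.371e6     # mean Earth radius, m
r_orbit = 2.656e7     # GPS orbital radius (~20,200 km altitude), m
seconds_per_day = 86400.0

# General relativity: clocks higher in Earth's gravity well run faster.
grav_rate = GM / c**2 * (1 / R_earth - 1 / r_orbit)

# Special relativity: the moving satellite clock runs slower.
v = math.sqrt(GM / r_orbit)            # circular orbital speed, ~3.9 km/s
kinematic_rate = -v**2 / (2 * c**2)

net_microseconds_per_day = (grav_rate + kinematic_rate) * seconds_per_day * 1e6
print(f"gravitational blueshift: {grav_rate * seconds_per_day * 1e6:+.1f} us/day")
print(f"velocity time dilation:  {kinematic_rate * seconds_per_day * 1e6:+.1f} us/day")
print(f"net offset:              {net_microseconds_per_day:+.1f} us/day")
```

The net effect comes out to several tens of microseconds per day; left uncorrected, position errors would accumulate at the rate of kilometers per day, which is why the correction is essential rather than a curiosity.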
Numerous convincing black hole candidates have been identified through astronomical observations. They fall into two classes. One class, arising from the collapse of stars, has masses ranging from a few times that of the Sun to around 10 times that of the Sun and radii of a few kilometers. The second class, typically found at the centers of galaxies, can have masses millions to billions of times that of the Sun, and radii comparable to that of the solar system. There is compelling evidence that our own galaxy contains such a black hole.
It is probable that the most violently energetic objects in the universe, the quasars, are powered by accretion of matter onto such gigantic spinning black holes. Developments in general relativistic cosmology have been still more remarkable. The theory of the expanding universe has been resoundingly verified by observation of the velocities of distant objects. The gravitational redshift of spectral lines, once an exotic, difficult test of general relativity, is becoming a standard tool of astronomy.
The bending of light, first observed as a tiny apparent angular displacement for a star with light grazing the Sun during a solar eclipse, is now the basis of a fruitful technique to map dark matter using gravitational lensing. The mass of intervening galaxies is observed, in many cases, to distort the light from more distant sources drastically, even producing multiple images.
This provides a way to search for massive objects that produce no light or other detectable radiation. In all these applications, the predictions of general relativity have not been contradicted. Despite these great successes, there are compelling reasons to think that general relativity is not the last word on gravity. A primary stumbling block for understanding the physics of the very earliest moments and the birth of the universe itself is the lack of progress in developing a consistent theory that includes both general relativity and quantum mechanics.
The difficulties are similar to, but more severe than, the difficulties discussed above in connection with the history of quantum electrodynamics. All the successful applications of general relativity have used equations that ignore quantum corrections. When the corrections are actually calculated, they are found to be infinite. Again, one can follow the procedure used in QED and improve the calculation by taking into account the effects of virtual particle clouds on, say, the interaction of a particle with gravity.
However, although the infinities are then avoided, the ability to calculate the behavior of the particle at high energies is lost, because the clouds interact strongly and in a very complex manner. Quantum effects also imply that black holes are not completely black; rather, they radiate. The radiation rate is far too small to be detectable for any of the known black holes, but it has serious consequences. A fundamental requirement of quantum mechanics is a specific connection between the future and the past. But if black holes, which have swallowed objects carrying quantum information from the past, can evaporate by radiating in a random fashion, this connection is apparently broken.
Many believe this leads to a paradox whose ultimate resolution will bring deep insights into the quantum nature of space and time. The particular size and shape of our solar system almost certainly arose from the specific details of its history; other planetary systems elsewhere are quite different. Yet the universe as a whole has some strikingly simple features. Such features beg for a theory to explain them. Among the most striking features of the universe as a whole are its approximate homogeneity and its spatial flatness.
Homogeneity means that any large region of the universe of a given age looks very much like any other large region at the same age. Spatial flatness means that space (as opposed to space-time) is not curved on large scales. Both of these properties of the universe have now been observed and measured with considerable precision, through study of the microwave background radiation. Neither homogeneity nor spatial flatness is required by classical general relativity, but they are allowed.
The question then arises: Why is our universe so homogeneous and flat? It is possible that these properties would emerge from a correct, quantum-mechanical treatment of the earliest moments. But since no one knows how to calculate the behavior of quantum gravity at high energies, such speculation is difficult to test, or even to codify. Some physicists believe that these problems can be solved by delving more deeply into general relativity itself. But others believe that the solution will necessarily involve an integration of gravity with the other forces of nature.
As the discussion below indicates, some intriguing progress has recently been made toward a synthesis of general relativity, the theory of space-time, with our current understanding of the other forces of nature. In most laboratory situations, gravity, as a force between elementary particles, is very much weaker than the strong, the electromagnetic, and even the weak interactions. For this reason, it has been possible to develop an extremely complete and accurate theory of subatomic and subnuclear processes (the Standard Model) while ignoring gravity altogether.
But since all objects attract one another gravitationally, the power of gravity is cumulative; on cosmic scales it comes to dominate the other forces. Traditionally, there has been a division of labor between the study of matter, on the one hand, and the study of gravitation and cosmology, on the other. A major theme of this report, however, is that this division is becoming increasingly artificial. Physicists, eager to build on their successful theories of matter and of space-time, seek to create an overarching theory of space-time-matter.
To understand the earliest times in the universe and the extreme conditions near black holes will require such a theory. New approaches to tackle these problems are, as yet, speculative and immature. However, the consequences for the view of the universe and for its history at the earliest times are profound. The homogeneity and spatial flatness of the universe can both be explained by assuming that the universe, early in its history, underwent a period of exceptionally rapid expansion.
Expansion tends to diminish the spatial curvature, just as blowing up a balloon makes its surface appear flatter. The enormous expansion associated with inflation means that the universe we see today began from a very tiny region of space that could have been smooth before inflation. While inflation cannot completely eliminate the dependence of the state of the universe today upon its initial state, it does greatly lessen that dependence. Inflation theory is all the more plausible, and exciting, because it can be grounded in particle physics ideas about unification and symmetry breaking.
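The balloon analogy can be made quantitative with a standard textbook argument from the Friedmann equation (conventional cosmological notation; a generic sketch, not tied to any specific inflation model):

```latex
% Friedmann equation with spatial curvature k, scale factor a, H = \dot a / a:
H^2 = \frac{8\pi G}{3}\,\rho \;-\; \frac{k}{a^2}
\qquad\Longrightarrow\qquad
\left|\Omega - 1\right| = \frac{|k|}{a^2 H^2}.
% During inflation H is nearly constant and a(t) \propto e^{Ht}, so
\left|\Omega - 1\right| \;\propto\; e^{-2Ht} \;\longrightarrow\; 0,
% i.e., any initial spatial curvature is exponentially diluted.
```

A few dozen e-foldings of such expansion are enough to drive any reasonable initial curvature below today's observational limits, which is why inflation explains flatness so naturally.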
The unified theories require the existence of condensates, whose space-filling presence makes the symmetry of the fundamental equations less manifest but more consistent with observation.
Symmetry breaking by a condensate also occurs, in a somewhat different way, in the strong interaction, in the theory of superconductivity, and in many other examples in the physics of materials. It is not an extravagant novelty. In all physical examples of condensates, when the temperature is raised sufficiently, the condensate evaporates or melts away. Such a phase transition occurs when ice melts to become water. The laws of physics at the higher temperature look quite different—they have more symmetry.
Another example may be useful. In an ordinary magnet, the spins of the atoms are aligned at low temperatures because the total energy of the system is lower in such a configuration. This alignment obscures the isotropy of space by making it appear that there is a preferred direction (the direction in which the spins are aligned). At high temperatures, the energy advantage associated with aligned spins is no longer important, the spins of the individual atoms are no longer aligned, and the isotropy of space is no longer obscured: the broken symmetry at low temperatures is restored at high temperatures.
In a cosmological context, the consequences of a phase transition can be dramatic: as the universe cools through such a transition, energy can be stored temporarily in the condensate. That energy is in a very unusual form—not as particle mass or motion but as field energy, or false vacuum energy. False vacuum energy has quite different gravitational properties from other forms of energy. It turns out that if a large amount of vacuum energy is dissipated only slowly, it causes a period of inflation, of exponentially rapid expansion of the universe. As is discussed in later chapters, observational cosmology has recently yielded powerful hints that inflation occurred.
The ideas of particle physics suggest why it might have occurred. But as yet there is no single convincing, specific model for inflation. Existing models contain many arbitrary assumptions and are ad hoc. While they show that inflation has a plausible theoretical basis, they are certainly unsatisfactory in detail. Thus to understand properly this central facet of cosmology may require the development of a more complete unified theory of gravity and matter. Another simple yet profound property of the known universe is that it is made of matter rather than antimatter.
More specifically, distant stars and galaxies are all made out of protons and neutrons, while antiprotons and antineutrons are very rare in the universe. In the Standard Model, at low temperature, the number of protons minus antiprotons (or, to be more precise, the number of quarks minus antiquarks) cannot change. If that were the whole story, the asymmetry between matter and antimatter would simply be an initial condition from the big bang, locked in for all time. There would be no deeper explanation of it, nor any deduction of its magnitude from the laws of physics.
But the unified theories, as discussed above, include interactions that change quarks into antiquarks or other particles. Thus the number of quarks minus antiquarks is not frozen in; rather, it can evolve with time. Indeed, if any such processes occur, then at sufficiently high temperature symmetry will be restored, and there will be equal numbers of quarks and antiquarks. The present-day universe, where matter dominates antimatter, must have evolved from past equality.
So the stage is set for a dynamical calculation of the universal difference between quark and antiquark densities. Many models have been considered. With some assumptions, it is possible to achieve agreement with observation, although not with the Standard Model alone. As was the case for inflation, in order to develop a proper, convincing theory of matter-antimatter asymmetry, physicists need a deeper theory.
Perhaps the most tangible hint of new physics from cosmology is the problem of dark matter. A wide variety of astronomical measurements indicate that the universe contains considerably more matter than can be accounted for by ordinary matter in any known form. This additional mass, dark matter, is not seen directly but rather through the effect of its gravity on the motion or the light of visible objects.
Here arises a truly extraordinary opportunity for discovery—what is this stuff that makes up most of the universe by weight? To heighten the tension, developments in particle physics suggest two quite specific, very plausible candidate particles. Indeed, each of these candidates was proposed for theoretical reasons unrelated to the dark matter problem, and only later was their potential to solve it realized.
One candidate arises from the idea of supersymmetry, which requires a family of superpartner particles. One of these is a light, stable, neutral fermion called the neutralino, a leading candidate for a major component of the dark matter. The other leading candidate is a hypothetical particle called the axion. It appears as a consequence of theoretical extensions introduced to solve a quite different problem in the foundations of the Standard Model.
The axion is a very light particle but could have been produced very copiously during the big bang. The special detectors needed to search for axions are very different in detail from those that can search for neutralinos. But, as in the neutralino case, first-generation experiments exist, and improvements to reach the needed sensitivity are feasible.
Finally, the most intriguing and most recent hint for new physics from cosmology is the observation that the expansion of the universe is speeding up, rather than slowing down. If correct, this indicates the presence of a mysterious energy form—dark energy—that pervades the universe with a gravitational effect that is repulsive rather than attractive. While particle physics has had much to say about dark matter, thus far it has shed little or no light on dark energy. Nonetheless, it seems clear that a fundamental understanding of this new form of energy will require new physics. Dark energy and dark matter are discussed at greater length in Chapter 5.
Theoretical physicists have long sought to extend the range of applicability of their theories, synthesize the explanations of diverse physical phenomena, and unify the underlying principles. After the towering achievements of the last century, briefly reviewed in the previous sections, there is better material to work with than ever before—a remarkably detailed, specific, and powerful theory of matter, and a beautiful, fruitful theory of space-time. Can they be joined together? There are good reasons to be optimistic.
This discussion has reviewed how the unification of interaction strengths could arise, despite their different observed values, as a result of the effects of quantum corrections. The underlying equality of the strong, weak, and electromagnetic couplings emerges only when they are extrapolated to very high energy. Extending this calculation to include the gravitational coupling as well yields a delightful surprise: the extrapolated gravitational coupling meets the others at nearly the same high energy (see Figure 2).
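The extrapolation of the three gauge couplings can be sketched numerically with the standard one-loop renormalization-group formula. The M_Z-scale values and the supersymmetric (MSSM) beta coefficients below are approximate textbook numbers, and this toy calculation is illustrative rather than a precision fit; it also omits the gravitational coupling mentioned above.

```python
# One-loop running of the three inverse gauge couplings:
#   alpha_i^-1(Q) = alpha_i^-1(M_Z) - (b_i / 2*pi) * ln(Q / M_Z)
# Values at M_Z and the MSSM one-loop beta coefficients are standard
# approximate numbers.
import math

M_Z = 91.19                        # GeV
alpha_inv_MZ = [59.0, 29.6, 8.45]  # U(1) (GUT-normalized), SU(2), SU(3)
b_mssm = [33 / 5, 1.0, -3.0]       # one-loop MSSM beta coefficients

def alpha_inv(i, Q):
    """Inverse coupling i at scale Q (in GeV), one-loop running."""
    return alpha_inv_MZ[i] - b_mssm[i] / (2 * math.pi) * math.log(Q / M_Z)

# Scale where couplings 1 and 2 cross:
t = 2 * math.pi * (alpha_inv_MZ[0] - alpha_inv_MZ[1]) / (b_mssm[0] - b_mssm[1])
Q_gut = M_Z * math.exp(t)
print(f"couplings 1 and 2 cross near Q = {Q_gut:.2e} GeV")
for i in range(3):
    print(f"  alpha_{i + 1}^-1 at that scale = {alpha_inv(i, Q_gut):.1f}")
```

With these supersymmetric coefficients the third coupling arrives at nearly the same value at the crossing scale, around 10^16 GeV, which is the near-miss-turned-meeting that the text describes.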
Is nature hinting at unification of all forces? The most ambitious and impressive attempts to construct a unified space-time-matter theory involve an evolving set of ideas known variously as string theory, superstring theory, or M theory. String theory is not yet fully developed; so far, no specific predictions about the physical world have emerged. But even the current partial understanding suggests to many physicists that string theory may be on the right track. This report is not able to do justice to what has become a large and mathematically sophisticated body of work; it confines itself to a few brief indications.
Remarkably, this theory predicts the existence of gravity. Moreover, the resulting theory of gravity, unlike conventional general relativity, does not suffer from the problem of infinite quantum corrections. Further, it appears that string theory avoids the apparent paradox associated with Hawking radiation, by showing that the radiation emitted from black holes is not at all random.
Thus string theory offers the only known solution to two major theoretical problems that emerge when quantum mechanics is applied to gravity. Clearly, this is a direction to be pursued. String theory is most naturally formulated in 10 or 11 space-time dimensions; it cannot be made consistent using only the observed 4. In constructing models of the physical world, one must assume that most of these dimensions somehow curl up, leaving the familiar 4 (3 space, 1 time) extended dimensions.
At first this may sound artificial, but many solutions of the theory having this behavior are known. Some even contain particles and interactions that broadly resemble the Standard Model, and they can incorporate low-energy supersymmetry, unification of couplings, and axions. Unfortunately there are also apparently equally valid solutions that do not have these features.
No completely satisfactory theoretical reason for preferring one model to another has yet emerged. Nor is any single known solution empirically satisfactory in all respects.
A key feature of string theory is supersymmetry, the symmetry that relates matter particles and the force carriers (see Box 2). Finally, any theory of space-time-matter must address what seems at present to be the most profoundly mysterious question in physical science. Researchers know that the vacuum state is anything but trivial: it is populated by condensates and by ceaselessly fluctuating virtual particles.
One might think all this structure would contain energy. The definition of zero energy can be arbitrarily adjusted in many theories, but once the adjustment is made in one epoch of the universe it cannot be altered. One would therefore expect the effects of quantum corrections to give a vacuum energy in all epochs. Indeed, as argued above, this can account for the early inflationary epoch. Straightforward estimates of the expected scale of this energy in the present epoch give values far in excess of what is allowed experimentally.
This is called the problem of the cosmological constant, because the mathematical description of the energy of the vacuum is equivalent to the cosmological constant originally introduced by Einstein to keep the universe from expanding or contracting. The discrepancy, depending on how the estimates are made, is at least a factor of 10^55, and indicates a major gap in our understanding of the vacuum and gravity. Until very recently, it seemed reasonable to hope that some yet-undiscovered symmetry would require that all the sources of energy must cancel, so that empty space would have exactly zero energy today.
But recent measurements indicate that the energy of the vacuum, while absurdly small compared with the theoretical estimates mentioned above, is not zero see Chapter 4. For the other problems mentioned here, physicists have identified some very promising lines of attack. But for the cosmological constant problem, some fundamentally new idea seems required.
Many of the challenging questions today could not even be formulated only a few years ago. The experimental and observational data and techniques at hand today are beginning to provide access to information directly relevant to our questions. In 1905, Einstein proposed that light itself comes in discrete packets of energy, later called quanta. With this new way to envision light, he offered insights into the behavior of nine different phenomena, including the specific colors that Planck described being emitted from a light-bulb filament.
It also explained how certain colors of light could eject electrons off metal surfaces, a phenomenon known as the "photoelectric effect." In a paper titled "The Photoelectric Effect: Rehabilitating the Story for the Physics Classroom," Klassen states that Einstein's energy quanta aren't necessary for explaining all of those nine phenomena. Certain mathematical treatments of light as a wave are still capable of describing both the specific colors that Planck described being emitted from a light-bulb filament and the photoelectric effect.
Roughly two decades after Einstein's paper, the term "photon" was popularized for describing energy quanta, thanks to the work of Arthur Compton, who showed that light scattered by an electron beam changed in color. This showed that particles of light (photons) were indeed colliding with particles of matter (electrons), thus confirming Einstein's hypothesis. By now, it was clear that light could behave both as a wave and a particle, placing light's "wave-particle duality" into the foundation of QM. Since the discovery of the electron, evidence that all matter existed in the form of particles was slowly building.
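Compton's color change has a simple formula: the scattered wavelength shifts by an amount set by the electron's Compton wavelength and the scattering angle. A minimal sketch, using rounded CODATA constants:

```python
# Compton scattering: light scattered off an electron shifts in wavelength
# by an angle-dependent amount.  Constants are rounded CODATA values.
import math

h = 6.62607015e-34       # Planck's constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s

lambda_compton = h / (m_e * c)   # electron's Compton wavelength, ~2.43 pm

def compton_shift(theta_deg):
    """Wavelength shift (in meters) for scattering angle theta."""
    return lambda_compton * (1 - math.cos(math.radians(theta_deg)))

print(f"Compton wavelength: {lambda_compton:.3e} m")
print(f"shift at 90 deg:    {compton_shift(90):.3e} m")
print(f"shift at 180 deg:   {compton_shift(180):.3e} m (maximum, back-scatter)")
```

The shift is picometer-scale, which is why Compton needed X-rays, whose wavelengths are comparable to the shift itself, to see the effect at all.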
Perhaps wave-particle duality could ring true for matter as well? The first scientist to make substantial headway with this reasoning was a French physicist named Louis de Broglie, who proposed that electrons, too, behave as waves. One stipulation of the resulting model was that the ends of the wave that forms an electron must meet, forming a standing wave. In "Quantum Mechanics in Chemistry," 3rd Ed. (Benjamin), Melvin Hanna writes, "The imposition of the boundary conditions has restricted the energy to discrete values." Unlike the circular orbits of the Rutherford-Bohr model, atomic orbitals have a variety of shapes ranging from spheres to dumbbells to daisies.
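Hanna's point, that boundary conditions force discrete energies, can be illustrated with the textbook particle-in-a-box model (not the specific system he discusses; the 0.1 nm box width is an arbitrary, roughly atom-sized choice):

```python
# Particle in a box: requiring the wave to vanish at the walls restricts
# the energy to discrete values, E_n = n^2 h^2 / (8 m L^2).
# The box width L is an illustrative, roughly atomic-sized choice.
h = 6.62607015e-34       # Planck's constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
eV = 1.602176634e-19     # joules per electron-volt
L = 1e-10                # box width, m (about one atomic diameter)

def energy_level(n):
    """Energy of the n-th standing-wave state, in eV."""
    return n**2 * h**2 / (8 * m_e * L**2) / eV

for n in range(1, 4):
    print(f"E_{n} = {energy_level(n):6.1f} eV")
# Only these discrete values are allowed; nothing in between.
```

The allowed energies grow as n squared, so the gaps between levels are large at atomic sizes, which is exactly the quantization that classical mechanics could not supply.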
This was yet another problem that had been unsolvable using the math of classical mechanics. These insights gave rise to the field of "quantum chemistry." Heisenberg then made another major contribution to quantum physics. He reasoned that since matter acts as waves, some properties, such as an electron's position and speed, are "complementary," meaning there's a limit (related to Planck's constant) to how well the precision of each property can be known. This uncertainty principle applies to everyday-size objects as well, but is not noticeable because the lack of precision is extraordinarily tiny.
The principles of quantization, wave-particle duality and the uncertainty principle ushered in a new era for QM. Paul Dirac applied a quantum understanding of electric and magnetic fields to give rise to the study of "quantum field theory" (QFT), which treated particles such as photons and electrons as excited states of an underlying physical field. A famous thought experiment probing quantum measurement is Schrödinger's cat. The setup is a little more complicated than you probably thought, though, and goes like this: inside a sealed box, a piece of radioactive rock is placed beside a Geiger counter that will register if any radioactivity is emitted.
Since radioactive emission is a spontaneous quantum process, with no cause, the rock is in a superposition of having emitted and not having emitted radiation until there is some kind of measurement. That Geiger counter is then connected to a hammer that will break a vial of cyanide if it falls. Registering any radiation will cause the hammer to fall, smashing the vial and unleashing the cyanide.
Aside from the complexity, this still sounds like basically what we all learned in our physics classes. So what is the problem with it? Well, primarily it comes down to measurement: must a conscious mind be involved for the cat to live or die? In the quantum world, a particle really can be in two states at once, because each particle exists as a probability function, where each quantum state in its superposition is assigned a certain probability of being exhibited when measured. Hopefully, that gives you some sense of why the cat in the box is less than adequate for describing one of the most complex nuances in all of physics.
The uncertainty principle is one of the most famous and again, misunderstood ideas in physics. It tells us that there is a fuzziness in nature, a fundamental limit to what we can know about the behavior of quantum particles and, therefore, the smallest scales of nature. Among its many counter-intuitive ideas, the quantum theory proposed that energy was not continuous but instead came in discrete packets quanta and that light could be described as both a wave and a stream of these quanta.
In fleshing out this radical worldview, Heisenberg discovered a problem in the way that the basic physical properties of a particle in a quantum system could be measured. In one of his regular letters to a colleague, he presented the inklings of an idea that has since become a fundamental part of the quantum description of the world. The uncertainty principle says that we cannot measure the position and the momentum of a particle with absolute precision.
The more accurately we know one of these values, the less accurately we know the other. The uncertainty principle is at the heart of many things that we observe but cannot explain using classical non-quantum physics. One way to think about the uncertainty principle is as an extension of how we see and measure things in the everyday world.