Wednesday, October 30, 2013

Dark Matter

Dark matter is now such a mainstream concept in physics that it would be easy to assume that this matter actually exists. The reality of dark matter, however, is another story, and in fact there are some theories that refute its existence altogether. One thing is confirmed: something in the universe skews gravity's observed effects on radiation and on the large-scale structure of objects such as galaxies.

Why Dark Matter?

These effects, calculated using Newton's and Einstein's gravity formulas, do not match observations, and that discrepancy has been measured very accurately. Physicists have calculated exactly how much extra matter would be needed to bring the observations back into line with the theoretical predictions. These results are shown in the top pie chart below.

A Few Notes On What Matter and Mass Are

The top pie chart represents the total matter and energy content of the current universe. Einstein famously described how matter and energy are equivalent and interchangeable (E = mc²). This is known as mass-energy equivalence. I am using matter and mass interchangeably in this article but I should remind you that I am not being quite accurate. Mass in physics is a much better defined term than matter is. Matter, described as anything having both mass and volume, is not as useful. For example, the Pauli exclusion principle explains why atoms of matter cannot overlap each other and therefore require space, but this becomes less relevant when we study the ultra-dense matter inside white dwarfs and neutron stars. When we study particles approaching the speed of light, mass also becomes tricky. We must deal with two kinds of mass - rest mass and relativistic mass. This is why in modern physics both mass and matter are better treated as energy-momentum tensors. These tensors describe not only matter at any point in space-time, but also any radiation and force fields present (with the exception of gravity).

The Universe's Matter and Energy Content Evolved

As far as anyone knows, the universe is an isolated system. This means that, according to the first law of thermodynamics, its total mass-energy has remained constant since its inception in the Big Bang. In the current universe, dark matter accounts for 26.8% of the total mass-energy of the universe, and over 84% of the universe's total matter. You will notice that most of the mass-energy of the universe is dark energy. Dark energy is even stranger than dark matter. It is a hypothetical form of energy that accounts for the observed acceleration of the expansion of the universe, and it makes up more than 68% of the total mass-energy of the current universe. Dark energy, in itself, acts as a homogeneous negative pressure throughout the universe, and will be explored in detail in a future article. For now, let's take a micro-lesson in dark energy:
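Before the micro-lesson, here is a quick back-of-the-envelope check of the numbers just quoted (a sketch of my own in Python; the baryonic share is simply whatever is left over after dark energy and dark matter):

```python
# Mass-energy budget of the current universe (Planck 2013 values).
dark_energy = 0.683   # fraction of total mass-energy
dark_matter = 0.268
baryonic = 1.0 - dark_energy - dark_matter  # ~0.049

# Dark matter's share of all *matter* (dark + baryonic):
dm_share_of_matter = dark_matter / (dark_matter + baryonic)
print(f"Baryonic matter: {baryonic:.1%} of total mass-energy")
print(f"Dark matter: {dm_share_of_matter:.0%} of all matter")  # ~85%, i.e. "over 84%"
```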

The lower pie chart describes the mass-energy makeup of the universe when it was very young, about 380,000 years old. It looks very different! There is negligible dark energy. We don't see any because the universe's expansion rate did not begin to accelerate until about 5 billion years ago. Before that, the universe's expansion rate was actually decelerating because of the attraction of dark matter and baryonic matter to itself and to each other. Baryonic matter is ordinary atomic matter made up of protons and neutrons (baryons). That gravitational attraction is still there in our current universe but it is now overwhelmed by the increasing influence of dark energy.

The simplest explanation for dark energy is that it is the cosmological constant. This means that it is fundamental to the nature of a vacuum, which is what outer space essentially is. A vacuum has intrinsic energy, sometimes called vacuum energy. Physicists know this energy exists because there are several lines of observational evidence for it, for example, the Casimir effect. This energy has negative pressure. Here, things might seem a little counter-intuitive because a vacuum acts differently than a volume of air, for example. If we take a sealed container full of air molecules at a specific temperature and allow that container to expand, the temperature of the air drops, according to the ideal gas law, because the total energy (for an ideal gas it is all treated as kinetic energy) of the molecules, which must remain constant, is now distributed over a larger volume. The average kinetic energy (which is the temperature) of the air drops. This is why rising (expanding) warm air cools in the upper atmosphere. In a similar way, what started out as gamma rays from the Big Bang now strikes Earth over 13 billion years later as much lower energy microwaves. This process is technically called adiabatic expansion. A vacuum is different. It has an intrinsic fixed amount of energy (potential energy in this case) that depends only on its volume, so that a larger vacuum has more energy and a smaller vacuum has less energy. As the vacuum of the universe expands, the amount of vacuum energy increases, increasing the universe's negative pressure, and accelerating its expansion rate further. How this expansion does not break the first law of thermodynamics (remember that the total mass-energy of the universe should not change) is a mystery to be solved in the coming article.
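As a side note of my own (not part of the original argument): that cooling of the Big Bang's radiation follows a very simple scaling law, T = T(decoupling) / (1 + z), where z is the redshift. A quick sketch with approximate textbook values:

```python
# How expansion cooled the Big Bang's radiation (approximate values).
T_decoupling = 3000.0  # K, roughly the plasma temperature at photon decoupling
z = 1100               # approximate redshift of the last scattering surface

# Blackbody temperature scales as 1/(1+z) in an expanding universe.
T_today = T_decoupling / (1 + z)
print(f"CMB temperature today: {T_today:.2f} K")  # ~2.7 K, matching observation
```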

This is our earliest visible glimpse of the universe, made possible when photons and electrons decoupled from each other. By about 380,000 years old, the universe had expanded and cooled enough so that photons, previously trapped in opaque electron-dense plasma, were able to stream outward in all directions. These photons are still streaming. They are what we detect as the cosmic microwave background radiation.

Notice in the lower pie chart that the very young universe contained a significant photon component. Our current universe, in contrast, contains very little electromagnetic radiation or photons, most of it in the form of cosmic background radiation - the faint whisper left over from the roar of the Big Bang. The early universe was flooded with very high-energy gamma rays. A smaller photon contribution comes from stars as well, though radiation from stars today is much weaker than what once streamed from the first enormous and very brightly burning stars. As you can see in the pie chart, the very young universe actually contained more radiation than matter, so where did it all go? Radiation transformed into matter through a process called pair production. For example, two very high-energy photons (gamma rays, γ) can collide and convert into a positron and an electron, particles of matter. This is how quarks, the particles that make up protons and neutrons, formed as well. The reaction is actually reversible. It favours matter production in very high-energy environments and photon production at lower energy. This simple reaction really shows off how matter and energy are intimately related to each other. Very early on, the universe's energy was high enough that photons could transform into the rest mass of various particles; as the universe cooled below that threshold, annihilation back into photons won out.

γ + γ ↔ e⁻ + e⁺

According to this formula, matter and antimatter (the positron, e⁺, is an example of antimatter; there are antiquarks as well) should have been produced in equal amounts, but we know that the current universe is dominated by matter. Particles of matter and their antimatter twins immediately annihilate upon contact with each other, so why is there any matter left at all? A slightly imperfect symmetry in the weak force, called CP violation, accounts for it, luckily for us. How that works is the subject of another article called Antimatter.
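Before moving on, here is a quick sanity check of the reaction's energetics (my own worked example, not from the original article): to create an electron-positron pair, the colliding photons must together supply at least the pair's rest-mass energy.

```python
# Minimum photon energy for e-/e+ pair production (head-on, equal-energy photons).
m_e_c2_keV = 511.0  # electron rest-mass energy in keV

# Each photon must carry at least the rest-mass energy of one particle.
threshold_per_photon = m_e_c2_keV
total_rest_mass = 2 * m_e_c2_keV
print(f"Each gamma ray needs >= {threshold_per_photon:.0f} keV")
print(f"Total rest mass created: {total_rest_mass:.0f} keV (~1.02 MeV)")
```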

What Dark Matter Does

In the 1930's, physicists discovered that the mass of galaxies calculated from their gravitational effects was far greater than the mass calculated from all the visible matter they contained, things such as stars, gas and dust. They coined the term "dark matter" for this mysterious invisible mass. Since then, more evidence pointing to dark matter has come from cosmology. For example, galaxies rotate much faster than they should, based on their observed luminous material.

This is an interesting problem and you might be surprised that it took so long to come to light. After all, the orbital velocities of planets around our Sun, for example, were worked out in precise detail centuries ago by Johannes Kepler and later incorporated into Newton's laws of classical mechanics. According to these laws, the orbital velocity of a body in a system with mass concentrated at the center decreases with distance from that center, shown by the blue dotted line, A, in the graph below. A galaxy, though much larger and more massive, should work just the same way, and yet it doesn't. Stars far from the galactic center orbit with the same velocity as those very close to the center of the galaxy, as shown below by the red line, B.
(Graph: PhilHibbs, Wikipedia)
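To see why line A falls off, here is a minimal sketch of the Keplerian prediction, v = √(GM/r), for mass concentrated at the center. The galaxy mass and the ~220 km/s observed figure are illustrative round numbers of my own, not fitted data:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
M = 1.0e41     # kg, illustrative central mass concentration of a galaxy

def keplerian_velocity(r_m):
    """Orbital velocity when mass is concentrated at the center: v = sqrt(GM/r)."""
    return math.sqrt(G * M / r_m)

kpc = 3.086e19  # meters per kiloparsec
for r_kpc in (2, 5, 10, 20, 40):
    v = keplerian_velocity(r_kpc * kpc) / 1000  # km/s
    print(f"r = {r_kpc:>2} kpc: Keplerian v = {v:4.0f} km/s (observed: roughly flat, ~220 km/s)")
```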

This leaves two options: First, there is far more mass than what is visible in the galaxy and this mass must not only envelop the galaxy's visible disc but extend far beyond its edge as well in a spherical halo in order to account for the almost flat velocity line. This allows rotational velocities to fall off far out into the halo, agreeing with Newtonian mechanics. Physicists have been able to map the dark matter halo that surrounds the Milky Way by measuring how dark matter alters the paths of smaller galaxies and star clusters that orbit our galaxy. It looks like a giant squashed beach ball, shown in the brief video below.



The second option is that Newtonian laws, though perfectly adequate for solar-size systems and smaller, do not describe gravitational behaviours of very massive large-scale objects. General relativity, the modern theory of gravitation, incorporates special relativity into Newton's law of universal gravitation, and gives physicists a new four-dimensional metric called space-time. Though this theory revolutionized the concept of gravity because it handles the passage of time in space, the motion of bodies in free fall and the propagation of light, it cannot describe this behaviour either.

A second line of evidence for dark matter comes from the phenomenon of gravitational lensing. Einstein's theory of general relativity predicts that light from a source is bent when it passes through a strong gravitational field. The figure below, created by NASA, shows how light from a galaxy directly behind a massive object (the gold sphere) appears as two twin images when seen from Earth.


The orange arrows show two apparent locations for the galaxy. The white arrows show how light is bent from the actual position of the galaxy, around the massive object.
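General relativity makes this bending quantitative: light passing a mass M at closest-approach distance b is deflected by an angle α = 4GM/(c²b). As a worked example of my own, the Sun reproduces the famous 1.75 arcseconds:

```python
import math

G = 6.674e-11     # m^3 kg^-1 s^-2
c = 2.998e8       # m/s
M_sun = 1.989e30  # kg
R_sun = 6.957e8   # m, light grazing the solar limb

# GR deflection angle for light at impact parameter b: alpha = 4GM / (c^2 b)
alpha_rad = 4 * G * M_sun / (c**2 * R_sun)
alpha_arcsec = math.degrees(alpha_rad) * 3600
print(f"Deflection at the Sun's limb: {alpha_arcsec:.2f} arcseconds")  # ~1.75"
```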

This effect was worked out theoretically in the 1930's but it was not confirmed by observation until 1979, when a double image of a distant quasar was discovered. The intervening mass of a large elliptical galaxy between Earth and the quasar bends the light coming from the quasar into two images, shown as A and B in the image below right.

(Images: Matthias Langer; AIR-WKDo; Aylin Esen and Ander Hosgar; Wikipedia)
The gravitational lensing argument for dark matter is not that lensing occurs but that its effect is far more pronounced than it should be, as if there is considerable additional "dark" mass in that galaxy.

A third line of evidence for dark matter comes from the cosmic microwave background (CMB). Below is an all-sky map of the CMB created from WMAP data.


CMB radiation, discovered in 1964, has a blackbody spectrum. When the universe was about 1 second old, it was a near perfect blackbody in thermal equilibrium, with a temperature of about 10¹⁰ K. It had a near perfect ability to emit energy through radiation. In 1992, physicists discovered that it is not the perfect blackbody you would expect, expanding equally in all directions from an initial point-like origin. The spectrum instead contains fluctuations called anisotropies. These anisotropies have been studied in increasing detail since then, and they provide a great deal of information about the composition and evolution of the universe. In general, the anisotropies match what you would expect if tiny thermal variations, coming from quantum fluctuations in a very tiny space, are blown up like a balloon to universe size. The radiation we see is called the surface of last scattering. This is a spherical surface that represents the locations in space where the decoupling of photons from electrons occurred. The anisotropy of the CMB comes in two types - primary anisotropy, which comes from the last scattering surface and before, and secondary anisotropy, which comes from effects after last scattering. These effects come from interactions of the photons with hot gases and gravitational potentials, for example. Secondary anisotropy gives physicists tremendous information about the early evolution of the universe. There is evidence here of the dark ages and of reionization caused by intense stellar wind from the very first stars, for example.
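As an aside of my own, Wien's displacement law, λ(peak) = b/T, shows why a blackbody at 10¹⁰ K glows in gamma rays while today's 2.7 K blackbody peaks in the microwave band:

```python
# Wien's displacement law: peak wavelength of a blackbody at temperature T.
b_wien = 2.898e-3  # m*K, Wien's displacement constant

for label, T in [("Early universe (1 s old)", 1e10), ("CMB today", 2.725)]:
    peak_m = b_wien / T
    print(f"{label}: T = {T:g} K, peak wavelength = {peak_m:.3e} m")
# 1e10 K peaks near 3e-13 m (gamma rays); 2.725 K peaks near 1.1e-3 m (microwaves).
```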

The structure of the anisotropies comes from two effects - acoustic oscillations and diffusion damping. Diffusion damping acted to reduce anisotropies as photons, still significantly scattered by other particles in this hot dense plasma, travelled from hot regions of space into cooler regions, dragging protons and electrons along with them. Its effect is governed by the average diffusion length of photons, which can be calculated accurately. It's the acoustic oscillations that are of special interest to us here. The plasma of the very early universe was extremely dense. Acoustic oscillations arose from the competition between baryons and photons within it. Photons exerted a pressure that tended to erase anisotropies, while baryons (matter) were gravitationally attracted to each other and so tended to collapse into denser patches, increasing anisotropies. These two counteracting forces created a spherical oscillation in the density of the plasma, working exactly the same way as sound waves moving through air do, except that this "sound" wave was made of photons and baryons whereas a sound wave in air is made of air molecules. The oscillations created the CMB's characteristic spectrum peaks, shown in the power spectrum graph below.

The peaks themselves are resonant frequencies of the oscillations of the plasma in the early universe. A lot of information is contained in these peaks. The angular scale of the first one, for example, gives us the curvature of the universe. The ratio of heights between the first peak and the second peak gives us the baryonic matter density of the universe. Below is a graph of this power spectrum (fluctuations in the temperature spectrum of the CMB) in terms of angular scale. Angular scale (expressed through a multipole expansion in multipole moments) comes from a mathematical function that depends on angles. It gives you information about a field at distant points in relation to a single point source, and it lets you decompose the temperature pattern across the whole sky into fluctuations of different angular sizes. You get the data by plotting the spectrum at different angular scales on the sky using ground and balloon data.
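A rough rule of thumb (an approximation of my own adding, not an exact relation): a multipole moment l corresponds to an angular scale of about 180°/l, so the first peak near l ≈ 220 appears at roughly one degree on the sky. The peak positions below are approximate:

```python
# Approximate conversion between multipole moment l and angular scale on the sky.
def angular_scale_deg(l):
    """Rough rule of thumb: theta ~ 180 degrees / l."""
    return 180.0 / l

for l in (220, 540, 800):  # approximate positions of the first three peaks
    print(f"l = {l}: angular scale ~ {angular_scale_deg(l):.2f} degrees")
```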

This data was gathered from several instruments: WMAP, Acbar, Boomerang, CBI and VSA. The solid line is a theoretical model.

The specific evidence for dark matter comes from the third and later peaks. Describing how this works requires a bit more background information.

Consider the oscillation we talked about. We will put dark matter into that scenario. Gravity pulled both dark matter and baryonic matter into the denser center of the oscillation. This oscillation is quantum-originated, so it was once quantum-sized, but it grew thanks to the rapid expansion of the universe. As it expanded, dark matter and baryonic matter increasingly collapsed into it. The current pattern of galaxy clusters in the universe is thought to be the leftover signature of not just one oscillation or "sound" ripple, but many overlapping ripples, like waves emanating from an object dropped in a pond, except in three dimensions rather than two.

Only baryonic matter was pushed back by photon pressure. (Nonbaryonic) dark matter doesn't interact with electromagnetic radiation so it continued to fall inward and stay inside each oscillation. This inward-outward baryonic dance gives each fluctuation its acoustical ring. The frequency of each oscillation depends on each fluctuation's size. It gives a temperature spectrum fluctuation because the baryonic matter heats up when it falls in and cools off when it is pushed back out, a spectrum signature which the dark matter doesn't have. To describe how this translates into spectrum peak data, pretend for a moment that the universe contained only photons. After a perturbation reached its maximum compression, they would flow out, redshifting along with the expanding universe as they went. This means that the gravitational potential would decay away (with the redshift), allowing the temperature perturbation to be much higher than what it is. The third peak would therefore be much higher. Now consider a universe with only baryonic matter and no photons. (Nonrelativistic) matter doesn't redshift, so the gravitational potential doesn't decay away, and the fluctuations and peaks would be much lower than they are. Mass, in other words, reduces the power spectrum peak amplitude.

This peak data gives a measurement of the total nonrelativistic matter of the universe. These peaks in the power spectrum are much lower than they should be based on baryonic matter alone, translating into a mass that must be non-baryonic and must be about five times greater than the baryonic mass of the universe.

This data also offers a picture of the transition from a radiation-dominated universe to a matter-dominated universe, when photons no longer coupled with baryonic matter and they streamed away, relieving the pressure in the system. After decoupling, the only force acting on the baryons was gravity. Baryons, along with dark matter remaining at the center of each oscillation, formed an over-density of matter at both the original anisotropy site and in a spherical shell at a fixed radius away from it, sometimes called the sound horizon.

The baryon-photon dance is now frozen into the CMB, and the signature of dark matter can be seen indirectly in this map, where small distortions of the CMB reflect regions of dense matter where photons were gravitationally lensed along their long journey since they decoupled billions of years ago. These distortions can be used to map the underlying distribution of dark matter in space. The evidence for dark matter is that these frozen fluctuations point to about five times more matter than baryonic matter alone can account for. To understand why this extra mass can't be hidden baryonic matter such as dark gas, black holes and faint planets, we need to look at how baryonic matter was created in the universe. The amount of baryonic matter in the universe is tightly restricted by the nature of Big Bang nucleosynthesis. This process of nucleosynthesis is covered in detail in the article How Atoms Are Made, but let's briefly review it here.

The hot plasma of the young universe contained, along with photons, the building block particles of atoms such as protons, neutrons and electrons as well as other (unstable) particles. As the universe expanded and cooled, high-energy photons were able to decouple and stream away in all directions, giving protons and neutrons a chance to stick together and create deuterium nuclei. Eventually, larger nuclei could form, such as helium-3, helium-4 and lithium. This is called Big Bang nucleosynthesis. This process slowed down as the universe continued to expand because the density of the plasma decreased, offering fewer collision opportunities. The neutron is unstable by itself, with a mean lifetime of about 15 minutes. Within a few lifetimes, essentially all free neutrons were gone and nucleosynthesis came to a stop altogether. No elements larger than beryllium were formed. This brief window puts a strict maximum on how much baryonic matter could have formed in the universe. All larger atoms subsequently created inside stars and in supernovae were created from this limited supply of smaller atoms. Gravitational evidence for additional matter means that it cannot be baryonic in nature.
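That 15-minute figure is a mean lifetime, so free neutrons didn't all vanish at one sharp moment; they decayed away exponentially. A small sketch of my own, using the measured lifetime of roughly 880 seconds:

```python
import math

tau = 880.0  # s, mean lifetime of a free neutron (~15 minutes)

def fraction_remaining(t_seconds):
    """Exponential decay: N(t)/N0 = exp(-t/tau)."""
    return math.exp(-t_seconds / tau)

for minutes in (5, 15, 30, 60):
    f = fraction_remaining(minutes * 60)
    print(f"After {minutes:>2} min: {f:.1%} of free neutrons remain")
```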

What Dark Matter Is: The Hunt For the Dark Matter Particle
 
The obvious first place physicists looked was for ordinary matter that is not easily detectable. It must not be luminous and therefore cannot be observed through a telescope. There are several sources of what is called baryonic dark matter (dark matter made of atoms) out there: non-luminous gas, black holes, neutron stars, white dwarfs, brown dwarfs, very faint stars and planets. These objects are collectively called massive compact halo objects (MACHOS). You might think that ultra-dense black holes alone might be enough to account for the effects of dark matter. There are two main reasons why dark matter can't be MACHOS or any other baryonic matter.

First, MACHOS account for only a very small fraction of baryonic matter, which itself is highly restricted by the calculations of Big Bang nucleosynthesis. As we saw earlier, those calculations put a strict upper limit on the universe's total baryonic mass. Therefore, MACHOS cannot contribute anywhere near the massive amount of dark matter required. Second, the analysis of the tiny irregularities in the cosmic microwave background described above shows that around 85% of the total matter in the universe does not interact with ordinary matter or with photons. It can't be baryonic, in other words.

Most physicists think dark matter must be some kind of nonbaryonic particle with mass that is not easily detectable, which means it must interact only very weakly with electromagnetic radiation, if at all. There is no verified particle that matches this description except the neutrino. Like photons, neutrinos decoupled from the plasma of the early universe and began to stream freely in all directions. Physicists are looking for a comparable cosmic neutrino background, a much more difficult job since neutrinos interact only with the weak force and gravity. The effort, though, might reap huge dividends because neutrinos decoupled when the universe was just two seconds old, offering a far earlier picture of the universe than photons (at 380,000 years) can. What will make these neutrinos even more difficult to detect is that they would be very low energy, unlike the more easily detectable high-energy neutrinos streaming from the Sun and from supernovae. Cosmic background neutrinos should be around 1.95 K, whereas the photon cosmic background is about 2.73 K (absolute zero is 0 K). You might think they should be the hotter particles since they decoupled from a much more energetic universe than photons did. Photons, neutrinos, electrons and positrons all existed in thermal equilibrium with each other, even after neutrinos decoupled. What made photons warmer is the electron-positron annihilation that took place afterward (discussed earlier in this article). These annihilations, happening before photons decoupled, created high-energy gamma rays and transferred energy to the cooler photons in the plasma. This difference in energy has remained "frozen" into the two backgrounds ever since.
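The size of that frozen-in temperature gap follows from a standard cosmology result (not derived in this article): the neutrino background should sit at (4/11)^(1/3) of the photon background temperature.

```python
# Relic neutrino background temperature from the photon background.
# Standard cosmology result: T_nu = (4/11)^(1/3) * T_gamma, because
# electron-positron annihilation heated the photons after neutrinos decoupled.
T_gamma = 2.725  # K, measured CMB temperature

T_nu = (4 / 11) ** (1 / 3) * T_gamma
print(f"Predicted neutrino background temperature: {T_nu:.2f} K")  # ~1.95 K
```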

Neutrinos, like photons, still exist in the present universe. Like photons, neutrinos are stable particles - they didn't decay away - and, unlike photons, they must have at least some mass, a requirement of their flavour oscillation. There should be a large abundance of them in the universe. They hardly interact with baryonic matter and do not interact with photons. All this makes them good dark matter candidates. However, their current contribution (like that of photons) to the mass-energy of the universe is negligible. Except for neutrinos produced in stars, supernovae and the like, the vast majority possess very little energy. Physicists can also put a strict upper limit on neutrino mass, which means they make up less than 1% of the current mass of the universe. Such a low mass-energy contribution places them out of contention for dark matter. Another problem with neutrinos is that they travel at very near the speed of light. Their near-light speed means that they tend to erase all but the largest-scale dark matter fluctuations observed in the universe, rather than contribute to the pockets of denser dark matter where galaxies and galaxy clusters form.

The remaining dark matter candidate particle must be a hypothetical one - an axion or a supersymmetric particle is possible. Supersymmetric particles are explored in the article Supersymmetry. These theoretical particles solve more than one stubborn problem in physics, making them popular contenders for dark matter as well, because the lightest ones should be stable, so they would persist in the universe today. The axion is a hypothetical particle that was introduced to solve something called the strong CP problem in physics. As a bonus, like the lightest supersymmetric particles, it is an attractive particle candidate for dark matter. It should have mass and it should be stable. This is how the axion arises: I mentioned CP (charge parity) violation in the weak force earlier. The problem is that the weak interaction should feed into the strong interaction (the force that holds nuclei together) according to quantum chromodynamics theory. This should create a fairly large strong CP violation, but no violation at all has been observed. A solution to the problem is to introduce something called the Peccei-Quinn mechanism into the mathematics. This mechanism introduces a new global symmetry to the Standard Model, which is spontaneously broken. This symmetry breaking introduces a new boson particle (the axion) that mathematically fills the role of the large strong CP violating term. If you have recently read Gauge Theory or Supersymmetry, you will certainly notice this technique has a familiar ring to it. Doing so relaxes the CP violation parameter to zero, bringing it into agreement with observation. Something called non-trivial QCD vacuum effects (which means that quarks, the building block particles of protons and neutrons, play an important role in shaping the structure of the quantum vacuum) in the mathematics makes the symmetry just imperfect enough to impart a mass on the axion, and this is where its potential as a dark matter particle comes in.

Several experiments since the 1980's have been designed to detect the axion cosmologically (in space). These experiments are trying to find what is called the Primakoff effect. According to theory, a strong electromagnetic field should be able to convert axions into photons and vice versa. The Sun's core, for example, should produce lots of axions as X-ray photons scatter off electrons and protons inside powerful electric fields. The CAST experiment is designed to detect these solar axions by converting them back into X-rays using a strong magnetic field. It came online in 2003 but, as of 2006, it had not found any evidence for axions. Built in 1983, the Axion Dark Matter Experiment, utilizing the same general concept, likewise has not detected any axions. It is currently undergoing an upgrade to increase its sensitivity. In short, the axion has not been definitively ruled out quite yet but it seems to be on thin ice.

Two other hypothetical candidates come from supersymmetry - the lightest neutralino and the sneutrino, two particles that should have mass and should be stable. As we saw in the article Supersymmetry, neither of these particles has been detected within their expected mass range inside supercolliders, putting them, too, on thin ice. However, the LHC is currently being upgraded to achieve enough energy to either prove or disprove their existence.

Hot, Warm and Cold Dark Matter

You may have seen these dark matter classifications before. They are losing relevance in current physics. All of the nonbaryonic dark matter particle candidates can be classified as either cold, warm or hot dark matter. Hot dark matter consists of particles that were moving close to light speed, such as neutrinos, when the clumps of matter that would form galaxies and galaxy clusters began to form. Cold dark matter consists of particles that were moving much slower than light speed at the time of galaxy formation. Warm dark matter is made of particles with intermediate velocities. Hot dark matter now seems unlikely because any clumps that were galaxy size and smaller would have been quickly dispersed by these whizzing dark matter particles. As mentioned earlier, neutrinos were and are in abundance in the universe, and they did and do have these dispersal effects, but they have too little mass-energy to contribute to dark matter. If they had enough mass-energy to contribute to dark matter, only clouds with the mass of thousands of galaxies would have stood a chance. This would have significantly delayed the formation of the galaxies we see today or perhaps even prevented them altogether. Cold dark matter particles, on the other hand, could form galaxy-sized and smaller clouds, allowing galaxies to form first, followed by galaxy clusters as galaxies later merge. Chandra observations support this order of galaxy cluster formation, rather than the fragmentation that would have had to occur with hot dark matter, suggesting that cold dark matter is the only realistic scenario. Cold dark matter candidates include MACHOS as well as hypothetical particles such as axions, neutralinos and sneutrinos, all of which have mass and travel significantly below light speed.

What If Dark Matter Isn't Matter at All?

No particle clearly stands out as a dark matter candidate, and the window of possibility for detecting one is closing as experiments become more and more sensitive and powerful. And yet something either interacts with gravity or skews its effects on very large-scale structures. It seems increasingly reasonable to consider that gravity itself may hold the answers. The article Gravity compares Newtonian gravity with Einstein's theory of general relativity if you would like to review them first. Many physicists are reluctant to consider this option because general relativity works so beautifully, aligning observation with theory for almost every phenomenon in physics, except the ones described here.

We would have to consider current theories of gravity as incomplete and, despite the utility of general relativity, there is much food for thought to suggest that it isn't the whole story. For example, gravity does not fit nicely next to the other fundamental forces because it is many orders of magnitude weaker than they are, and it does not fit into the Standard Model at all - it has no place in quantum mechanics. There is no gauge particle or gauge theory that seems to work for it. Einstein's theory of general relativity describes gravity extremely well as long as physicists are dealing with scales larger than an atom - and, possibly, smaller than a galaxy.

The first attempt to modify gravity in order to fit galactic rotational velocities was Mordehai Milgrom's Modified Newtonian Dynamics, or MOND, in 1983. This (non-relativistic) model creates a stronger gravitational field when gravitational acceleration is low, such as near the edge of a galaxy, but it does not explain gravitational lensing, a phenomenon explained by general relativity. Since then, several attempts have been made to bring general relativity into MOND, such as tensor-vector-scalar gravity (TeVeS) and scalar-tensor-vector gravity (MOG). If you are unfamiliar with scalars, vectors and tensors, they are explored in detail in the Gauge Theory article. It makes sense to couch MOND in some kind of mathematical metric because general relativity describes gravity geometrically as a curvature in a space-time metric. Both models introduce modifications to gravity that lead to extra degrees of freedom that play the role of dark matter.
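To give a feel for MOND's key prediction: at accelerations well below Milgrom's constant a0 (about 1.2 × 10⁻¹⁰ m/s²), the effective acceleration becomes √(g·a0), which makes rotation curves flatten at v = (GMa0)^(1/4). A minimal sketch of my own, with an illustrative galaxy mass:

```python
G = 6.674e-11  # m^3 kg^-1 s^-2
a0 = 1.2e-10   # m/s^2, Milgrom's acceleration constant

def mond_flat_velocity(M_kg):
    """Deep-MOND prediction: the rotation curve flattens at v = (G*M*a0)^(1/4)."""
    return (G * M_kg * a0) ** 0.25

M_baryonic = 1.0e41  # kg, illustrative baryonic mass of a spiral galaxy
v = mond_flat_velocity(M_baryonic) / 1000
print(f"Predicted flat rotation velocity: {v:.0f} km/s")  # ~170 km/s for this mass
```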

While Newton's gravitational laws can be written as equations in vector form, describing the gravitational field as a vector field, Einstein's general relativity describes gravity with a metric tensor field defined on a four-dimensional differentiable manifold called a Lorentzian manifold.

In these models, baryonic matter is treated as a perfect pressure-less fluid with an energy-momentum tensor. Perturbations introduced by additional vector and/or scalar fields can affect the energy-momentum tensor value, enhancing the impact of baryonic matter.

TeVeS introduces two extra fields to this manifold, a scalar field and a time-like vector field. At a background level, these fields modify the overall dynamics of gravity but they don't change the overall energy density of space-time. However, when a space-like perturbation is introduced, energy density is affected. Perturbations in the scalar field have negligible effects, but in the vector field they lead to growth. In other words, a vector field growing in space feeds into Einstein's equations and enhances both the gravitational potential and baryon density, an effect that mimics the effect of pressure-less dark matter. There are criticisms of this model, however. For example, a star operating under TeVeS gravity would be far too unstable to last billions of years without exploding. As well, some physicists challenge whether TeVeS can account for both galaxy rotation velocities and gravitational lensing. The latter problem can be solved by introducing a sterile neutrino with a mass of 11 eV. A hypothetical sterile neutrino possesses right-handed chirality. So far only left-handed neutrinos have been observed but right-handed ones are possible in the Standard Model, and all other fermions possess both kinds of chirality. The author of this referenced paper also discusses various ways that the validity of TeVeS could be tested.

MOG (modified gravity), developed by physicist John Moffat here in Canada, works differently. (I recommend checking this link on Wikipedia. His life story is an interesting one).

MOG introduces a massive vector field that acts as a repulsive gravitational force, cancelling part of the effect of gravity at smaller scales. In other words, it assumes the gravitational force is much stronger than what we measure, but at scales from galaxy size on down, its effects are increasingly diminished by the repulsive force introduced by the extra vector field. This model introduces three scalar fields - the mass and the strength of the introduced vector field are treated as scalar fields, along with Newton's gravitational constant, which is preserved as a scalar field - into the space-time tensor metric that describes the dynamics of general relativity. That is why it's also called scalar-tensor-vector gravity. It describes all the observable effects of dark matter, including the CMB spectrum peak data. At scales smaller than galaxies (smaller than a few million solar masses), most of the extra gravitational force is canceled out by the repulsive force, predicting gravitational effects that coincide with general relativity. An excellent physics blog called Spinor Info, operated by Viktor T. Toth, discusses MOG and provides links to several current scientific papers that deal with it. I find it curious that this theory seems to have gotten little attention in the media, as its predictive power is very good and there are few observational inconsistencies with it. One way in which MOG's validity will shortly be tested is by refining the accuracy of the CMB angular power spectrum data. If you scroll back to that graph earlier in this article, you will notice there is a fair amount of uncertainty (the lengths of the multicoloured vertical bars) in these measurements, particularly in the third and later peaks. There is significant room for refinement there. Recall that dark matter is invoked to explain the dampening of the baryonic oscillations. That explanation is based on the standard cosmological model, or Lambda cold dark matter theory. In MOG, these oscillations are explained by deepening the gravitational wells themselves. The current resolution of the data isn't good enough to prove which theory fits best, so neither is ruled out. As the resolution of galactic surveys improves, one or the other should win out.

Conclusion

The acid test for dark matter will be whether or not a cold dark matter candidate particle is detected. The blueprint for a new particle physics project called the International Linear Collider was published in June of this year (2013). One of its main goals will be to hunt for dark matter particles. When the Large Hadron Collider finishes its current upgrade, it too should be more than powerful enough to detect dark matter particles such as sneutrinos or neutralinos. If one of these particles is discovered, supersymmetry will also be right at the top of the headlines. If no candidate particles are found, research into gravity itself will likely be the new focus, with MOG being especially interesting to follow. The next few years in physics will be fascinating either way!

Sunday, October 13, 2013

Supersymmetry

I recommend reading through the previous article, Gauge Theory, first to better understand this theory.

The Standard Model (SM) of particle physics is a theoretical framework of how particles of matter and energy interact, making the universe what it is. It is tremendously successful. It predicted the existence of the W and Z bosons, the gluon, and the top and charm quarks before any of these particles were ever detected in colliders. It also predicted the decay of the Z boson before it was confirmed at the Large Electron-Positron Collider at CERN. The Standard Model is not complete, however, because it does not describe the fundamental force of gravity, nor does it describe dark energy or dark matter, both of which are strongly supported by observation. It also suffers from an inherent problem called the hierarchy problem, which has to do with its inability to describe gravity in the language of quantum mechanics. Supersymmetry was proposed as a solution to this troubling problem, and it caught on, offering not only a hierarchy solution but also some particle candidates for all the dark matter in the universe. However, supersymmetry stands in a no man's land where there is not a shred of evidence to support it as a reality.

The previous article, Gauge Theory, goes into some detail about how the Standard Model was developed and describes some of the fundamental theories (such as gauge symmetry) that go into it. In this article, we go beyond the Standard Model by exploring the theory of supersymmetry.

Supersymmetry effectively doubles the number of particles of the Standard Model, shown below. The question is: why would we want to complicate matters? First, we should review what fundamental particles are.

What's a Particle?


Most of us recognize the familiar electron and perhaps the quarks that make up protons and neutrons in the nucleus of an atom. However, there are five other kinds of lepton particles here, as well as force-carrying boson particles that we don't "see" in everyday life. The neutrinos are all around us, streaming through us and through Earth from the Sun, but they are notoriously difficult to detect. They hardly ever interact with other matter particles - quarks and electrons - and they don't interact with the electromagnetic force. We can detect and measure the forces carried by the gluon, photon and the W and Z bosons, but these bosons generally live as real particles only at very high energies. At everyday energies, physicists describe the force-carrying bosons as virtual particles. A virtual particle is a momentary excitation in a force field. These kinds of excitations are described by perturbation theory in quantum field theory. In the last article, Gauge Theory, I described what a quantum field theory is and how it works as one of several theories embedded in the Standard Model. As an example of how virtual particles work, take the W and Z bosons, particles with rest mass. These particles do not carry this mass as virtual particles, but they still conserve energy and momentum. The weak force interacts with all matter particles through the exchange of virtual W and Z bosons. However, "real" W and Z bosons, with mass (and energy), can be detected in the high-energy environment inside a collider. The photon is a bit unusual because it exists as both a real particle and a virtual particle at everyday energy. For example, photons stream as "real" particles through air and the vacuum of space in the form of electromagnetic radiation such as radio waves. However, photons also exist at the radio antenna as virtual particles. Here, they are responsible for near-field phenomena.

The longer a virtual particle exists, the more it resembles a real particle, but treating virtual particles and real particles as separate entities is misleading. "Real particles" in quantum field theory, which underlies the Standard Model, are described just the same as virtual particles - as excitations (with various degrees of freedom) in underlying quantum fields. This is the mathematically consistent way to think of all particles.

Some particles live only in a very high-energy environment such as a particle collider, a cosmic ray interaction, or during the first few microseconds of the universe. These particles are unstable at everyday energy. They decay, often almost instantly, into smaller mass stable particles, often through complex chains of intermediate particles and even virtual particles along the way. These kinds of particle soups are pay dirt to physicists.

If you look again at the particle diagram above, you'll notice each particle's mass in the upper left corner in MeV/c². This unit for mass (million electron volts over the speed of light squared) hints at mass-energy equivalence, recalling Einstein's famous E = mc². It also hints at how virtual and real particles can be equivalent mathematically. You will often see mass simply written as MeV; that is a convenient but technically incorrect shorthand. An eV (electron volt) is a unit of energy, an eV/c is a unit of momentum and an eV/c² is a unit of mass. Generally, the more massive a particle is, the more energy it contains, but even the energy of massless particles such as photons can differ greatly. For example, a photon of green light has about 2.3 eV of energy. X-ray photons have about 0.2 MeV (200,000 eV) of energy. A cosmic ray photon can have up to 1000 TeV (a whopping 10¹⁵ eV) of energy. Put a slightly different way, particle energy often translates into particle mass, and the proton inside an atom's nucleus is a striking example of this. Up and down quarks, the quarks in ordinary matter, have masses of just a few MeV/c² each. However, a proton, made of three quarks, has a whopping mass of 938 MeV/c². Why all this mass? A proton contains not only quarks but also gluons binding them together. As you can see, the gluon is a massless particle. However, gluons possess tremendous energy (it is quantum chromodynamics binding energy) and this energy really bumps up the proton's rest mass (again illustrating mass-energy equivalence).
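To make the unit juggling concrete, here is a small bookkeeping sketch of my own (the individual quark masses are approximate):

```python
# Mass-energy bookkeeping in particle physics units (approximate values).
eV_to_joule = 1.602e-19

green_photon_eV = 2.3
proton_mass_MeV = 938.3
up_quark_MeV, down_quark_MeV = 2.2, 4.7  # approximate quark masses

quark_total = 2 * up_quark_MeV + down_quark_MeV  # proton = up + up + down
binding_share = 1 - quark_total / proton_mass_MeV
print(f"Quarks alone: {quark_total:.1f} MeV of the proton's {proton_mass_MeV} MeV")
print(f"~{binding_share:.0%} of the proton's mass is QCD binding energy")
print(f"Green photon energy: {green_photon_eV * eV_to_joule:.2e} J")
```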

The other four quarks - charm, strange, top and bottom - are much more massive (some are on the order of a billion eV each) than the up and down quarks inside atoms. In the Standard Model, elementary particles are organized into three families called generations, which differ by quantum number (flavour in the case of quarks and neutrinos) and mass, but their interactions are identical (top and bottom quarks still act like quarks for example).

Charm and strange quarks represent the second generation of quarks, while top and bottom quarks represent the third generation. In a similar way, the muon is a second-generation lepton and the tau is a third-generation lepton. We don't see second and third generation particles at everyday energy, however, because as soon as these higher mass particles are created they spontaneously decay into stable first generation up and down quarks and electrons, and they do so far too quickly to interact and combine into any "higher generation" protons, neutrons or atoms. Even when the universe was very young there was far too much energy for any of these higher generation particles to settle into atoms.

Neutrinos are an interesting exception to the generation rule. All three generations of neutrinos, matter particles of negligible mass that travel at very nearly the speed of light, are stable, and they stream throughout the universe, spontaneously oscillating between all three flavours (generations).

I drew the graph below to get an idea of the range of SM particle masses. The differences in mass are so great I had to use a logarithmic scale. If we multiply mass by the speed of light squared, we get the energy equivalent of each mass. That is what we will mostly be looking at, as well as at energies around the TeV (10¹² eV) level, when we explore supersymmetry particles. As you can see, none of these Standard Model particles quite approaches that energy/mass. The massive W and Z bosons are mapped out; the photon and gluon, both massless, are not shown, but they do have momentum and energy, as I mentioned. Observed photon energy ranges from about 10⁻¹⁵ eV (ultralow frequency radio waves) to 10²¹ eV (cosmic rays) but in theory a photon's wavelength can approach the infinitesimally small Planck length, which corresponds to an almost infinite energy. Gluon energy (taken as a normalized density) inside a baryon is in the 10⁶ eV range.


No one knows why particles of matter come in three generations, and not one or two or four. The existence of fourth and higher generations of even more massive particles, though theoretically possible in the Standard Model, are considered to be very unlikely, based on experimental evidence from the LHC and the Tevatron colliders.

Do the force-carrying particles, the bosons, come in different generations? An interesting example is the Higgs boson. This particle comes into existence as the electroweak force breaks, giving rise to the Higgs field, which imparts mass on particles. The Higgs boson isn't a force-carrying boson like the photon, gluon or the W and Z bosons because the Higgs field, unlike the electromagnetic, strong and weak gauge fields, is a scalar field rather than a vector field. It doesn't transfer energy and it can't accelerate particles. However, just like the gauge bosons, the Higgs boson is a quantum excitation (or perturbation) in a quantum field. Is there one Higgs boson or are there three, one corresponding to each generation of leptons and quarks? No one knows.

A similar question has been asked of the W and Z bosons of the weak force. Much heavier theoretical X and Y bosons could carry the force of a grand unified theory (GUT). This fundamental force would consist of the electroweak and strong forces combined into one force that would operate only at extremely high energy. As the universe cooled, it broke into the strong force and the electroweak force. The EM force and the weak force at this point were still combined under yet another symmetry that would break when the universe cooled further. This symmetry-breaking is mentioned in the previous article about Gauge Theory. X and Y bosons would be much more massive cousins of the W and Z bosons of the weak force, but whether these hypothetical bosons, or the Higgs boson, could be considered multi-generation particles is an open question.

Why leptons and quarks specifically come in three generations has led researchers to look for some connection between these two kinds of matter particle. Several researchers have been searching for hybrid particles called leptoquarks. These particles, part of several theories such as the SU(5) GUT (a gauge symmetry theory mentioned in the previous Gauge Theory article), come in three generations corresponding to those of the quarks and leptons, and they may carry information between quarks and leptons, allowing quarks and leptons to interact. Such an SU(5) symmetry particle is predicted to be astoundingly heavy, as massive as an atom of lead, and it would not feel the strong force as a separate force. This more fundamental particle would reflect an even higher SU(5) symmetry. It would decay almost instantaneously, perhaps into a same-generation quark and lepton. Mass limits for these three generations of leptoquarks are not worked out and no potential leptoquarks have been detected.

The Standard Model diagram above does not include all known elementary particles. Each particle, both fermion (matter) and boson (force), in the Standard Model diagram has its own antiparticle, its own antimatter twin in other words, owing to another kind of symmetry in the Standard Model, called charge-parity or CP symmetry. We call it antimatter but anti-force particles are included too. W+ and W- bosons are antiparticles of each other, and the Z boson, being neutrally charged, is its own antiparticle, much like the photon. Gluons are more complicated because they have colour charge. There are eight gluons based on colour charge and they transform under a real representation. This is the mathematical way of saying gluons always exist in a kind of superimposed state, so you have to consider the gluons as an octet, and this octet is its own antiparticle.

Physicist Paul Dirac predicted the existence of the positron, the antiparticle of the electron, in the 1930's. Each particle's antiparticle twin has the same mass and lives at the same energy, so a positron is as stable as an electron and an anti-charm quark is just as massive and unstable as a charm quark, for example. The difference is that the charge is reversed, so a positron is positively charged and an anti-charm quark has an anti-colour charge. Before supersymmetry ever came along, Dirac doubled the number of particles in the universe by using symmetry.

Below, hydrogen and antihydrogen atoms are compared. The "mirror" is CP symmetry. Antihydrogen, the simplest possible antimatter atom, has been produced in the lab. It is expected to have many if not all the same physical properties as hydrogen, even the same colour of glow when excited anti-atoms return to ground state.


Why don't we see antimatter planets and stars then? Every particle and its antiparticle immediately annihilate each other upon contact. The two atoms above would immediately explode into a mixture of gamma photons, neutrinos, electrons and positrons, which is why antihydrogen is held in a magnetic vacuum trap. This isn't a perfect symmetry; if it were, all matter and antimatter likely would have been annihilated long ago, and the universe would be empty of all matter. Instead, thanks to a mechanism called CP violation (a slightly imperfect symmetry favouring matter), a small excess of matter survived in the universe. Antimatter particles are produced naturally in beta decay and in cosmic rays as well as in colliders. Some neutral particles, such as photons, are their own antiparticle, and some particles are made up of both matter and antimatter. These are mesons. Unlike three-quark hadrons such as the protons and neutrons in atomic nuclei, mesons are made up of two quarks, one of matter and one of antimatter - a quark and an antiquark held together by the strong force. There are many different kinds of mesons, depending on the type of quark and antiquark involved as well as their overall spin configuration. These quark pairs are all very unstable. Same-flavour pairs annihilate just as you would expect them to, but pairs of two different flavours decay a bit more slowly because they must decay via weak interactions, changing flavour first; their angular momentum (they spin around each other) keeps them around for a tiny moment before they are gone.

As we've seen, the universe is already quite well stocked with various stable and unstable particles of force, anti-force, matter and antimatter. Why complicate it further?

Why Supersymmetry?

Supersymmetry, if it exists in reality, doubles the number of particles in the Standard Model yet again. This operation is built on a symmetry between matter and force that would be apparent (unbroken) at very high energy. This unbroken supersymmetry space is called superspace. Superspace is impossible to visualize so there is no way around describing it mathematically. Hopefully what follows will give you a feel for this concept. Along with ordinary spatial dimensions, superspace also contains an equal number of dimensions that anticommute. A commutative operation is like putting on your socks - it doesn't matter what order you put them on, the end result is the same. A non-commutative operation is like putting on underwear and jeans: the order matters (hopefully). Anticommutativity lives just in mathematics. Here, swapping the order of the two operands negates the result altogether. As Wolfram MathWorld puts it, an operator * for which a * b = -b * a is said to be anticommutative. Lie algebra, which underlies gauge theory, is an example of an anticommutative algebra.
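Anticommutation is easy to check concretely. Here is an example of my own using the Pauli matrices from quantum mechanics (stand-ins for illustration; superspace itself uses anticommuting Grassmann coordinates):

```python
import numpy as np

# Two distinct Pauli matrices anticommute: a @ b == -(b @ a).
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)

print(np.allclose(sigma_x @ sigma_y, -(sigma_y @ sigma_x)))  # True: anticommutative
print(np.allclose(sigma_x @ sigma_y, sigma_y @ sigma_x))     # False: order matters
```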

Ordinary space dimensions correspond to bosonic degrees of freedom (can be swapped out) and anticommuting dimensions belong to fermionic degrees of freedom (can't be swapped out). You can get a hint of Pauli's exclusion principle here. You can describe superspace as a supermanifold that contains bosonic and fermionic coordinates, but these coordinates are not really sets of points but instead "dual points of view." When you transform the manifold in space-time you mix bosons and fermions. Okay, you can breathe again.

Each matter particle has a force superpartner and each force particle (boson - spin 0 or 1, or spin 2 in the case of the hypothetical graviton, not included below) has a matter (fermion - spin 1/2) superpartner.



Supersymmetry, or SUSY for short, was introduced in the 1960's and developed in large part in the 1970's and 1980's.

It offers, in return for the complication of extra particles, a possible explanation for dark matter, because some of its particles are candidates for this mysterious mass, which accounts for five times more matter than the ordinary matter physicists can account for. The luminous matter in the universe, stuff that can be detected such as gas, dust, planets and stars, doesn't explain the much higher than expected gravitational behaviour of the universe, such as the too-high orbital velocities of large-scale structures like galaxies. Particles such as neutrinos as well as non-radiating black holes, planets and brown dwarfs (all non-luminous) have been ruled out as insufficient in mass to account for it. The rest of the matter - undetectable by any currently known means - is called dark matter.

SUSY also offers an explanation for why the strengths of the four fundamental forces differ so wildly from each other. In particular, the weak force is 10 quadrillion times more powerful than gravity. Why? This is the hierarchy problem, and it is a pretty big deal, because by solving it SUSY offers a possible way to bridge the looming gap between quantum mechanics and general relativity, paving the way for an ultimate theory of all fundamental forces and particles - a theory of everything (TOE), a tall order to deliver! The Standard Model is based on three fundamental constants in nature - the gravitational constant, the speed of light and Planck's constant - to get fundamental units for length, time and mass. The problem comes in when using these constants to get a base unit for mass (the Planck mass). It is many magnitudes too huge, far more massive than any of the particles discovered so far. By introducing an extra symmetry to the Standard Model, this Planck-scale mass is cancelled out, bringing predicted masses for particles into line with what physicists measure in colliders. I will explore the hierarchy problem in more detail later on.

Third, supersymmetry offers help for something called gauge-coupling unification in the Standard Model. These three solutions will be explored in detail under "What Supersymmetry Does," later on.

In general, what SUSY does mathematically is that it reins in seemingly disparate forces and particles into a fairly neat and tidy higher symmetry, and symmetry, as we saw in Gauge Theory, seems to lie at the heart of particle physics.

How Supersymmetry Came To Be

Hironari Miyazawa first glimpsed the possibility of supersymmetry in 1966, when he tried unsuccessfully to find a relationship between baryons and mesons that involved an internal symmetry between the two groups of hadrons. Hadrons are quark-based composite particles. They come in baryon and meson form. Baryons, made up of three quarks, include the proton and the neutron. Mesons, which we explored a little earlier, are each made up of a quark and an antiquark, and unlike baryons, they are all extremely unstable.

In the 1970's, supersymmetry was rediscovered as a new kind of structure that unifies all the fields in quantum field theory with space-time. In doing so, it also establishes a relationship between the different quantum natures of fermions and bosons. Fermions are all particles with half-integer quantum spin (1/2, 3/2, and so on). These include electrons and other leptons, as well as all quarks and any composite particle that contains an odd number of these particles, such as baryons. These particles, usually part of matter, can only occupy one quantum state at a time thanks to the Pauli exclusion principle. Mathematically, they all obey the rules of Fermi-Dirac statistics. Bosons don't need to follow this rule. Any number of bosonic particles can occupy the same quantum state at the same time. These particles have whole-integer quantum spins and obey Bose-Einstein statistics. All the force carrier particles such as photons, W and Z bosons and gluons, as well as mesons (two half-integer spins add up to an overall quantum spin of 0 or 1), are bosons, but even fermions of matter may display bosonic behaviour. For example, some materials, when cooled below a critical temperature, transition into a state of superconductivity. Within these materials, electrons (that are not necessarily close together) form pairs called Cooper pairs, which act like bosons (two half-integer spins once again combine into an integer-spin "particle"). In a similar way, atomic nuclei of even mass number such as helium-4 can become (bosonic) superfluids. The mathematics of this rediscovered supersymmetry is based on a consistent Lie superalgebra structure, where the even elements of the algebra correspond to bosons and the odd elements correspond to fermions. This algebra already has "proof" from nature: it has been used successfully to model observed resonances inside atomic nuclei by relating bosonic and fermionic nuclear properties to each other.
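
The difference between the two statistics is easy to see numerically. Here is a minimal sketch in Python (the sample values of x are made up for illustration) of the mean number of particles occupying a single quantum state under each rule - the fermion value never exceeds 1, honouring Pauli, while the boson value grows without bound as the state's energy approaches the chemical potential:

import math

def mean_occupancy(x, kind):
    # x = (E - mu) / kT for a state of energy E at temperature T
    if kind == "fermion":
        return 1.0 / (math.exp(x) + 1.0)  # Fermi-Dirac: always less than 1
    return 1.0 / (math.exp(x) - 1.0)      # Bose-Einstein: blows up as x -> 0

for x in (3.0, 1.0, 0.1):
    print(x, mean_occupancy(x, "fermion"), mean_occupancy(x, "boson"))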

This type of mathematics also closely links supersymmetry with string theory (more on this later on). Though discovered outside of string theory, Lie superalgebra was incorporated into string theory in the 1970's, transforming it into superstring theory.

What Supersymmetry Is

Supersymmetry is a broken symmetry, at least at everyday energies. If it were an intact symmetry at "our" energy, particles and their superpartners would have the same mass and they would share the same interactions. An atom, for example, would contain both electrons and selectrons (both having the same charge but one being fermionic and the other bosonic). Physics would be unrecognizable. However, when the universe was very young and very energetic, somewhere around the TeV scale of energy, supersymmetry was intact and both standard and SUSY particles would have existed. As the universe cooled, supersymmetry broke and SUSY particles decayed, leaving behind just one stable "fossil" particle, the lightest of the four neutralinos. The mechanism responsible for supersymmetry-breaking is not entirely worked out. There are currently three main theories, called gravity-mediated, gauge-mediated and anomaly-mediated SUSY-breaking.

Most symmetries in physics, such as the gauge symmetries mentioned in Gauge Theory, are created by fields that transform under tensor representations of space-time. Supersymmetry, on the other hand, is created by a transformation of space and time under a spinor representation. Remember the anticommuting coordinates I mentioned earlier? These are supersymmetry coordinates in ordinary space-time, and they come in the form of spinors. A spinor is unlike the spatial vectors and tensors we explored in Gauge Theory. Here, a 360-degree rotation transforms the numeric coordinates of a spinor* into their negatives, so it takes a 720-degree rotation to get back to the original coordinates. We have encountered a spinor before. In fact, spinors are incorporated into quantum mechanics to describe the 1/2-spins of all fermions. The quantum spin (which is not a physical rotation in space but the intrinsic angular momentum) of the electron, for example, is a spinor in three dimensions. The spinor is essential to the spin-statistics theorem in quantum mechanics - Fermi-Dirac statistics and Bose-Einstein statistics - which says that bosonic fields commute and fermionic fields anticommute. This is the mathematical way of saying that two or more bosons can have the same quantum numbers but fermions can't. Combining the two fields gives you a single algebra, which, along with a mathematical procedure called Z2-grading to give even boson elements and odd fermion elements, is called a Lie superalgebra.
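
That strange 360-degree sign flip is easy to verify numerically. Below is a minimal sketch in Python (using numpy) that rotates a spin-up spinor about the z-axis with the standard quantum rotation operator; nothing here is specific to supersymmetry, it just makes the spinor's behaviour concrete:

import numpy as np

def rotate_spinor(theta):
    # Rotation of a spin-1/2 state about the z-axis: U = exp(-i*theta*sigma_z/2)
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

spin_up = np.array([1.0, 0.0], dtype=complex)
print(np.round(rotate_spinor(2 * np.pi) @ spin_up))  # [-1, 0]: negated after 360 degrees
print(np.round(rotate_spinor(4 * np.pi) @ spin_up))  # [ 1, 0]: restored after 720 degrees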

Lie algebras and Lie groups, introduced in the Gauge Theory article, describe gauge and other symmetries in the Standard Model, but they cannot introduce the kind of inversion that happens when fermions are swapped for bosons and vice versa. If the Lie algebra is graded, however, it becomes a superalgebra, and it can describe these kinds of swaps. When this algebra is incorporated into the Standard Model, the number of particles is doubled. The simplest possible model based on this superalgebra is the Minimal Supersymmetric Standard Model (MSSM).

What Supersymmetry Does

For every particle there is a corresponding superpartner particle of higher mass. These higher masses indicate that supersymmetry is not an exact symmetry: physicists knew even before they began searching for superparticles that the symmetry could not be exact, because if it were, superparticles and regular SM particles would all have equal masses. The mathematics of supersymmetry, which solves the hierarchy problem, also puts the mass range for superparticles at the TeV scale, well above even the top quark mass. For example, an electron has a superpartner selectron and a quark has a superpartner squark. Matter particles are given an "s" in front and force particles are given an "ino" at the end. Squarks and selectrons are bosons, whereas quarks and electrons are fermions. Supersymmetry is expected to be intact down to around the TeV (10^12 eV) scale of energy. This is about ten times the energy/mass of the top quark, which is believed to be about 173 GeV/c^2, and the Higgs boson at about 125 GeV. This is the energy at which supersymmetry solves the hierarchy problem. Both the Large Hadron Collider (LHC) and the Tevatron collider can achieve this energy. It is the energy at which physicists hope to find some of the lightest supersymmetric particles, such as neutralinos.

Supersymmetric particles are expected to be unstable with the exception of the lightest of four fermionic neutralino particles. The lightest neutralino is expected to be light enough to find in a high-energy collider.

Like neutrinos, these particles should interact only through the weak force and gravity, making them almost impossible to detect directly. However, when protons are smashed together (at the LHC) or protons are smashed with antiprotons (at the Tevatron collider), two other SUSY particles, squarks and gluinos, should be created at the energies produced by these collisions. These would decay before any possible signature is observable, but their decay chains should each leave behind one stable neutralino, which exits the detector unseen. Physicists know the total mass/energy before the collision, which is conserved, so when they add up the mass/energy of all the detected decay products, the total should come up just short - the missing amount carried away by the invisible neutralinos. The diagram below shows two possible decays - gluino decay and squark decay - each of which produces one lightest (N1) neutralino.


In this way, physicists looking for the neutralino hope to find its missing-energy signal. So far, even at close to 2 TeV of energy, no missing mass has been detected. This should make any SUSY physicist feel a bit jittery, but the search is still in its early phase.
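
To make the idea of a missing-energy signal concrete, here is a toy sketch in Python - the list of visible decay products and their momenta are invented for illustration. Momentum transverse to the beam must balance to zero, so any imbalance points to something that escaped the detector unseen:

import math

# Invented visible decay products: (px, py) transverse momenta in GeV
visible = [
    (120.0, 35.0),   # jet 1
    (-60.0, -80.0),  # jet 2
    (-20.0, 10.0),   # lepton
]

# Whatever balances the visible momenta must have left the detector unseen
px_miss = -sum(px for px, _ in visible)
py_miss = -sum(py for _, py in visible)
print("missing transverse momentum: %.1f GeV" % math.hypot(px_miss, py_miss))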

1) Dark Matter: Sneutrinos Versus Neutralinos

The boson superpartner of the neutrino is the sneutrino, so there should be electron sneutrinos, muon sneutrinos and tau sneutrinos. These are all spin-0 superpartners of their fermion neutrinos, and while they should be heavier, their masses should still be fairly low because the neutrino itself is thought to have a vanishingly small mass. A mixture of these three bosons is not a dark matter candidate because it would have been detected by now: sneutrinos interact via Z boson exchange, and this would have shown up in experiments. On the other hand, right-handed or sterile sneutrinos are possible. Having right-handed chirality, these particles would not interact with the Z boson as strongly and may exist undetected.

More commonly, another superparticle, the neutralino, is mentioned in the physics literature as a dark matter candidate. This particle is a fermion like the neutrino, but it is not a superpartner of it. The theory behind this particle is a little more involved. According to supersymmetry, each of the quantum fields for the photon, W and Z bosons, and the Higgs boson should include another quantum field representing a massive fermion - the photino (or bino), wino, zino (I know, they're terrible!) and Higgsinos, respectively. These were shown in the superparticle boson image earlier.

The bino and zino are neutral just like their SM particles. The winos are charged like the two W bosons. The Higgsinos are a weak isodoublet superpartner of the Higgs boson, with one part being charged and the other part being neutral.

The zino, bino and the neutral Higgsino all have the same quantum numbers (zero charge and quantum spin 1/2), so they can mix to form four mass eigenstates of the mass operator, called "neutralinos." Why four and not three? After electroweak symmetry breaking, the Higgsino becomes a pair of neutralinos and a single chargino.

A mass eigenstate is a free-particle solution to the wave equation of a particle or particles. An example of a mass eigenstate is an oscillating neutrino, which has electron, muon and tau components, each of which should have a slightly different mass; the mass eigenstates give you a narrow range of possible mass, bounded above by the heaviest tau component (0.5 eV, less than a millionth of an electron's mass) and below by the lightest electron component (which could be as low as zero mass). There is no way to get exact values for the individual masses of the electron, muon and tau neutrino, because in nature the neutrino exists as an indeterminate quantum superposition of these three flavours.

In a similar operation, to get the neutralino mass eigenstates through electroweak symmetry breaking, physicists must take linear superpositions (a mixture) of the bino, wino, zino and Higgsinos.

I think of it this way: You put the wino, bino, zino and the Higgsinos into an "electroweak-breaking blender" and turn it on. After "blending," during which the electromagnetic force and the weak force emerge, you get four massive neutral fermions (neutralinos) and two charged massive fermions called charginos (X+ and X-). One chargino comes from the charged component of the Higgsino and the other one comes from the charged wino. The four neutralinos are four different mass eigenstates. These mass eigenstates are most likely to be bino-like, wino-like and Higgsino-like, but the mathematics allows the mixing to vary. The bino (photino)-like mass eigenstate should be the lightest and the only stable neutralino.
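
For anyone who wants the blender in equations: it is just matrix diagonalization. Here is a minimal sketch in Python (numpy) with an invented, purely illustrative symmetric mass matrix - the real MSSM neutralino mass matrix is built from the model's parameters M1, M2, mu and tan(beta) - showing four gauge "ingredients" blending into four mass eigenstates:

import numpy as np

# Invented 4x4 symmetric mass matrix (GeV) in the basis (bino, wino, Higgsino1, Higgsino2)
M = np.array([
    [200.0,    0.0,  -30.0,   50.0],
    [  0.0,  400.0,   60.0,  -40.0],
    [-30.0,   60.0,    0.0, -500.0],
    [ 50.0,  -40.0, -500.0,    0.0],
])

eigenvalues, mixing = np.linalg.eigh(M)  # diagonalizing is the "blender"
masses = np.abs(eigenvalues)             # physical masses are the magnitudes
lightest = np.argmin(masses)
print(masses)                 # four neutralino mass eigenvalues
print(mixing[:, lightest])    # how the lightest one mixes the four ingredients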

This neutralino, called N1, is the particle of most interest to dark matter theorists because, being the lightest supersymmetric particle (LSP), it shouldn't be able to decay into anything else. In our present low-energy universe, it can be thought of as a stable fossil of supersymmetry.
All heavier (and unstable) supersymmetric particles should ultimately decay into the N1 as well. A neutralino should have a mass in the range of 1 TeV, and it should interact with other particles through the weak force, much like a neutrino does. In many models the LSP can be created in the very young, energetic universe, and it, along with the lightest stable products of supersymmetric decay, can leave behind about the right amount of mass to fit the gravitational effects of dark matter. An answer neatly packaged and wrapped in a bow.
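
That "about the right amount" can be checked with the standard back-of-envelope freeze-out estimate, sometimes called the WIMP miracle. A minimal sketch in Python, using the usual textbook numbers rather than anything specific to a particular SUSY model:

# Relic abundance of a thermally produced particle (standard freeze-out estimate):
#   Omega * h^2  ~  3e-27 cm^3/s / <sigma*v>
sigma_v = 3e-26        # cm^3/s: a typical weak-scale annihilation cross-section
print(3e-27 / sigma_v) # ~0.1, close to the measured dark matter value of ~0.12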

2) The Hierarchy Problem: A Possible Solution

The hierarchy problem in particle physics asks why the weak force is so much stronger than gravity - 10^32 times stronger. This is a technical problem that is challenging to understand, and it can be described in many different ways. In a nutshell, the hierarchy problem means that the real-life measurement of the power of the weak force doesn't agree with what the Standard Model calculation says - the calculation is way too big. In order to make it agree with experimental evidence, physicists must fine-tune the formula (make delicate quantum corrections to it), without a known mechanism to explain why they have to do it - not a great place to be in terms of good theory. It's a fairly gangly part in an otherwise elegant model. The Minimal Supersymmetric Standard Model (MSSM) was first proposed in 1981 to solve the hierarchy problem by stabilizing the weak scale of energy, and this means stabilizing the Higgs boson mass, which is unstable when quantum corrections are applied to it. The Higgs boson's superpartner, the Higgsino, a fermion, lends its stability to the Higgs boson and brings the weak force into line with experimental measurements.

One Way of Looking at the Hierarchy Problem

It starts with mass and the Planck units I mentioned earlier in this article. We've come across Planck units for length and time in past articles. Each of these units, described as the smallest possible meaningful units, is incredibly tiny. Planck length is 10^-35 m, about 10^20 times smaller than a proton, while Planck time is 10^-44 seconds. Beyond these limits current physics no longer makes sense. Planck mass, however, just doesn't fit with these other units. It is huge, about 10^-8 kg. That's about the mass of a flea egg, or some 10^22 times more massive than an electron. It is way out of whack with quantum-scale physics. The reason it's so big is that it is derived from Newtonian gravity and special relativity as well as quantum angular momentum: it takes the speed of light, the gravitational constant and the reduced Planck constant into its calculation. Planck mass is the mass of a Planck particle, a hypothetical black hole whose radius is the Planck length. Planck mass is thought to be an important number because at this mass, general relativity is as relevant to the physics of what's going on as quantum mechanics is - the two mutually incompatible theories try to meet up. We know that they must meet up somehow to explain what's happening to matter, space and time inside a black hole. Below Planck mass, only quantum mechanics describes the physics. Above Planck mass, only general relativity describes the physics.
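
All three Planck units really do fall straight out of those three constants. A quick sketch in Python:

import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
hbar = 1.055e-34  # reduced Planck constant, J*s

print(math.sqrt(hbar * G / c**3))  # Planck length, ~1.6e-35 m
print(math.sqrt(hbar * G / c**5))  # Planck time,   ~5.4e-44 s
print(math.sqrt(hbar * c / G))     # Planck mass,   ~2.2e-8 kg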

In the 1970's, physicists realized that this problem has something to do with the Higgs field, which gives mass to particles. This is the field, carried by the Higgs boson, that breaks the symmetry of the electroweak force. Unlike fermionic spinor fields and other bosonic gauge fields, the Higgs field is a scalar field, with the Higgs boson being a spin-0 scalar particle - the first scalar field discovered in nature. It differs from the vector and tensor fields of fermions and other bosons because its value is unchanged by transformations of space-time. The Higgs boson predicted by the Standard Model should have Planck-scale mass, and because it couples to all other particles of mass, they should have Planck-scale masses too. Obviously, we know that they don't. The mathematics of SUSY uses an extra symmetry that cancels all Planck-scale contributions and protects the Higgs field (and therefore mass) down to much lower, realistic energies. Mathematically this is called Higgs quadratic mass renormalization: the quadratic divergence in the scalar self-energy of the Higgs field is cancelled.
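
Schematically, the cancellation works because a fermion loop and its superpartner's boson loop contribute to the Higgs self-energy with opposite signs. A toy sketch in Python - the coupling and cutoff values are placeholders, not real Standard Model numbers:

import math

cutoff = 1.2e19   # GeV: a Planck-scale cutoff
g = 1.0           # placeholder coupling strength

# One-loop contributions to the Higgs mass-squared, schematically ~ g^2/(16*pi^2) * cutoff^2
fermion_loop = -(g**2 / (16 * math.pi**2)) * cutoff**2
boson_loop = +(g**2 / (16 * math.pi**2)) * cutoff**2

# With unbroken SUSY, superpartner couplings match exactly and the huge pieces cancel
print(fermion_loop + boson_loop)  # 0.0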

Another Way of Looking at The Hierarchy Problem

When the universe was very young and very energetic, the Higgs field had an average (vacuum expectation) value of zero. At this point, the W and Z bosons and the photon were all identical zero-mass particles. The field's energy jostled all over the place until the universe cooled to a certain temperature. According to quantum mechanics, even a perfect vacuum in space has a small but positive energy. This vacuum energy was very high when the universe was a tiny fraction of a second old, and it caused massive quantum fluctuations in the Higgs field, making it very unstable.

As the universe cooled, the Higgs field was locked into a particular non-zero positive value - the electroweak scale of about 246 GeV - in a process that is a lot like a phase change, such as water freezing into ice of a particular orientation or magnetic domains locking in place within cooling lava. This value is calculated using the electroweak (SU(2) x U(1)) gauge theory part of the Standard Model mathematics, and it in turn sets the scale for all masses in the Standard Model. It gives the mass of the W boson at around 80 GeV, for example, and that mass has been experimentally verified in a collider. At very high energies, however, such as when the universe had just come into existence and the electroweak force was still intact, the Standard Model no longer works. Here, there must be some modification to the Standard Model in order to describe physics at energies where gravity becomes as important as quantum mechanics (this is why the inside of a black hole is still a black box, so to speak), and this is expected to happen at around Planck mass, where the corresponding energy is around 10^19 GeV. At this energy, the Standard Model says the Higgs scalar boson itself should have Planck mass, and we know that it doesn't. It's at these high energies where the hierarchy problem with the Higgs scalar field becomes obvious. Below electroweak energy there is no problem, but at higher energy something must be in place to cancel out the quadratic divergences (that's where this huge mass comes from) in the mathematics of quantum mechanics. The mathematics of supersymmetry does just that. To quote Oxford University's online article on SUSY, " - a boson-fermion symmetry gives the scalar masses 'protection' from quadratically divergent loop corrections by virtue of being related by symmetry to the fermion masses, which [themselves] are protected by chiral symmetry."
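
The locking-in can be pictured with the textbook "Mexican hat" potential, V(phi) = -(mu^2/2)phi^2 + (lambda/4)phi^4, whose minimum sits away from zero at v = mu/sqrt(lambda). Here is a minimal sketch in Python that works backwards from the measured numbers to the potential's two parameters:

import math

v = 246.0    # GeV: measured electroweak vacuum expectation value
m_H = 125.0  # GeV: measured Higgs boson mass

# For this potential, v = mu/sqrt(lambda) and m_H = sqrt(2*lambda)*v, so:
lam = m_H**2 / (2 * v**2)
mu = math.sqrt(lam) * v
print("lambda ~ %.3f, mu ~ %.0f GeV" % (lam, mu))  # ~0.129 and ~88 GeV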

What is interesting here is that even at everyday energy, there is still significant vacuum energy fluctuation. This implies that the Higgs field in our current universe is not entirely stable. The quantum mechanics of the Standard Model, however, says that the universe should be at its lowest possible energy state - in a minimum potential energy well, in other words. The shape of this potential energy well can be determined from the scalar Higgs field. It means that the universe is at a local minimum energy, called a false vacuum, but not at its lowest possible energy (the true vacuum). The graph below describes the energy of the Higgs scalar field Φ in a false vacuum. The universe exists where the dot is located, in a long-lived but slowly eroding metastable state. The universe's energy E is higher than in the true vacuum, or ground state, to the left of it.

(Image: User:Stannered, Wikipedia)
Even though there may be a barrier between the two vacuum states, it could be very gradually eroded away by the effects of quantum tunnelling. The Higgs boson mass (125 GeV) may be very close to the boundary for stability. If the universe eventually falls into a true vacuum state, it would spontaneously explode in the process, releasing vast amounts of energy (representing the potential energy difference) as it does so. Put this way, supersymmetry protects the Higgs field from instability brought on by vacuum fluctuation, because it introduces a superpartner, the Higgsino, a fermion, and fermion masses are radiatively stable, unlike the scalar boson Higgs mass. Wikipedia puts it this way: "Supersymmetry removes the power-law divergences of the radiative corrections to the Higgs mass." This means that even at high energy, the Higgs mass stays the same. The Higgsino, through its interaction with the Higgs boson, stabilizes the Higgs mass. It also works as a candidate for dark matter, as described earlier.

The hierarchy problem solution gives supersymmetry great kudos, but it does not prove that supersymmetry is really a part of nature. There is also an alternative theory that stabilizes the Higgs field, called technicolor. Here, the Higgs boson is treated as an emergent particle of the strong (colour) force. The basic idea is that you repeat quantum chromodynamics (QCD - hopefully an article to come) at the Higgs electroweak energy scale. An additional stabilizing Higgs mechanism is produced by a new SU(6) gauge symmetry. There are still many technical problems with technicolor. This theory can accommodate the discovery of the Higgs boson, but it doesn't predict it.

3) Gauge Coupling Unification

The running together (renormalization) of the three gauge coupling constants - U(1) for the electromagnetic force photon, SU(2) for the three weak force bosons, and SU(3) for the eight strong force QCD gluons - isn't perfect in the Standard Model. The running is sensitive to particle content, and the present particle content of the model means that these three coupling constants don't quite meet up at a common energy scale - the scale of grand unification, where the electromagnetic and weak forces (combined into the electroweak force) would merge with the strong force. The incorporation of the MSSM into the Standard Model allows the coupling constants to converge at one point, at about 10^16 GeV. Some physicists consider this feat to be indirect evidence for both the MSSM and SUSY-based grand unification theories, but there are other (non-supersymmetry) mechanisms that can also do this job of convergence.
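
The near-miss, and the SUSY fix, can be reproduced with the standard one-loop running formula, 1/alpha_i(Q) = 1/alpha_i(M_Z) - (b_i/2pi) ln(Q/M_Z). A minimal sketch in Python using the textbook one-loop coefficients (the starting values at M_Z are approximate):

import math

M_Z = 91.19                    # GeV
alpha_inv = [59.0, 29.6, 8.5]  # 1/alpha for U(1), SU(2), SU(3) at M_Z (approximate)
b_SM = [41/10, -19/6, -7]      # one-loop coefficients, Standard Model
b_MSSM = [33/5, 1, -3]         # one-loop coefficients, MSSM

def run(b, Q):
    return [a - bi / (2 * math.pi) * math.log(Q / M_Z)
            for a, bi in zip(alpha_inv, b)]

Q = 2e16  # GeV, near the expected grand unification scale
print("SM:  ", run(b_SM, Q))    # the three couplings miss one another
print("MSSM:", run(b_MSSM, Q))  # all three land near ~24: they converge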

Where Does Supersymmetry Stand?

1) Where are the SUSY particles?

No supersymmetric particles have been discovered, even by smashing protons together at an energy approaching 8 TeV in the LHC. This energy is divided among the various quarks and gluons that make up each proton, which means a new particle of at least 1 TeV could be produced in the mixture of broken proton parts. The lightest theoretical SUSY particles should be right in this range and not much higher. This limit is, ironically, thanks to the recent Higgs boson discovery (as I write this, physicists are receiving the Nobel Prize in physics for it). The Standard Model puts the Higgs mass at the huge Planck mass, but supersymmetry brings its predicted mass right down to around 125 GeV, right where it was discovered to be. All particles of mass must interact with the Higgs boson via the Higgs field, and this interaction puts an upper limit on the masses of all the superparticles. If none are found at this energy, supersymmetry, at least in its present simplest form, must not be valid. Having to add corrections to the SUSY mechanism to account for a higher superparticle mass scale would seem to take the lustre off what was once its best quality - simplification of the Standard Model.

The LHC collider hasn't been running at this energy for long so there is still hope to see these elusive particles. It may also be possible to build entirely new theories, which still retain some SUSY components while accommodating slightly higher masses.

2) The Strange B-Meson: End of SUSY?

Even so, it is hard not to think of supersymmetry as a theory on life support, and recent collider data on the strange B meson seems to spell its demise even more direly.

A collider is built like a pipe that accelerates particles such as protons to close to the speed of light, shooting them straight at each other. Built around the pipe, where the collision occurs, are massive, sensitive detectors that look like super-thick insulation. These detectors pick up a tremendous amount of data about where every single particle goes, and there are many, many possible particles, each with its own momentum and trajectory. It requires an enormous amount of computer processing to recreate a single collision. To make matters more challenging, some collision results are extremely rare occurrences, so millions of collisions may be required in order to get significant data. There is sometimes only a very small probability that a particular decay chain will occur and particles specific to that decay will reveal themselves. An example of this is strange B meson decay. The strange B meson (predicted by the Standard Model) is of great interest to physicists. It undergoes spooky oscillations, and its very occasional decay into two particles called muons gave researchers direct evidence for CP violation. When two protons collide, the strange B meson itself shows up only occasionally, and when it does, it only very rarely decays into two muon particles (μ+ and μ-).

The Standard Model predicts it will decay into a muon pair only once every 280 million times. What makes this occurrence so rare is that this decay depends on the fleeting appearance of certain virtual particles during the collision. The results from the LHC closely match this prediction, at once every 310 million times. Supersymmetry predicts a far higher rate - by a factor of about 100 - because superparticles should also be present, influencing the decay. This delivers a blow to supersymmetry's credibility. But it may not be as fatal as some physicists claim, because the error margin in the statistical agreement between the observed rate and the predicted SM rate is quite large (though acceptable), and also because in the most stripped-down version of SUSY (with the fewest particle couplings), strange B meson decay to muons is suppressed enough to agree with experimental results. The LHC, the only collider with enough energy to produce strange B mesons, is now undergoing an upgrade so that it can run at up to 13 TeV. Hopefully then, strange B meson data and the presence or absence of the N1 neutralino will either prove or disprove SUSY. In the meantime, there is a real edge-of-the-seat feeling waiting for whatever kind of physics is hiding around the corner.
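
To get a feel for how rare that is, a quick estimate in Python (the target of 100 observed decays is an arbitrary choice for illustration):

branching_ratio = 1 / 280e6  # Standard Model: ~1 decay to muons per 280 million
target_decays = 100          # an arbitrary, statistically useful sample
print("%.1e strange B mesons needed" % (target_decays / branching_ratio))  # ~2.8e10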

Supersymmetry Hints at String Theory

Supersymmetry was incorporated into another particle theory, string theory, in the 1970's. String theory at that time described only bosons; incorporating the mathematics of supersymmetry allowed for the inclusion of fermions as well. Supersymmetry also performs a feat reminiscent of the hierarchy solution by cancelling out certain terms and therefore simplifying string theory's equations. Without supersymmetry, string theory contains awkward inconsistencies such as infinite values and imaginary energy levels. Unlike supersymmetry, string theory by its nature operates up to the Planck scale, making particle "strings" far too tiny to detect directly. However, superparticles fit comfortably into string theory. Here, the breaking of supersymmetry (as well as other symmetries) can be modeled as higher-energy strings losing energy and transforming into strings with lower-energy vibrations. If a superparticle is eventually detected, it would lend proof not only to supersymmetry but offer tremendous support for string theory as well. Perhaps string theory's best attribute is that it readily lends a theoretical framework for quantum gravity by combining quantum field theory with general relativity. Doing so introduces a massless spin-2 string state called the graviton.

While the charm of string theory is obvious even to laypeople like myself, like supersymmetry it needs some evidence to back it up, such as the discovery of a superparticle or some evidence for the extra dimensions that string theory specifies. Two mathematically beautiful theories - supersymmetry and string theory - are now at the mercy of what the LHC will find when it comes back online.

What is certain is that some kind of "new physics," physics that goes beyond the Standard Model, MUST exist somehow, waiting in the wings.

*ASIDE: These concepts are not even close to easy for most of us. There is just no way to visualize a spinor, for example. When reading about gauge theory, I could sort of visualize how three-dimensional space is built up from a grid of three-dimensional vectors or even tensors, but not from spinors. I could also faintly grasp in my imagination how this grid transforms, at least in three spatial dimensions, but there is no way to do that with a spinor grid or matrix. However, the mathematics describes precisely what's going on. The spinor transformation is just a very simple example of how mathematics takes our understanding to unfathomable places, places that are actually impossible in our everyday three-dimensional world. It's a little bit terrifying letting go of intuition and allowing math to take the reins. I hope that you have felt this thrilling spark of wonder as well. To paraphrase Galileo's famous statement, if God has written the poetry of this universe, it must certainly be written in the language of math.