Friday 26 December 2008

It's All Decay

Contrary to what you might think from the title, this is not about Christmas. This is yet another PAMELA/ATIC related post. If it continues like that I'll have to rename the blog from Resonaances to Bumps.

By now you know the story too well: PAMELA and ATIC have observed an excess of cosmic-ray positrons that may or may not be a manifestation of TeV-scale particles that constitute dark matter. Most of the subsequent theoretical activity has focused on explaining the signal via dark matter annihilation. However, there have also been a number of papers pursuing a different scenario in which the dark matter particle is unstable, and the excess positrons are produced when it decays. In fact, there are quite good reasons, both theoretical and phenomenological, to seriously consider this possibility.

On the theoretical side, there are some difficulties with the annihilation scenario. If the dark matter is a thermal relic, the inferred annihilation rate is some 100 times too small to explain the positron excess. In order to boost the annihilation rate today one needs some awkward modeling. One possibility is to assume that the dark matter distribution today is very inhomogeneous, and the average annihilation rate is boosted due to the existence of high-density regions. Another trick is to cook up new forces mediated by some mysterious 1 GeV particles. Of course, one could also drop the assumption of thermal equilibrium and invoke some non-thermal mechanism to explain the present dark matter abundance, in which case the annihilation rate can be high enough.
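
For orientation, the factor of 100 quoted above is just the ratio of two cross sections. A back-of-the-envelope sketch with round numbers of my choosing (not taken from any of the papers):

  # Boost factor needed to reconcile a thermal-relic annihilation cross section
  # with the positron signal; both numbers are round, illustrative values.
  sigma_v_thermal = 3e-26   # cm^3/s, canonical thermal-relic cross section
  sigma_v_needed = 3e-24    # cm^3/s, ballpark value required by TeV-scale fits
  print("required boost ~ %.0f" % (sigma_v_needed / sigma_v_thermal))   # ~ 100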

But who cares about theory? In the end, witty theorists can always find a way out. Indeed, the approach called I-will-fit-PAMELA-to-my-cherished-model-against-all-odds is very popular these days. However, there are purely phenomenological reasons to reserve some skepticism toward the annihilation scenario. That's what can be inferred from the recent paper by Bertone et al.

The observation is that production of 1 TeV electrons and positrons inevitably leads to production of high-energy photons via the bremsstrahlung process. Thus, annihilation of dark matter should lead not only to a positron excess but also to a gamma-ray excess. The best limits on the cosmic gamma-ray flux are set by the Namibia-based Cherenkov telescope called HESS (in homage to Victor Hess, the discoverer of cosmic rays), which covers the 100 GeV - 100 TeV energy range. It turns out that, assuming the annihilation hypothesis, the parameter space suggested by PAMELA and ATIC (the red region in the plot) is incompatible with HESS. A word of caution is in order here. The results of that analysis depend on the dark matter density profile, for which we have only more or less educated guesses. The plot I included here assumes the most popular NFW profile, whereas the bounds are less severe if the density profile is less steep than NFW in the region close to the galactic center. For example, using the Einasto profile (which seems to be preferred by numerical simulations) the bounds are weaker and the PAMELA/ATIC region is only marginally excluded, while for the isothermal profile (less preferred by numerical simulations) the PAMELA/ATIC region is marginally allowed. It is fair to say, however, that there is a tension between the annihilation interpretation of the positron excess and the gamma-ray data. Moreover, observations in radio waves (which should be produced by the synchrotron radiation of the positrons) also seem to be incompatible with the annihilation scenario.


On the other hand, this tension disappears if the PAMELA/ATIC results are explained by an unstable dark matter particle with a lifetime of order $10^{26}$ seconds and a mass of order 2 TeV. The plot, taken from this paper, shows that the PAMELA/ATIC region safely satisfies the HESS and radio bounds, even for the NFW profile. The simple reason is that the decay rate depends on the dark matter density as $\rho^1$, unlike the annihilation rate which depends on $\rho^2$. Thus, the growth of the decay rate toward the galactic center is effectively less steep.
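
The difference is easy to see numerically by comparing the line-of-sight integrals of $\rho$ (decay) and $\rho^2$ (annihilation) toward the galactic center. A minimal Python sketch with an NFW profile; the halo parameters are typical Milky Way values, used purely for illustration:

  import numpy as np

  # NFW profile rho(r) = rho_s / ((r/r_s)(1 + r/r_s)^2); illustrative parameters.
  rho_s, r_s, R_sun = 0.26, 20.0, 8.5   # GeV/cm^3 (roughly), kpc, kpc

  def rho(r):
      x = r / r_s
      return rho_s / (x * (1 + x)**2)

  def los_integral(psi, power, smax=100.0, n=100000):
      # integrate rho^power along the line of sight at angle psi from the GC
      s = np.linspace(1e-4, smax, n)
      r = np.sqrt(R_sun**2 + s**2 - 2 * R_sun * s * np.cos(psi))
      return np.sum(rho(r)**power) * (s[1] - s[0])

  # signal toward the center relative to 30 degrees away from it
  for power, label in [(1, "decay, rho"), (2, "annihilation, rho^2")]:
      ratio = los_integral(np.radians(0.5), power) / los_integral(np.radians(30), power)
      print("%s: center/30deg ~ %.0f" % (label, ratio))

The annihilation signal piles up toward the center much faster than the decay signal, which is the whole point.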

Coming back to theory, a very long-lived unstable particle is by no means unusual (I mean in theory; in practice we haven't seen any so far). If the (meta-)stability of the dark matter particle is due to some accidental global symmetry, it is natural for this symmetry to be broken at some high scale. That is to say, the symmetry is broken by higher-dimensional non-renormalizable operators suppressed by that high scale, and these operators could be responsible for the slow decay. It was pointed out here that a lifetime of $10^{26}$ seconds for a mass of 1 TeV is compatible with dimension-six operators suppressed by the GUT scale. Which is inspiring... Note that, by exactly the same token, we expect that the global baryon symmetry is broken at the GUT scale and that protons are long-lived but eventually decay.
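
One can check the numerology: a dimension-six operator gives a width $\Gamma \sim m^5/8\pi M_{GUT}^4$, and plugging in the numbers indeed lands in the right ballpark (order-one factors dropped; a sketch, not a derivation):

  import math

  # Naive lifetime of a TeV-scale particle decaying through a dimension-six
  # operator suppressed by the GUT scale; order-one factors are ignored.
  m, M_gut = 1e3, 2e16      # GeV: dark matter mass and GUT scale
  hbar = 6.58e-25           # GeV*s, to convert a width into a lifetime
  gamma = m**5 / (8 * math.pi * M_gut**4)       # decay width in GeV
  print("lifetime ~ %.0e s" % (hbar / gamma))   # ~ 10^26-10^27 s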

HESS has the potential to improve the bounds, or to see the gamma-ray signal if it lurks around the corner. Unfortunately, astrophysicists are more interested in astrophysical backgrounds. It seems that we need to wait for GLAST-now-FERMI to learn more. Unless... If you're a pirate off the African coast reading this blog, here's the plan for you: 1) hijack the HESS crew, 2) force them to point the telescope into "nothing", 3) submit the results to ArXiv. The ransom will be paid in Stockholm.

Friday 19 December 2008

Christmas Play '09

This blog is no longer from CERN but it'll take time to shake off the nostalgia... The end of the year at CERN TH is traditionally marked by the Christmas play. This year's play is focused on the two events that recently sent a shudder through the planet: the credit meltdown at Wall Street and the LHC meltdown at CERN. Detective Holes (Ellis) investigates the connection between the two. Although the script falters at times, there are enough good gags to take you through the 37 minutes of the play. A few must-sees include the Hawaiian nuts (Giddings and Mangano) hula-hooping, DG Cauchemar (Grojean) as Louis XIV/XV, and Evans the Accelerator (Lesgourgues) helium-squeaking. Also starring Fat Wall Street Bastard and Carla Bruni. For more inquisitive minds, there is something extra about the dimensions of Gia Dvali. Enjoy.


If it does not work, try downloading the movie from this page.

Monday 15 December 2008

Hitchhiker's Guide to Anomalies in Astroparticle Physics

I recently wrote a related post in which I collected anomalous experimental results in particle physics. That list was pathetic. I had to search deep to find a few results worth mentioning. None of those items could pass for a convincing argument for physics beyond the Standard Model. It is very likely that all of those anomalies are in fact unaccounted-for systematic errors, or bread-and-butter physics improperly understood.

The situation in astrophysics is completely different. Almost every respectable experiment can boast of an unexplained excess, a mysterious bump or a striking anomaly. Part of the reason is that, in astrophysics, backgrounds are often as good as unknown while the error bars are estimated by throwing dice. But, hopefully, this is not the whole story. In the end, the only clear evidence for physics beyond the Standard Model comes from astrophysical observations that have established the existence of dark matter. There is actually more dark than ordinary matter in the sky, and it is quite likely that some of the puzzling results below are in fact messages from the dark sector.

Here is a collection of astroparticle anomalies directly or indirectly related to dark matter searches. In my subjective order of relevance.

1. PAMELA

It finally happened: SUSY is no longer the favorite hottie, all eyes are now on PAMELA (although some attempt dating both). PAMELA is a satellite experiment that measures the cosmic flux of anti-protons and positrons. While the former flux is roughly consistent with theoretical estimates, the positron flux displays a steep rise at energies above 10 GeV, contrary to expectations based on the secondary production of positrons by cosmic rays scattering on interstellar matter. The simplest interpretation of the PAMELA excess is that the background is not properly estimated, and for the moment this remains a perfectly viable option. Another possibility is that the positron spectrum is contaminated by a nearby astrophysical source like a pulsar or a micro-quasar. Finally, the excess could be a manifestation of dark matter.
The PAMELA positron excess can be explained by a dark matter particle that is heavier than 100 GeV and annihilates preferentially into the Standard Model leptons, with the annihilation into hadrons suppressed down to 10 percent or less. A slightly more exotic scenario is that dark matter is not stable but decays into leptons, which amounts to pretty much the same thing from the point of view of indirect detection.

2. ATIC

If one naively continues the slope of the PAMELA spectrum beyond 100 GeV, the positron fraction above a few hundred GeV becomes of order one. That energy range is probed by ATIC - a balloon experiment detecting cosmic electrons and positrons (without being able to distinguish the two). And indeed, ATIC observes an excess of electrons and positrons at energies between 100 and 800 GeV. The size of the effect nicely fits with the PAMELA excess, and it is very likely that both observations have a common origin (whether it is dark matter or not). Moreover, ATIC observes a clear feature in the spectrum - a bump around 600 GeV followed by a sharp decline above 800 GeV (the latter recently confirmed by HESS). If these features are indeed signals of dark matter, the ATIC observation pinpoints the mass scale of the dark matter particle at around 1 TeV. Good news for the LHC.

It may be worth mentioning that the ATIC peak is inconsistent with another experiment called EC, which studied a similar energy range but found no excess. On the other hand, ATIC is consistent with the results from PPB-BETS, but that experiment is generally dismissed due to its miniature size (it was manufactured by the Japanese). The rumor is that the new ATIC-4 data will confirm the peak and reduce the error bars by a factor of two.

3. HAZE

The WMAP satellite made its name studying the primordial microwave spectrum produced at the early hot stage of the Universe. The microwave emission from our galaxy is an annoying background (or foreground, depending on which way you look) and has to be carefully studied too. Our galaxy pollutes the CMB via thermal dust emission, thermal bremsstrahlung, synchrotron radiation and spinning dust. Subtracting these known contributions revealed the presence of an additional component that extends some 30 degrees around the galactic center. This excess can be interpreted as the synchrotron radiation of electrons and positrons produced by dark matter in the galactic center (that's where the dark matter density is largest). By itself, the Haze is maybe not overwhelming evidence for dark matter, but in the light of PAMELA and ATIC it is another indication that too many positrons and/or electrons are flying around. Besides, it has a cool name.

4. EGRET

EGRET was a space telescope that studied diffuse gamma-ray emission in the 30 MeV - 100 GeV range. An excess of emission from the galactic center at energies between 10 and 50 GeV was claimed in this paper. The excess can be interpreted as another manifestation of dark matter annihilation or decay that produces high-energy electrons and positrons. The latter produce high-energy photons via inverse-Compton scattering on starlight or on the microwave background.


5. INTEGRAL

The INTEGRAL satellite detected the 511 keV gamma-ray line from the galactic center. Photons carrying 511 keV of energy arise from e+e- annihilation at rest. If dark matter annihilation is the origin of this line, the dark matter particle must have rather non-trivial properties to produce electrons and positrons nearly at rest: either its mass is in the MeV range, or it has an excited state with a 1-2 MeV splitting. The most recent results from INTEGRAL have weakened the case for dark matter. The new observations display an asymmetry of the emission with respect to the central axis of the galaxy which seems to be correlated with the distribution of low-mass X-ray binaries - systems including a neutron star or a black hole that accretes matter from its companion. At this point a conventional astrophysical explanation seems far more likely, but the case is not closed yet.

6. DAMA

All the previous observations, when interpreted in terms of dark matter, fall into the class of indirect detection, that is, observations of the final products of dark matter annihilation or decay. The complementary technique, called direct detection, consists in searching for signals of dark matter particles scattering on a target made of ordinary Standard Model particles. There are many direct detection experiments going on: CDMS, XENON, CRESST, DAMA, to name a few active ones. The last one actually claims a detection. Unfortunately, this is the one we trust the least. The reason is that DAMA's detection technique cannot effectively distinguish dark matter particles from a huge background of ordinary particles scattering on the target. Instead, the claim is based on observing an annual variation of the signal, which may be induced by a variation of the dark matter flux due to the motion of the Earth around the Sun. The size of the effect observed by DAMA is however in conflict with other direct detection experiments, unless the dark matter particle has some contrived properties (for example, an excited state with a 100 keV splitting). Another viable interpretation of the DAMA signal is that the Italian mafia dumps radioactive waste near Gran Sasso every year in June. Or there is some other regular effect with an annual period. Most likely, DAMA will share the fate of LSND: we will never know what went wrong.
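
For the record, the modulation analysis fits the rate in each energy bin to a cosine with a one-year period; in standard halo models the rate peaks around June 2, when the Earth's velocity adds maximally to the Sun's motion through the halo. A toy version of the fitted form, with made-up amplitudes:

  import numpy as np

  # Toy annual-modulation signal S(t) = S0 + Sm*cos(2*pi*(t - t0)/T).
  # S0 and Sm are invented for illustration; t0 ~ June 2 in the standard halo model.
  S0, Sm = 1.0, 0.02       # average rate and modulation amplitude (arbitrary units)
  T, t0 = 365.25, 152.5    # period in days; phase corresponding to June 2

  def rate(t):
      return S0 + Sm * np.cos(2 * np.pi * (t - t0) / T)

  print("June 2: %.3f, Dec 2: %.3f" % (rate(152.5), rate(335.0)))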

That's it. As homework, try to fit all six anomalous results into a single theory of dark matter (solution here). But be careful. A wise man once said that if a theory can explain all experimental results then it is certainly wrong. Because some experiments are always wrong.

Wednesday 10 December 2008

ATIC ATAC

I'm slowly recovering from the shock of starting a new life in a new time zone in a new haircut. Time to kick off with no-longer-from-CERN blogging. As a warm-up, I have an overdue rant. Some inspiration came from Tommaso's post, which shares a few warm remarks about theorists teaching experimentalists. Here I elaborate on the opposite case.

Nature magazine recently published a paper from the ATIC collaboration. ATIC is a balloon-borne experiment that studies high-energy electrons and positrons (they cannot distinguish the two) coming from the cosmos. Many of these electrons and positrons are created by known astrophysical processes, mainly by cosmic rays scattering on interstellar matter (the secondary production). Astrophysicists can roughly estimate the flux due to the secondary production, although these estimates are subject to many uncertainties and should be taken with a whole container of salt. Anyway, assuming that the background estimates are correct, ATIC observes an excess of electrons and positrons at energies between 100 and 800 GeV. This fits well with the positron excess between 10 and 100 GeV reported recently by the PAMELA satellite. Things are even more interesting. Rather than a mere excess, ATIC sees a distinct feature in the spectrum: a bump between 300 and 800 GeV. Astronomers are excited because this could be a signature of an interesting astrophysical object (a young nearby pulsar? a microquasar?). Particle physicists are even more excited because the PAMELA and ATIC observations could be the first clear signals of annihilating or decaying dark matter particles with TeV-scale masses.

The results from ATIC may turn out to be an important piece in the puzzle of the nature of dark matter. However, the collaboration must have considered their results so uninteresting that they had to provide us with a cavalier theoretical interpretation. The signal is interpreted in the context of the so-called LKP - the lightest Kaluza-Klein particle in the Universal Extra Dimensions (UED) scenario. The names of Kaluza and Klein appear in the abstract and $e^N$ times in the main text. In case you missed it, they stress in the conclusions that "if the Kaluza–Klein annihilation explanation proves to be correct, this will necessitate a fuller investigation of such multi-dimensional spaces, with potentially important implications for our understanding of the Universe." All in all, ATIC claims to have found in their data some hints of the presence of extra dimensions of spacetime. What is their reason for such an extraordinary claim? It looks like they applied the modern tertium non datur: whatever is not the MSSM must be the UED.

That's philosophy. Physics, on the other hand, does not support ATIC's interpretation. From the theoretical point of view one can complain that the UED is an artificial and poorly motivated construction, that it does not address any problems of the SM (except for dark matter) while creating new problems of its own, and so on. But the main point here is that the LKP interpretation is not consistent with all available experimental data. Firstly, the LKP annihilation cross section is not large enough to explain the ATIC and PAMELA signals (if the dark matter abundance is of thermal origin). ATIC shrugs this off with a bla bla, where the former bla stands for a boost factor of 100 from dark matter clumpiness (which does not come out of numerical simulations) and the latter for "other kids do it too". There is yet another serious problem that has to do with the fact that PAMELA observes no excess in cosmic anti-protons. Even though the LKP couples more strongly to leptons than to quarks (because the former have larger hypercharges, to which the LKP couples), the annihilation rate into hadrons in the UED is still far too large. This issue is not addressed at all in ATIC's paper because it was submitted before PAMELA's data came out.

To summarize, it's such a pity to mar a beautiful experiment with a crappy theory.

Tuesday 2 December 2008

Expelled

Today I left CERN. I can never stay long in paradise - I always get expelled for some ridiculous reason. This time it might be because
  • I lost hope that the LHC would be running soon
  • I spilled coffee on Witten the other day at CosmoCoffee
  • I discovered that John Ellis' beard is fake
  • My contract has expired
Whatever the reason, Luis (depicted above as an archangel) has told me to leave. This means I cannot run a blog from CERN anymore. From now on this is a no-longer-from-CERN blog. Reports from CERN seminars will give place to general broodings about particle physics. However, CERN-gossiping should continue thanks to the spy network I have established over the last years.

I suppose blogging will be perturbed until I settle in a new vacuum.

Tuesday 25 November 2008

LHC'09

Several interesting facts concerning the LHC are available on the slides from a recent talk by Jörg Wenninger. The talk describes in some detail the events between the glorious first beam on September 10 and the fatal accident on September 19. It explains what caused the accident, describes the steps already taken to avoid similar problems in the future, and presents options for the repair schedule.

For those less interested in technical details, this page is the most relevant one:

In short, operation will be restarted in late summer 2009 if they decide not to implement the full safety upgrade program. In that case, probably, both the energy and the luminosity will be smaller than previously assumed. On the other hand, if they opt for a full upgrade of the pressure relief system (which implies warming up all sectors) there will probably be no beam in 2009.

Thursday 20 November 2008

witten@cern

Edward Witten appeared here at CERN around the time of the first LHC beam. The coincidence suggests that he might have been created in a particle collision. That is, however, unlikely, since the entropy of one Witten is huge, even larger than that of a dragon (not to mention the fact that there were never any collisions at the LHC). In view of that, a more plausible explanation is that Edward is spending his sabbatical at CERN. My allusion to dragons was not entirely off-topic though, because Edward's presence seems to provoke awe and fear among the local string folk. I haven't yet quite discovered why, but the story goes that around lunchtime each day they lock their offices, close the shutters, and hide in fireproof drawers.

Anyway, people out there don't want to know how Witten is doing, but what he is doing, so back to work... Last Tuesday Edward gave a talk entitled M2-Branes With Half Supersymmetry. The topic is far beyond my expertise and you should not expect any insight from me. I will try to summarize the main points, although it feels like reciting the Bhagavad Gita in the original.

Edward considers 11-dimensional M-theory in the background $AdS_4 \times S^7/(Z_n \times Z_m)$, which can be obtained as the near-horizon geometry of a stack of M2 branes. This background preserves half of the original supersymmetry, which corresponds to N=4 supersymmetry in 4D. The two discrete orbifold symmetries act on separate SO(4) components of the SO(8) symmetry group of $S^7$. There are two orbifold fixed points, which are $A_{n-1}$ and $A_{m-1}$ singularities on which $SU(n)$ and $SU(m)$ gauge theories live. Sitting at the $Z_n$ fixed point we see the $SU(n)$ gauge theory in $AdS_4 \times S^3/Z_m$ and, analogously, at the $Z_m$ fixed point we see the $SU(m)$ gauge theory in $AdS_4 \times S^3/Z_n$.

Edward argues that there are several interesting facts about this set-up:

  • The theory has a huge landscape of vacua that can be parametrized by elements x of SU(n) satisfying the condition $x^m = 1$ (because of the orbifolding) and elements y of SU(m) satisfying $y^n = 1$. There are $\binom{n+m-1}{n} \cdot \binom{n+m-1}{m}$ such elements, so that the number of vacua grows factorially with n and m. It is surprising that so many vacua are encountered in a set-up with such a large amount of supersymmetry.
  • One can view the M2 branes as SU(m) instantons on $R^4/Z_n$ or, equivalently, as SU(n) instantons on $R^4/Z_m$. For some reason, the former point of view is called the Higgs branch, while the latter is called the Coulomb branch.
  • String theorists have their ways to count the number of instantons via D-brane configurations sitting at an orbifold point and the effective description in terms of quiver theories. Here, the quiver diagram for the Higgs branch contains the chain $SU(m) \to SU(m_0) \times \dots \times SU(m_{n-1})$ and $U(p) \to U(p_0) \times \dots \times U(p_{n-1})$ linked by bi-fundamental matter, where $p$ is the number of SU(m) instantons. Similarly, the Coulomb branch has the quiver with $SU(n) \to SU(n_0) \times \dots \times SU(n_{m-1})$ and $U(\bar p) \to U(\bar p_0) \times \dots \times U(\bar p_{n-1})$.
  • The integers m and n have a clear M-theory interpretation but the numbers of instantons p and $\bar p$ do not. But Gaiotto and Witten recently demonstrated the existence of a mirror symmetry that relates n and p, and also m and $\bar p$. This mirror symmetry allows one to describe both Higgs and Coulomb branches of M-theory.

This is it. I did not attempt to explain the physics but just to give a flavor of what Edward is brooding on these days. And don't ask me about the applications. God knows.

Sunday 16 November 2008

Hitchhiker's Guide to Ghosts and Spooks in Particle Physics

On Halloween this year the CDF collaboration at Fermilab's Tevatron announced the presence of ghosts in their detector. And not just one meager Poltergeist rattling his chain, but a whole hundred-thousand army. As of today, the ghosts have not been exorcised as systematic effects. While waiting for theorists to incorporate the ghosts into their favorite models of new physics, it is good to know that the CDF anomaly is by no means the only puzzling experimental result in our field. There are other ghosts at large: I guess most of them are due to unknown systematic errors, but some may well be due to new physics. Below I pick a few anomalous results, in subjective order of relevance. The list is not exhaustive - you are welcome to complain about any missing item.

So, off we go. In this post I restrict myself to collider experiments, leaving astrophysics for a subsequent post.

Muon Anomalous Magnetic Moment

This experimental result is very often presented as a hint of physics beyond the Standard Model. For the less oriented: there is nothing anomalous in the anomalous magnetic moment itself - it is a well-understood quantum effect attributed to virtual particles. But in the muon case, theoretical predictions slightly disagree with experiment. The E821 experiment in Brookhaven measured $a_\mu = (11 659 208 \pm 6)\cdot 10^{-10}$. The Standard Model accounts for all but $28\cdot 10^{-10}$ of the above, which represents a 3.4 sigma discrepancy.
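
The significance follows from combining the experimental and theoretical errors in quadrature. The theory uncertainty in the sketch below is my assumption (it is not quoted above), picked to reproduce the 3.4 sigma figure:

  import math

  # Pull of the muon g-2 discrepancy; the theory error is an assumed value.
  delta = 28e-10      # experiment minus theory
  err_exp = 6e-10     # E821 experimental error
  err_th = 5.6e-10    # assumed theory error (chosen to match the quoted 3.4 sigma)
  print("discrepancy ~ %.1f sigma" % (delta / math.hypot(err_exp, err_th)))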

The discrepancy can be readily explained by new physics, for example by low-energy supersymmetry or by new light gauge bosons mixing with the photon. But there is one tiny little subtlety. The Standard Model prediction depends on low-energy QCD contributions to the photon propagator that cannot be calculated from first principles. Instead, one has to use some experimental input that can be related to the photon propagator using black magic and dispersion relations. Now, the discrepancy between theory and experiment depends on whether one uses low-energy e+e- annihilation or tau decays as the experimental input. The quoted 3.4 sigma arises when the electron data are used, whereas the discrepancy practically disappears when the tau data are used. It means that some experimental data are wrong, or some of the theoretical methods employed are wrong, or both.

In the near future, a certain measurement may help to resolve the puzzle. The troublesome QCD contribution can be extracted from a process studied at BaBar, in which a virtual photon decays into two pions (+ initial state radiation). There are rumors that the preliminary BaBar results point to a larger QCD contribution (consistent with the tau data). This would eradicate the long-standing discrepancy in the muon anomalous magnetic moment. But, at the same time, it would imply that there is a flaw in the e+e- annihilation data, which would affect other measurements too. Most notably, the electron data are used as an input in determining the hadronic contribution to the electromagnetic coupling, which is one of the key inputs in fitting the Standard Model parameters to electroweak observables. As pointed out in this paper, if the low-energy QCD contribution were larger than implied by the electron data, the central value of the fitted Higgs boson mass would decrease. Currently, the electroweak fit determines the Higgs boson mass as $77^{+28}_{-22}$ GeV, which already sits uncomfortably below the 114 GeV direct search limit. Larger QCD contributions consistent with the tau data would increase this tension. Interesting times ahead.

Forward-Backward Asymmetry

CERN's LEP experiment has been desperately successful: it beautifully confirmed all theoretical predictions of the Standard Model. The mote in the eye is called $A_{FB}^b$: the forward-backward asymmetry in decays of the Z-boson into b-quarks. This observable measures the asymmetry between the Z-boson interactions with left-handed b-quarks and right-handed ones. The results from LEP and SLD led to a determination of $A_{FB}^b$ that deviates by 3 sigma from the Standard Model prediction. On the other hand, the total decay width of the Z-boson into b-quarks (summarized in the so-called Rb) seems to be in good agreement with theoretical predictions.

One possible interpretation of these two facts is that the coupling of the Z-boson to right-handed b-quarks deviates from the Standard Model, while the left-handed coupling (which dominates the measurement of Rb) agrees with the Standard Model. At first sight this smells like tasty new physics - the Zbb coupling is modified in many extensions of the Standard Model. In practice, it is not straightforward (though not impossible) to find a well-motivated model that fits the data. For example, typical Higgsless or Randall-Sundrum models predict large corrections to the left-handed b-quark couplings, and smaller corrections to the right-handed b-quark couplings, contrary to what is suggested by the electroweak observables.

Maybe this discrepancy is just a fluke, or maybe this particular measurement suffers from some systematic error that was not taken into account by the experimentalists. But the funny thing is that this measurement is usually included in the fit of the Standard Model parameters to the electroweak observables because... it saves the Standard Model. If $A_{FB}^b$ were removed from the electroweak fit, the central value of the Higgs boson mass would go down, leading to a large tension with the 114 GeV direct search limit.

Bs Meson Mixing Phase

The results from BaBar and Belle led to one Nobel prize and zero surprises. This was disappointing, because the flavor-changing processes studied in these B-factories are, in principle, very sensitive to new physics. New physics in sd transitions (kaon mixing) and bd transitions is now tightly constrained. On the other hand, bs transitions are less constrained, basically because the B-factories were not producing Bs mesons. This gap is being filled by the Tevatron, which has enough energy to produce Bs mesons and study their decays into J/psi. In particular, the mass difference of the two Bs eigenstates was measured, and a constraint on the phase of the mixing could be obtained. The latter measurement showed some deviation from the Standard Model prediction, but by itself it was not statistically significant.

Later in the day, the UTfit collaboration combined the Bs meson data with all other flavor data. Their claim is that the Bs mixing phase deviates from the Standard Model prediction at the 3 sigma level. This could be a manifestation of new physics, though it is not straightforward to find a well-motivated model where the new physics shows up in bs transitions, but not in bd or sd transitions.

NuTeV Anomaly

NuTeV was an experiment at Fermilab whose goal was a precise determination of the ratio of neutral-current to charged-current reactions in neutrino-nucleon scattering. Within the Standard Model, this ratio depends on the Weinberg angle $\theta_W$. It turned out that the value of the Weinberg angle extracted from the NuTeV measurement deviates at the 3 sigma level from other measurements.
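
More precisely, what NuTeV measured is close to the textbook Paschos-Wolfenstein ratio (quoted here for orientation; corrections from the nucleon structure apply):
$R^- = \frac{\sigma^{\nu}_{NC} - \sigma^{\bar\nu}_{NC}}{\sigma^{\nu}_{CC} - \sigma^{\bar\nu}_{CC}} = \frac{1}{2} - \sin^2 \theta_W$.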

It is difficult to interpret this anomaly in terms of any new physics scenario. A mundane explanation, e.g. incomplete understanding of the structure of the nucleons, seems much more likely. The dominant approach is to ignore the NuTeV measurement.

HyperCP Anomaly

This measurement is sometimes mentioned in the context of the CDF anomaly, because the scales involved are somewhat similar. Fermilab's HyperCP experiment found evidence for decays of the Sigma+ hyperon (a kind of proton with one s quark) into a proton and two muons. This by itself is not inconsistent with the Standard Model. However, the signal was due to three events where the invariant mass of the muon pair was very close to 214 MeV in each case, and this clustering appears very puzzling.

The HyperCP collaboration proposed that this clustering is due to the fact that the hyperon first decays into a proton and some new particle with a mass of 214 MeV, and the latter particle subsequently decays into a muon pair. It is very hard (though, again, not impossible) to fit this new particle into a bigger picture. Besides, who would ever care about 3 events?

GSI Anomaly

For dessert, something completely crazy. The GSI accelerator in Darmstadt can produce beams of highly ionized heavy atoms. These ions can be stored for a long time and decays of individual ions can be observed. A really weird thing was noticed in a study of hydrogen-like ions of praseodymium-140 and promethium-142. The time-dependent decay probability, on top of the usual exponential time dependence, shows an oscillation with a 7 s period.
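
In formulas, the observed decay law has the form
$\frac{dN}{dt} \propto e^{-\lambda t} \left(1 + a \cos (\omega t + \phi)\right), \qquad \frac{2\pi}{\omega} \approx 7 {\rm \ s}$,
with the modulation amplitude $a$ and the phase $\phi$ fitted to the data.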

So far the oscillation remains unexplained. There were attempts to connect it to neutrino oscillations, but these have failed. Another crazy possibility is that the ions in question have internal excitations with a tiny mass splitting of order $10^{-15}$ eV.

Monday 10 November 2008

Hidden Valley Revealed?

As of today, the CDF anomaly has no convincing explanation. Strangely enough, HEP-ph is not flooded with new physics models (yet?), maybe because a down-to-earth explanation appears far more likely to most of us. The situation will clarify when D0 presents their own analysis of the analogous multi-muon events. I'm confident that D0 is working hard now, since the prospect of kicking the CDF butt (if the anomaly is a detector effect) must be tempting for them. While waiting for D0, one may wonder if there is a new physics scenario that could address the signatures reported by CDF. The so-called hidden valley scenario has been pointed to here and there, so I thought it would be useful to explain the idea behind that romantic name.

Hidden valley refers to a large class of theories which, apart from the Standard Model, contain a hidden sector with a low mass scale. Here, low means no more than 100 GeV; could be 10 GeV, could be 1 GeV... In order to explain why the new particles were not copiously produced at LEP, the hidden sector must be very weakly coupled to the SM. For example, the interactions between the Standard Model and the hidden valley might be mediated by a new U(1) gauge symmetry under which both sectors are charged. The Higgs boson could also be a mediator between the two sectors. If the mass of the mediator is large (more than, say, 100 GeV), then the interactions between the two sectors are very weak at low energies. This is illustrated in the picture on the right. While LEP did not have enough energy to climb over the potential barrier produced by the large mass of the mediator, the LHC is powerful enough to overcome the barrier and explore the new sector. The Tevatron is not in this picture because everybody thought it was already passé.

What makes up the hidden sector? Basically, the sky and your imagination are the limit. Below, I will talk about one particular scenario that is especially interesting from the point of view of collider studies. It might be that the hidden sector is described by a strongly interacting theory somewhat resembling our QCD. That is to say, there are hidden quarks confined by hidden strong forces that bind them into hidden mesons, hidden pions and similar stuff. A particle collision in our collider, after crossing the potential barrier, would produce a pair of hidden quarks which subsequently hadronize and cascade-decay to lighter hidden hadrons. But one crucial difference between our QCD and the hidden QCD is that the latter does not have a stable particle at the end of the decay chain (or at least, not all the decay chains end in a stable hidden particle), so that the hidden stuff eventually decays back into Standard Model particles. Because of the small interaction strength between the two sectors, some hidden hadrons may have a relatively long lifetime, which leads to highly displaced vertices in our detector - often with large multiplicities of soft particles in jets. Sounds familiar?

The hidden-valley scenario was proposed more than 2 years ago by Matt Strassler and Kathryn Zurek. They didn't have a particular motivation in mind other than exploring exotic collider signatures (although strongly interacting hidden sectors are common in supersymmetric model-building, no commandment forbids the mass scale in those sectors from being GeV-ish, and they could also host the dark matter particle). This approach represents the recent change of season in particle theory. In the old days, the particle folk touched only those models that were "strongly motivated" or "leading candidates". The outcome may prove valuable to posterity, especially to anthropologists. With the LHC approaching, the emphasis has shifted to interesting collider signatures, and fundamental motivations are no longer mandatory. It may well be that shooting at random will prove more successful than following our theoretical prejudices. In this respect hidden valleys have much in common with unparticles, which are similarly unmotivated. The reason why unparticles received much more attention is that they are quite sharply defined, and for this reason they are more comfortable as a bandwagon. On the other hand, hidden valley is a very wide concept, and by the time the model B-71 variant 69 is discussed, the audience is switching to online newspapers. But things may change soon, if the CDF anomaly doesn't go away...

It should be clear, however, that for the moment there is no hidden-valley model that would explain the CDF anomaly. The biggest problem is the large number of anomalous events reported by CDF. Given that CDF sees some $10^5$ anomalous events, the cross section for the production of the hidden valley particles should be larger than 100 pb. That's already a lot - much more than the standard Higgs production cross section at the Tevatron, and of a similar order of magnitude to the production of b-quark pairs. Moreover, the required cross section may be even more ridiculous if not all decays of the unknown particles go into muons. For example, in this attempt to fit the signal with 8-tau decays the estimate for the cross section is 100 nb. This seems to be at odds with the assumption that the hidden sector is very weakly coupled to the Standard Model. Furthermore, CDF sees no sign of resonant production, which would be expected if the mediator between the two sectors is not too heavy. Clearly, there's some work to do, for experimenters and theorists alike.
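
As an aside, the 100 pb figure is simple counting: $\sigma \approx N/(\epsilon L)$. The luminosity and efficiency below are round numbers of my own choosing (the precise values are in the CDF paper):

  # Cross section implied by the event count: sigma = N / (efficiency * luminosity).
  # Luminosity and efficiency are assumed round numbers, not CDF's actual values.
  N_events = 1e5         # anomalous multi-muon events
  lumi_pb = 2000.0       # ~2 fb^-1 of analyzed data, expressed in pb^-1 (assumption)
  eff = 0.5              # guessed overall acceptance times efficiency
  print("sigma ~ %.0f pb" % (N_events / (eff * lumi_pb)))   # ~ 100 pb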

Update: As if a blog post were not enough ;-), here is Matt's brand new paper discussing possible connections of the hidden-valley scenario to the CDF anomaly.

Friday 31 October 2008

On CDF Anomaly

Mhm, it seems that I chose the wrong side of the Atlantic. First, the LHC produces a big firework display instead of small black holes, and then the CDF collaboration at the Tevatron discovers new physics. About the CDF anomaly in multi-muon events, see Tommaso's post or the original paper. Together with a few CERN fellows we had an impromptu journal club today, and we have reached the conclusion that, well, we don't know :-). The anomaly occurs in a theoretically difficult region, the B-baryon spectrum is poorly known, the local Monte Carlo magicians are very sceptical about modeling the b-quarks, etc, etc. It does not mean, of course, that one should shrug it off. Whether we want it or not, the CDF anomaly will dominate particle model-building for the next few months.

Meanwhile, there is already one model on the market that, incidentally :-), looks relevant for the anomaly: the SuperUnified Theory of Dark Matter. One can immediately cook up $e^N$ variations of that model, but there seem to be three basic building blocks:
1) The "visible" sector that consists of the usual MSSM with the supersymmetry breaking scale $M_{MSSM} \sim$ few hundred GeV.
2) The dark sector with a smaller supersymmetry breaking scale $M_{dark} \sim $ GeV. It includes a dark gauge group with dark gauge bosons and dark gauginos, and a dark Higgs that breaks the dark gauge group and gives the dark gauge bosons a dark mass of order 1 dark GeV. In fact it's all dark.
3) The dark matter particle, which is charged under the dark group and has a large mass, $M_{DM} \sim $ TeV. Unlike in a typical MSSM-like scenario, dark matter is not the lightest supersymmetric particle, but rather some new vector-like fermion whose mass is generated in a similar fashion to the MSSM mu-term.

The dark group talks to the MSSM thanks to a kinetic mixing of the dark gauge bosons with the Standard Model photon, that is, via Lagrangian terms of the type $f_{\mu\nu} F^{\mu \nu}$. Such mixing terms are easily written down when the dark group is U(1), although for non-abelian gauge groups there is a way to achieve that too (via higher-dimensional operators). Once the dark gauge boson mixes with the photon, it effectively couples to the electromagnetic current in the visible sector. Thanks to this mixing, the dark gauge boson can decay into Standard Model particles.
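
Schematically, with a small mixing parameter $\epsilon$ (this is the standard textbook setup, written out for clarity):
$L \supset -\frac{1}{4} f_{\mu\nu} f^{\mu\nu} - \frac{1}{4} F_{\mu\nu} F^{\mu\nu} + \frac{\epsilon}{2} f_{\mu\nu} F^{\mu\nu}$.
After diagonalizing the kinetic terms, the dark gauge boson couples to the electromagnetic current as $\epsilon e A'_\mu J^\mu_{EM}$, so its decays back to charged Standard Model particles proceed at rates suppressed by $\epsilon^2$.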

The SuperUnified model is tailored to fit the cosmic-ray excesses seen by PAMELA and ATIC/PPB-BETS. The dark matter particle with a TeV-scale mass is needed to explain the positron signal above 10 GeV (as seen by PAMELA) all the way up to 800 GeV (as suggested by ATIC/PPB-BETS), see here. The dark gauge bosons with a GeV mass scale play a two-fold role. Firstly, they provide a long-range force that leads to the Sommerfeld enhancement of the dark matter annihilation rate today. Secondly, the 1 GeV mass scale ensures that the dark matter particle does not annihilate into protons/antiprotons or heavy flavors, but dominantly into electrons, muons, pions and kaons. The second point is crucial to explain why PAMELA does not see any excess in cosmic-ray antiprotons. Supersymmetry does not play an important role in the dynamics of dark matter, but it ensures "naturalness" of the 1 GeV scale in the dark sector, as well as of the electroweak scale in the visible sector. I guess that analogous non-supersymmetric constructions based, for example, on global symmetries and axions will soon appear on ArXiv.

What connects this model to the CDF anomaly is the prediction of "lepton jets". In the first step, much as in the MSSM, the hadron collider produces squarks and gluinos that cascade down to the lightest MSSM neutralino. The latter mixes into the dark gauginos, by the same token as the dark gauge boson mixes with the visible photon. The dark gaugino decays to the dark LSP and a dark gauge boson. Finally, the dark gauge boson mixes back into the visible sector and decays into two leptons. At the end of this chain we obtain two leptons with an invariant mass of order 1 GeV and a small angular separation, the latter being due to the Lorentz boost factor $\gamma \sim M_{MSSM}/M_{dark} \sim 100$.
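
The collimation is easy to quantify: the opening angle of the lepton pair is of order $2/\gamma$. A two-line check with the round masses quoted above:

  import math

  # Opening angle of a "lepton jet" from a boosted dark gauge boson decay;
  # Delta_theta ~ 2/gamma with gamma ~ M_MSSM/M_dark (round numbers assumed).
  M_mssm, M_dark = 100.0, 1.0   # GeV
  gamma = M_mssm / M_dark
  print("opening angle ~ %.1f deg" % math.degrees(2 / gamma))   # ~ 1 degree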

The perfect timing of the "lepton jets" prediction is unlikely to be accidental. A new spying affair is most welcome, now that the paparazzi affair seems to be dying out. While waiting for CDF to find the traitor and hang him from the highest pole, I keep wondering if the SuperUnified model does indeed explain the CDF excess. If you take a look at the invariant mass distribution of the anomalous muon-pair events (right panel), it does not resemble a 2-body decay of a narrow-width particle (for comparison, admire the J/Psi peak in the left panel), which it should if the muons come from a decay of the dark gauge boson. Or am I missing something? Furthermore, it has been experimentally proved that bosons are discovered in Europe, while only fermions can be discovered in the US. This is obviously inconsistent with the Tevatron finding the dark gauge boson ;-)

Thanks to Bob, Jure and Tomas for the input.
See also Lubos' post on the SuperUnified model.
For more details and explanations on the CDF anomaly, see the posts of Peter and Tommaso and John.

Thursday 30 October 2008

PAMELA's coming-out

Yesterday, PAMELA finally posted on ArXiv her results on the cosmic-ray positron fraction. In recent months there was a lot of discussion whether it is right or wrong to take photographs of PAMELA while she was posing. Here at CERN, people were focused on less philosophical aspects: a few weeks ago Marco Cirelli talked about the implications for dark matter searches, and Richard Taillet talked about estimating the positron background from astrophysical processes in our galaxy. Finally, PAMELA had her coming-out seminar two days ago. PAMELA is a satellite experiment that studies cosmic-ray positrons and anti-protons. She has a better energy reach (by design up to 300 GeV, although the results presented so far extend only up to 100 GeV) and much better accuracy than the previous experiments hunting for cosmic anti-matter. Thanks to that, she was able to firmly establish that there is an anomaly in the positron flux above 10 GeV, confirming previous hints from HEAT and AMS.

Here are the PAMELA positron data compared with the theoretical predictions. The latter assume that the flux is dominated by the secondary production of positrons due to collisions of high-energy cosmic rays with the interstellar medium. The two lines are almost perpendicular to each other :-). In fact, the discrepancy below 10 GeV is not surprising, and is interpreted as being due to solar modulation. It turns out that the solar wind modifies the spectrum of low-energy cosmic rays, and in consequence the flux depends on solar activity, which changes over the course of the 22-year solar cycle. Above 10 GeV the situation is different, as solar modulation is believed to produce negligible effects. Even though the secondary production of positrons has large theoretical uncertainties, one expects that it decreases with energy. Such a power-law decrease has been observed in the flux of anti-protons, which may also originate from secondary production. The positron fraction, instead, significantly increases above 10 GeV.

Thus, PAMELA shows that secondary production is not the dominant source of high-energy positrons. The excess can be due to astrophysical sources; for example, young nearby pulsars have been proposed as an explanation. But what makes particle physicists so aroused is that dark matter annihilation is a plausible explanation too. It might be that PAMELA is a breakthrough in indirect dark matter searches. It is less known that there are other experiments that see some excess in the cosmic-ray flux. Most interestingly, two balloon-borne experiments called ATIC and PPB-BETS see an excess in the total electron+positron flux (they cannot distinguish the two) with a peak around 700 GeV. This adds to the EGRET gamma-ray excess at a few GeV, and to the WMAP haze - an excess of diffuse microwave background from the core of our galaxy.

A dark matter candidate that fits the PAMELA excess must have rather unexpected properties. If the observed dark matter abundance has a thermal origin, the dark matter annihilation cross section naively seems too small to explain the observed signal. As usual, theorists have magic tricks to boost the annihilation rate today. One is the so-called boost factor: if dark matter clumps, its density is locally higher than average, and then the average annihilation rate also increases with respect to the case of a uniform distribution. However, this does not save the day for the most popular dark matter candidates. For example, the MSSM neutralino typically requires a boost factor of order a few hundred, which is probably stretching the point. The latest trick is called the Sommerfeld enhancement: if the dark matter particle feels some attractive long-range force (other than electromagnetism, of course), the attraction distorts the wave function of the annihilating pair (and can even bind it), which enhances the annihilation rate at small relative velocities.
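
For an attractive Coulomb-like force with coupling $\alpha$, the enhancement takes the closed form $S = (\pi \alpha/v)/(1 - e^{-\pi \alpha/v})$, which grows like $1/v$ at small relative velocity - that is why the effect matters in the galaxy today (v ~ 10^-3) but not at freeze-out (v ~ 0.1). A quick sketch with an illustrative coupling:

  import math

  # Sommerfeld enhancement for an attractive Coulomb-like potential;
  # alpha is an illustrative dark-force coupling, v in units of c.
  def S(alpha, v):
      x = math.pi * alpha / v
      return x / (1 - math.exp(-x))

  for v in (0.1, 1e-3):   # freeze-out vs. galactic velocities
      print("v = %g: S = %.1f" % (v, S(0.01, v)))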

Another challenge for particle models is the fact that PAMELA sees no excess in the antiproton flux. This means that the dark matter particle must be hadrophobic, that is to say, it should annihilate preferentially into leptons. Again, the most popular dark matter candidates, like the MSSM neutralino, do not satisfy this criterion. However, particle models compatible with the PAMELA data do exist, for example Minimal Dark Matter (though this one is not compatible with the ATIC/PPB-BETS peak), or the recent Exciting Dark Matter.

So, it seems, we have to wait and see until the smoke clears. Certainly, a single indirect detection signal has to be taken with all due scepticism (so many have died before). Only the combined efforts of several experiments can lead to a convincing conclusion. For the moment, if somebody pointed a gun at my face and made me choose one answer, I would probably go for an astrophysical explanation. On the other hand, if the PAMELA excess is really a manifestation of dark matter, the LHC could concentrate on more interesting issues than discovering and undiscovering the MSSM. It seems that astrophysicists have at least one more year to sort this thing out by themselves.

Tuesday 21 October 2008

A Long-Expected Party

Today, soon after publishing the damage report, CERN is organizing the LHC Inauguration Ceremony. Given that the restart date is unclear (in private conversations, the estimate September 2009 appears most often), some lesser souls may feel dissonance. However, CERN is here to push the frontiers of science, and organizing an opening of a damaged accelerator is truly innovative. The current DG himself must have had some doubts, as he cautiously writes "Dear Council Delegates, I would like to thank you for your reactions to my suggestion to maintain the LHC Inauguration Ceremony on October 21 2008, as initially foreseen...". Fortunately, the diplomats vehemently supported the idea, since they had already been promised molecular cuisine.

Thus, CERN is overrun today by men in suits, normally unseen at this latitude. This must be the first time foreign diplomats have ever visited Geneva, so unprecedented security measures were taken. All CERN parking lots had to be emptied of cars, and those that remained are likely to be blown up. The roads connecting CERN to Geneva and France are now closed. The public buses that normally take this route run with a police escort, and they are not allowed to stop near CERN. It is not clear if trespassers will be shot, or only impaled.

If you're curious what's on the menu, there will be a webcast here. This time there will be no live commentary, unless something funny happens. There is a rumor that during the ceremony the current DG will give a speech and vanish.

Monday 29 September 2008

Gia's Hairy Black Holes

Now that the hope for a quick LHC start-up has literally vaporized, I have six more months to play with particle theory without worrying about experimental constraints. This is a good moment to return to the workshop on black holes that is still trickling on here at CERN. Last Friday, Gia Dvali presented his ideas concerning gravity theories with a large number of particle species. The subject is more than a year old and I have already written about it on this blog. Gia is trying to derive general features of field theory coupled to gravity, without making specific assumptions about the underlying quantum gravity theory. To this end he produces gedanken black holes (at the LHC, or anywhere in the Universe) and employs unitarity plus general properties of semi-classical black holes to obtain some interesting conclusions.

In particular, Gia argued one year ago that in a theory with N species of particles there is a bound on the fundamental scale $M_*$ where the gravitational interactions become strong. Namely, the bound reads $M_* \leq M_P/\sqrt{N}$, where $M_P \sim 10^{19}$ GeV is the Planck scale as we know it. This opens the possibility of solving the hierarchy problem by postulating $N = 10^{32}$ new particle species at the TeV scale. One can immediately see the analogy to the ADD (as in Arkani-Hamed, Dimopoulos, Dvali) large extra dimensions. In ADD, the relation between the fundamental scale and the Planck scale is $M_P^2 = M_*^2 (M_* R)^n$, where n and R are the number and the radius of the extra dimensions. This relation is in fact equivalent to $M_* = M_P/\sqrt{N}$, because in ADD $(M_* R)^n$ is just the number of graviton KK modes below the cutoff of the theory. Gia argues that the ADD solution to the hierarchy problem is just one example in a larger class: gravity must become strong well below the Planck scale whenever there is a large number of particle species, regardless of whether extra dimensions exist or not. The fact that in ADD the multitude of particle species are Kaluza-Klein modes of a higher-dimensional graviton is just a red herring.
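
The equivalence is a one-line check: with $N = (M_* R)^n$ KK modes below the cutoff, $M_P^2 = M_*^2 (M_* R)^n$ is exactly $M_* = M_P/\sqrt{N}$. In numbers (the usual ADD benchmarks):

  # Check that the ADD relation M_P^2 = M_*^2 (M_* R)^n is the same statement
  # as M_* = M_P/sqrt(N), with N = (M_* R)^n the number of graviton KK modes.
  M_pl, M_star = 1e19, 1e3          # GeV: Planck scale and a TeV-ish cutoff
  N = (M_pl / M_star)**2            # species needed: 10^32, as quoted above
  print("N = %.0e, M_P/sqrt(N) = %.0e GeV" % (N, M_pl / N**0.5))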

After the initial proposal Gia has posted several papers further developing this idea. On Friday Gia talked mostly about the consequences for black hole physics, summarized in this paper. It turns out that micro black holes in theories with a large number of species have peculiar properties that make them quite distinct from ordinary black holes in Einsteinian gravity. First of all, small black holes with sizes of order $M_*^{-1}$ are hairy. This Freudian feature can be argued as follows. Consider a collision of a particle and an anti-particle of one species at center-of-mass energies of order $M_*$. The production rate of micro black holes in that collision must also be of order $M_*$ because it is the only scale available. By unitarity, the decay into the same species must also proceed at the rate $M_*$. However, this cannot be true for the decay into the remaining N-1 species: the decay rate can be at most $\sim M_*/N$, while assuming a faster decay rate leads to a contradiction. For example, black hole exchange would lead to too fast a growth of scattering amplitudes, at odds with the assumption that the cut-off of the theory is at $M_*$. Thus, the micro black holes in a theory with N species are highly non-democratic, and they need to carry a memory of the process in which they were produced.

As black holes grow heavier and older they should start losing their hair. The scale where the Einsteinian behavior is recovered cannot however be smaller than $M_* N$. Thus, black holes in the mass range $M_P/\sqrt{N} < M_{BH} < M_P \sqrt{N}$ must be non-standard, hairy and undemocratic. The proper black hole behavior, which entails democratic decay to all available species via Hawking radiation, can be expected only for heavier black holes. Again, these properties can be readily understood in the specific example of ADD large extra dimensions as a consequence of the geometric structure of the extra dimensions (for example, the crossover scale to the Einsteinian behavior is related to the radius of the extra dimension). Gia's arguments just generalize it to any theory with a large number of particle species. It seems that some kind of "emergent geometry" and "localization" must be a feature of any consistent low scale quantum gravity theory.

Of course there are a lot of assumptions that enter this game (no black hole remnants, for example), and it is not unthinkable that the true quantum gravity theory violates some of them. Nevertheless, I find it amusing that such simple hand-and-all-body-waving arguments lead to quite profound consequences.

More details in the paper.

Saturday 20 September 2008

AdS/CFT goes cold

Last week Dam Son gave two nice talks about phenomenological applications of AdS/CFT: one about heavy ions, and the other about non-relativistic conformal field theories (CFTs). While the former application is widely discussed in pubs and blogs, the latter is a relatively new development. It seems that, after having entrenched itself in the heavy-ion territory, particle theory has launched another offensive on the unsuspecting condensed matter folk. Just yesterday I saw two new papers on the subject posted on ArXiv.

AdS/CFT as we know it relates strongly coupled gauge theories to gravity theories in one more dimension. In the original tablets received at Mount Sinai by Maldacena, it speaks about highly symmetric and all-but-realistic theories: N = 4 super Yang-Mills on the gauge theory side and 10D type IIB supergravity in the $AdS_5\times S^5$ background on the gravity side. Later, the correspondence was vulgarized to allow for phenomenological applications. In particular, some success was reported in postdicting meson spectra of low-energy QCD and explaining the small viscosity of the quark-gluon plasma. Heavy-ion collisions are a total mess, however, and one would welcome an application in a field where the experimental conditions can be carefully tuned. Condensed matter physics enjoys that privilege and, moreover, laboratory systems near a critical point are often described by a CFT. The point is that in most cases these are non-relativistic CFTs.

A commonly discussed example of a condensed matter system is the so-called fermions at unitarity (what's in the name?). This system can be experimentally realized as trapped cold atoms at a Feshbach resonance. Theoretically, it is described using a fermion field with the non-relativistic free lagrangian $i \psi^\dagger \partial_t \psi - |\nabla \psi|^2/2m$ and short-range interactions provided by the four-fermion term $c_0 (\psi^\dagger \psi)^2$. The experimental conditions can be tuned such that $c_0$ is effectively infinite. In this limit the system has the same symmetry as the free theory and, in particular, it has a scale invariance acting as $x \to \lambda x$, $t \to \lambda^2 t$. The full symmetry group also includes the non-relativistic Galilean transformations and special conformal transformations, and it is called the Schrodinger group (because it is the symmetry group of the Schrodinger equation). Most of the intuition from relativistic CFTs (scaling dimensions, primary operators) carries over to the non-relativistic case.
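
One can see where the name comes from: under $x \to \lambda x$, $t \to \lambda^2 t$, together with $\psi \to \lambda^{-d/2} \psi$, both sides of the free Schrodinger equation
$i \partial_t \psi = -\frac{\nabla^2 \psi}{2m}$
pick up the same factor of $\lambda^{-2}$, so the equation maps into itself (a quick check using the conventions above).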

The most important piece of evidence for the AdS/CFT correspondence is the matching of the symmetries on the two sides of the duality. For example, the relativistic conformal symmetry SO(2,4) of the SYM gauge theory in 4D is the same as the isometry group of the $AdS_5$ spacetime. In the case at hand we have a different symmetry group, so we need a different geometric background on the gravity side. The Schrödinger group Sch(d) in d spatial dimensions can be embedded in the conformal group SO(d+2,2). For the interesting case d = 3 this shows that one should look for a deformation of the AdS background in six space-time dimensions, one more than in the relativistic case. In a paper from April this year, Dam Son identified the background with the desired symmetry properties. It goes like this:
$ds^2 = \frac{-2 dx^+ dx^- + dx^i dx^i + dz^2}{z^2} - \frac{(dx^+)^2}{z^4}$.
The first term is the usual AdS metric; the last term is a deformation that reduces the symmetry down to the Schrödinger group Sch(d). The light-cone coordinate $x^-$ is compactified, and the quantized momentum along that coordinate is identified with the mass operator in the Schrödinger algebra.
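
As a quick consistency check (sign and normalization conventions here are mine and may differ from the paper's), the metric is invariant under the scaling
$z \to \lambda z, \quad x^i \to \lambda x^i, \quad x^+ \to \lambda^2 x^+, \quad x^- \to x^-$,
with every term scaling as $\lambda^0$; since $x^+$ plays the role of the boundary time, this is precisely the $t \to \lambda^2 t$, $x \to \lambda x$ dilatation of the Schrödinger group.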

So, the hypothesis is that fermions at unitarity have a dual description in terms of a gravity theory on that funny background. Many details of the correspondence are still unclear. One obstacle seems to be that fermions at unitarity do not have an expansion parameter analogous to the number of colors of relativistic gauge theories. A more precise formulation of the duality is clearly needed.

Friday 19 September 2008

Quench

Yesterday, CERN was buzzing with rumours that the first LHC collisions would take place during the weekend. This morning, however, there was a major quench in Sector 3-4. As you can see here, some magnets in the affected sector are now at almost 100 K. LHC-progress addicts report that pretty scary entries were appearing in the LHC logbook this morning (fire alarm, power failure, helium leaking into the tunnel), though all the records seem to have been deleted by now. Although there has been no official news so far, this problem appears to be serious (unlike all the problems reported by the media earlier this week) and may cause a long delay. It is not certain whether operation will resume before the winter shutdown. So you can relax for the moment.

Update: there is a press release explaining what happened:
During commissioning (without beam) of the final LHC sector (sector 34) at high current for operation at 5 TeV, an incident occurred at mid-day on Friday 19 September resulting in a large helium leak into the tunnel. Preliminary investigations indicate that the most likely cause of the problem was a faulty electrical connection between two magnets which probably melted at high current leading to mechanical failure(...)
The crucial information is
(...)it is already clear that the sector will have to be warmed up for repairs to take place. This implies a minimum of two months down time for the LHC operation...
In fact, warming up and cooling down a sector usually takes 3 months, so there is little hope for a beam before the winter shutdown (end of November).

Update 2: :-(

Friday 12 September 2008

What will the LHC discover

The excitement generated by the LHC kick-off last week is still in the air. I'm beginning to realize that soon we will k.n.o.w. Which means that it's the last moment for gambling and wild guessing. Here are my expectations. The probabilities were computed using all currently available data and elaborate Bayesian statistics.

Higgs boson. Probability 80%
Peter Higgs' kid is ugly and problematic; his big advantage, however, is that he does his job right. Firstly, he knows how to break electroweak symmetry in such a way that the scattering amplitudes of W and Z bosons remain unitary at high energies. Secondly, if he is not much heavier than 100 GeV, he is consistent with the stringent precision tests performed by LEP and the Tevatron. No one else can achieve both without complicated gymnastics. That's why the Higgs is the safest bet.
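
To put a number on the unitarity argument (a textbook estimate, independent of the details of any model): without the Higgs, the amplitude for scattering longitudinally polarized W bosons grows with energy,
$\mathcal{A}(W_L W_L \to W_L W_L) \sim \frac{E^2}{v^2}, \qquad v \simeq 246\,\mathrm{GeV}$,
so unitarity is lost around $E \sim \sqrt{8\pi}\, v \approx 1.2$ TeV; a not-too-heavy Higgs cuts off this growth.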

Non-SM Higgs boson. Probability 50%
The Standard Model uniquely predicts the couplings of the Higgs to all fermions and gauge bosons. From experience, these couplings are very sensitive to new physics in any form. That's why a precise measurement of the Higgs production cross section and all possible decay rates may be far more exciting than the discovery itself.

New Beyond SM Particles. Probability 50%
That's what particle physics is about, isn't it ;-) Almost any extension of the Standard Model that explains electroweak symmetry breaking predicts some particles in the TeV range. So it seems a good bet that we will see some of the junk. The question is whether we will be able to make sense of the pattern that will be revealed...

Strong Interactions. Probability 20%
Nature has repeated this scenario over and over again: interactions between fundamental constituents become strong and new collective degrees of freedom emerge. Condensed matter physicists see it every day in their laboratories. In particle physics, the theory of quarks and gluons known as QCD undergoes a transition at low energies to a confining phase where it is more adequately described by mesons and baryons. It is conceivable that some of the Standard Model particles also emerge in this manner from TeV-scale strongly interacting dynamics. The problem is that we should have already seen hints of the composite structure in low-energy precision tests, flavor physics and so on, but we see none of that. The reason why the probability for this scenario remains relatively high is our shameful ignorance of strongly interacting dynamics -- we might have easily missed something.

Dark matter. Probability 5%
All hopes lie in numerology: a stable particle with a weak-scale mass and a typical weak annihilation cross-section of order 1 picobarn would have roughly the right thermal relic density to match the observed dark matter abundance. If this is the right track, the LHC could grab the most important discovery in the history of collider physics. But we know dozens of other plausible scenarios where the dark matter particle is either too heavy or too weakly interacting to be discovered at the LHC.
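
Spelling out the numerology (the standard thermal-relic estimate, not tied to any particular model): the relic density scales inversely with the annihilation cross section,
$\Omega_{DM} h^2 \approx \frac{3\times 10^{-27}\,\mathrm{cm^3\, s^{-1}}}{\langle \sigma v \rangle}$,
and a weak-scale cross section of 1 picobarn, $\langle \sigma v \rangle \approx 3\times 10^{-26}\,\mathrm{cm^3\, s^{-1}}$, gives $\Omega_{DM} h^2 \approx 0.1$, right in the observed ballpark.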

Little Higgs and friends. Probability 1%
It is a plausible possibility that the Higgs boson is a pseudo-Goldstone boson whose mass is protected from radiative corrections by approximate global symmetries, the sort of mechanism we see at work in the pion sector of QCD. Proof-of-principle models have been constructed: Little Higgs and Gauge-Higgs unification scenarios. But they are all a bit like elephants standing on elephants...

Supersymmetry. Probability 0.1%
Supersymmetry is just around the corner. After the LHC she will simply pick another corner to hide behind. Supersymmetry will of course be seen at the LHC, just like she was seen at all previous hadron colliders. But, once the data are well understood, she will take her leave and go back into hiding, where she clearly feels more comfortable. Susy aficionados should not be worried, however. The field will flourish as a new, vast and exciting parameter space above 3 TeV opens up for exploration. The wealth of new experimental constraints from the LHC, satellite missions, and dark matter detection experiments will make the allowed-parameter-space plots colorful and sexy.

Dragons. Probability $e^{-S_{dragon}}$
This possibility was recently pointed out by Nima Arkani-Hamed. The laws of quantum mechanics allow anything to happen, though the probability may be exponentially suppressed for complicated (large entropy) objects. CERN officials maintain there is no imminent danger, since the putative LHC dragons will be microscopic (small dragons have the smallest entropy, hence the largest probability to appear in particle collisions) and anyway they will quickly suffocate in the vacuum of the beam pipe. Some researchers, however, have expressed concerns that the dragons might survive, grow, burn ATLAS, kidnap ALICE and lock her in a tower. A more comprehensive study of the potential risks is underway.

Black Holes. Probability $0.1*e^{-S_{dragon}}$
Although microscopic black holes have smaller entropy than typical dragons, the advantage of the latter is that they are consistent with the established laws of physics, whereas TeV-scale black holes are not. There are many indirect arguments against TeV scale gravity, from precision tests, through flavor physics, to cosmology. Certainly, dragons are a bit safer bet.

Wednesday 10 September 2008

Day Zero Live

8.40. (Yawn). This is the day.
8.43. Yesterday was The End of the World Party downtown in Geneva. It was quite a success, though not as decadent as some might have hoped. In the picture, CERN theorists with WAGs.
8.46. It is certainly the beginning of the end of the world as we know it. Approximately two years from now particle physics will be turned upside down. 99% of the currently fashionable particle physics models will go to trash. Maybe even 100%?
8.50. The LHC is a black hole factory, and by exactly the same token it is also a time machine. From here one can peer into the future and read tomorrow's newspaper. Here is a sample Thursday edition of an Italian newspaper (you may need to understand Italian and Italians to appreciate it):
8.59. And here is the new logo of CERN:

9.05. The auditorium is 200% filled. The webcast does not seem to work. For the moment nothing's going on, just a scary movie on the big screen.
9.15. Lyn Evans explained the plan for the day. First, they are going to inject the beam at point 2 and dump it at point 3. Then they will remove the dumping block at point 3 and try to get to point 4. And so on, hopping octant by octant. When the tour is complete they will start circulating the beam.
9.25. They are injecting the beam and it reached point 3. Applause. People are still flowing toward the Auditorium. There are guards now at the entrance to turn back the crowd and avoid a stampede. Feels like Glastonbury 2000.
9.35. It is really amazing. Thousands of people are staring at the screen; every few moments a dot appears in the middle and everybody applauds. It must be the same feeling as watching a baseball game.
9.50. The beam is at point 5 now, and CMS may soon see some splashes. The control room looks like the NASA control room: top figures staring with serious faces at blinking monitor screens, trying to give the impression that everything's under control. Others are obviously bored, since it's not allowed to play network games in the presence of TV cameras.
9.55. Meanwhile, the beam got to point 6. However, it apparently needs some more manicure, pedicure and collimation before moving further along the ring. Good progress so far.
10.08. Point 7 reached. By the way, if you're having cold feet by now, this blog may provide reassurance.
10.12. Point 8, and then point 1. The circuit is almost complete. In a moment Atlas will see the first events in its detector.
10.25. Last octant. Two dots on the screen, which means the beam has made a full clockwise circuit!!!
10.28. Well, well, it seems that the damn machine is working. That's quite unexpected.
10.40. Not much going on now. Music, champagne, interviews... If you are bored with this one, other live commentaries are here, here and here.
11.01. In 1 hour or so they will try circulating the beam counterclockwise. Since parity is not conserved in the real world, things might be quite different in that case.
11.09. The picture of the first event in Atlas (thanks to Florian). No idea what's on it :-)
11.29. The tension has clearly dropped. For the last hour there have been only interviews (can't they think of a question other than "how do you feel?") and boring speeches. They should do something to pump it up. Like, for example, a mud fight between CMS and Atlas.
12.01. Lyn Evans and Robert Aymar are holding a triumphant press conference.
12.30. Yes! After the first beam we also got the first protester, who came all the way from Germany. He seems quite nice and harmless, if a bit confused.
12.35. There's some problem with the cryogenic system of the magnets, so that the 2nd beam will be delayed.
13.55. Back after the lunch break. The magnets are cool again, the 2nd beam is being injected.
14.01. 2nd beam is at point 7, then point 6. OK, it's a bit less exciting than the first time...
14.15. Looks like they are having some problems with collimating the beam. They are stuck at point 6 until the beam gets smoothed out.
14.33. Operation resumed. They are at point 5 and passed the CMS detector. Everybody in the CMS control room felt a swoosh of wind.
15.02. Two spots on the screen! First full counterclockwise circuit. Applause, though shorter than for the first beam. It's always better to be first than second.
15.05. Robert Aymar said that it's working as smoothly as a roulette. I hope he didn't mean Russian roulette. Now they will try to get more than one circuit of the beam.
15.25. Basically, the plan for today has been accomplished. They will play a little bit more with the 2nd beam today. The plan for the near future: sustain the beam continuously, collide the two beams at 450 GeV, accelerate the beam in the LHC ring.
16.35. Champagne in the control room. So there won't be more beam today ;-)
17.05. The first beam in the LHC: press release.
18.05. The webcast is now over. The end of the world live was a complete success. Everything is going unexpectedly smoothly, so the first collisions may happen sooner than assumed. I'm going to sleep now, but the LHC is not: the work will continue tonight, and tomorrow is another working day (even though it's a holiday in Geneva). What we were so excited about today will be just boring routine tomorrow. Good night and good luck.