The problem with dark matter at collider experiments is that it is dark, i.e. it doesn't show up in the detectors. To get around this problem we look for the production of other stuff as well as the dark matter itself. We can then tell if the dark matter is there by seeing an apparent violation of conservation of momentum; the missing momentum is carried away by the unobserved dark matter particles.
The traditional approach to this type of search is to take a complete model of new physics (such as supersymmetry) and use that to model the production process. So in SUSY, we produce gluinos or squarks, which then undergo a multi-step decay chain producing dark matter and multiple Standard Model (SM) particles. Indeed, even today the signal "jets and missing transverse momentum" is considered a characteristic SUSY search.
However, a couple of years ago an alternative, almost opposite, approach began to gain popularity.
The new idea was to consider a much simpler model, with the dark matter the only new object beyond the SM. In such a model, all[1] couplings of dark matter to the visible sector are so-called non-renormalisable operators; that is, they are suppressed by a high energy scale. Above that scale the theory stops making sense, a sign that it is incomplete. However, it is still useful to work with such a theory, because the physics below the cut-off scale is relatively independent of the physics above it. Specifically, many different high-energy models can share the same low-energy description. This means that any conclusions drawn from such simplified models will be robust and of broad applicability.
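To make this concrete, here is one such operator as an illustration (my example, not one singled out in the text): a dimension-six contact interaction between quarks $q$ and a Dirac fermion dark matter candidate $\chi$,

```latex
\mathcal{O}_V \;=\; \frac{(\bar{q}\gamma^\mu q)\,(\bar{\chi}\gamma_\mu \chi)}{\Lambda^2} ,
```

where $\Lambda$ is the suppression (cut-off) scale. If this operator arises from exchanging a heavy mediator of mass $M$ with couplings $g_q$ and $g_\chi$, then $\Lambda = M/\sqrt{g_q g_\chi}$: many different choices of $M$, $g_q$ and $g_\chi$ give the same low-energy operator, which is exactly the model-independence described above.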
The number of possible models of this type is in principle infinite. However, if we restrict ourselves to operators coupling two visible-sector particles to two dark matter particles (a good description of almost all known complete models), there are only tens of possible choices. Clearly that is few enough that we can enumerate all possibilities.
Recall that we need to produce a SM particle in addition to the dark matter. This is most easily done through initial state radiation: a particle radiated from the incoming quarks recoils against the invisible dark matter pair, so its momentum tells us something is missing. The simplest final state is the completely invisible one:
[Figure: Production of DM χ from quarks q. (Quarks are the constituents of protons.)]
[Figure: As above, but with initial state radiation. From Zhou et al.]
As an aside, some of the earliest work I did in my Ph.D., several years earlier, was on something very similar to this. I really kick myself that I didn't think of it then, or get involved when the first few papers in this area came out.
This brings me to a paper from a couple of weeks ago that is something of a summary of the work in this area. By that I don't mean that it is a review, but rather that the authors combined the results of several different searches to place the most stringent limits on such simplified dark matter models. What makes this particularly easy is that all the different searches are sufficiently distinct that they are statistically independent. This means that the different limits can be combined using fairly elementary statistics instead of the full analysis tools of the LHC experimental collaborations. (Note that Zhou et al. are theorists.)
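As a toy illustration of why independence makes the combination easy (my own sketch with invented numbers, not the authors' actual statistical procedure): for statistically independent counting searches the log-likelihoods simply add, and an approximate 95% confidence-level upper limit on the signal strength μ can be read off a likelihood-ratio scan.

```python
import math

# Each search is (observed events, expected background, expected signal
# at signal strength mu = 1). All numbers below are invented.

def poisson_logpmf(n, lam):
    """Log of the Poisson probability P(n | lam)."""
    return n * math.log(lam) - lam - math.lgamma(n + 1)

def total_loglike(mu, searches):
    # Statistically independent searches: log-likelihoods just add.
    return sum(poisson_logpmf(n, b + mu * s) for n, b, s in searches)

def upper_limit(searches, mu_max=20.0, step=0.001):
    """Approximate 95% CL upper limit on mu via a likelihood-ratio scan.

    Uses the asymptotic result that -2*ln(L(mu)/L(mu_hat)) is roughly
    chi-squared with one degree of freedom (95% quantile = 3.84).
    """
    mus = [i * step for i in range(int(mu_max / step) + 1)]
    lls = [total_loglike(mu, searches) for mu in mus]
    i_hat = max(range(len(lls)), key=lls.__getitem__)  # best-fit mu >= 0
    for i in range(i_hat, len(mus)):
        if -2.0 * (lls[i] - lls[i_hat]) > 3.84:
            return mus[i]
    return mu_max

monojet = (150, 150.0, 20.0)   # hypothetical jet + missing momentum search
monophoton = (50, 52.0, 8.0)   # hypothetical photon + missing momentum search

print(upper_limit([monojet]))              # limit from one search alone
print(upper_limit([monojet, monophoton]))  # combined limit is tighter
```

The real analyses profile over systematic uncertainties and use proper CLs machinery, but the core point survives in this sketch: because the channels are independent, combining them is just adding log-likelihoods, and the combined limit is tighter than either search alone.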
The increase in sensitivity from combining searches depends on the model, but is typically tens of percent. In most cases one particular signal dominates, which is why the gain is not larger. Still, it is an improvement, and an important one given how strong those limits already are. Consider the following results from the paper:
[1] This is not actually true. There are three possible exceptions: the so-called Higgs, vector and neutrino portals. However, none of them is acceptable for such simple models, as they would lead to too much dark matter in the present Universe.↩