by Panos Charitos. Published: 28 March 2014

Constantino Tsallis introduced what is now known as Tsallis entropy and Tsallis statistics, also called q-statistics, in his 1988 paper "Possible generalization of Boltzmann–Gibbs statistics", published in the Journal of Statistical Physics. Tsallis statistics generalizes the Boltzmann–Gibbs entropy by raising the microstate probabilities to a power q, with the index q introducing a bias in the weight given to the probabilities of microscopic events. Given the increasing importance of the Tsallis distribution in heavy-ion physics, Constantino Tsallis recently visited CERN and delivered a lecture during a Heavy-Ion Forum. We met him and discussed the importance of q-statistics and its possible interpretations.
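For reference, the entropy introduced in that paper depends on the single index q and recovers the Boltzmann–Gibbs entropy in the limit q → 1:

```latex
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i = S_{\mathrm{BG}}.
```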

Is this the first time you have visited CERN?

I came to CERN at the invitation of the organizers of the Heavy-Ion Forum, and I must say it has been a very interesting visit. I would like to take this opportunity to thank Yiota Foka and Urs Wiedemann for organizing this wonderful workshop. During my stay I had the chance to visit the ALICE cavern and see the detectors and the electronics that were built for these experiments.

Has there been growing interest in the q-distribution over the last few years?

In recent years there has been increasing use by the LHC experiments of q-statistics, and in particular of the distribution associated with a stationary state within q-statistics. It seems to describe very well the transverse-momentum distributions of all the different types of hadrons. All four LHC experiments have published such distributions, which are well fitted by the q-exponential function. The resulting value of q is around 1.15, a clear departure from q = 1, which corresponds to the Boltzmann–Gibbs distribution.
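As a rough illustration of such a fit (a minimal sketch, not any experiment's actual analysis code; the parameter names A, q, T and the synthetic data are assumptions made here for demonstration), one can fit the q-exponential form to a steeply falling spectrum in log space:

```python
# Minimal sketch of fitting a q-exponential to a pT spectrum.
# The parametrization A * [1 + (q-1) pT/T]^(-1/(q-1)) is a common
# choice; the data below are synthetic, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def log_q_exponential(pt, log_A, q, T):
    """log of A * [1 + (q-1) pT/T]^(-1/(q-1)); fitting in log space
    keeps the steeply falling spectrum numerically well behaved."""
    return log_A - np.log1p((q - 1.0) * pt / T) / (q - 1.0)

pt = np.linspace(0.2, 10.0, 60)                  # GeV/c
rng = np.random.default_rng(0)
log_data = (log_q_exponential(pt, np.log(1e3), 1.15, 0.12)
            + rng.normal(0.0, 0.05, pt.size))    # generated with q = 1.15

popt, _ = curve_fit(log_q_exponential, pt, log_data,
                    p0=(np.log(1e3), 1.10, 0.10))
print("fitted log A, q, T:", popt)               # q comes out near 1.15
```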

This result means that the stationary states of the particles before hadronization are not in thermal equilibrium. Nevertheless, the distribution is very robust and practically the same for different hadrons, spanning a range of different energies.

Perhaps one of the most impressive results, published a few months ago, is the measurement of the p_T distribution over a logarithmic range of fourteen decades. The same q-exponential expression, with q = 1.15, fits the data over this full range. A theory that fits a couple of decades is already very interesting, but fitting such a large range with a single distribution is surprisingly rare.
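One way to see how a single expression can hold over so many decades is that the q-exponential interpolates between an ordinary exponential at low p_T and a pure power law at high p_T (for q > 1):

```latex
f(p_T) \propto \left[1 + (q-1)\,\frac{p_T}{T}\right]^{-\frac{1}{q-1}}
\;\longrightarrow\; p_T^{-\frac{1}{q-1}}
\quad \text{for } p_T \gg T/(q-1),
```

so for q = 1.15 the tail falls off roughly as a power law with exponent 1/(q-1) ≈ 6.7.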

Think, for example, of Einstein's and Newton's expressions for the energy. When you compare them for protons at the highest cosmic-ray energies, using measurements from the Auger project, Einstein's expression is verified over eleven decades. This gives you an idea of how difficult it is to measure a distribution experimentally over fourteen decades, as is the case for ALICE, ATLAS and CMS, which are all fitted well with a value of q = 1.15.

What is the reason for that?

There is no detailed explanation yet, but the fact itself gives an idea of the physical scenario within which one should try to establish a specific model. The robustness of this distribution, which is not in thermal equilibrium, shows that it is a long-lasting stationary state, or a quasi-stationary state. This in turn indicates that there are very strong correlations between the elements of the system.

If only weak correlations were present, as between the molecules of the air in this room, then you would expect these states to be in thermal equilibrium and hence the value of q to equal one. However, this is not the case, which points to strong correlations with a very strong hierarchical structure. These could be long-memory effects at a very elementary level, or important long-distance correlations, or both. This provides a framework for thinking about the possible detailed mechanism.

Is this mechanism within the framework of QCD?

There are reasons to believe that the value of q can be calculated within QCD. This calculation is very difficult; nobody has done it yet, and I am not sure whether it is tractable. In principle it should be obtainable within QCD or another strongly coupled theory, but this refers only to the value of q. When we turn our attention to the entire distribution, we also need to take into account other parameters, such as the effective temperature.

What can we learn from new data?

I think it would be interesting to check whether this q-statistical framework could describe not only the transverse-momentum distribution but also the rapidity distribution. There are a few indications in the literature that the rapidity distribution is also described by q-statistics, but it is worth exploring whether this is indeed the case and what the value of q would be. Personally, I think that q will have a different value, though one still related to that of the transverse component.

Was it expected that the value of q would be the same for different hadrons?

In the beginning it was a big surprise. An interesting effect is how q changes with the collision energy. Below TeV energies the value of q increases very slowly, and we can hypothetically assume that as the energy increases to infinity the value of q will reach an asymptotic value close to 1.21. That value coincides with the value of q obtained when we fit cosmic-ray fluxes with q-statistics. In those observations q is very close to the very high-energy limit of what is currently observed at the LHC. This connection between what is observed at the LHC and what is measured in cosmic rays is fascinating.

Do you have a scenario for this?

There is a mathematical feature that is very suggestive. If you sum random variables that are independent, or nearly independent, then under certain general conditions the attractor of the sum is a Gaussian. This is the so-called Central Limit Theorem (CLT), one of the most important theorems in probability theory.
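The convergence is easy to see numerically. The following sketch (purely illustrative, not tied to any LHC analysis) shows that standardized sums of independent uniform variables approach a Gaussian as the number of summed terms grows:

```python
# Numerical illustration of the CLT: the excess kurtosis of a
# standardized sum of independent uniform variables tends to 0
# (the Gaussian value) as the number of summed terms n grows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
for n in (1, 2, 10, 100):
    sums = rng.uniform(-1.0, 1.0, size=(100_000, n)).sum(axis=1)
    z = (sums - sums.mean()) / sums.std()
    # Excess kurtosis is -1.2 for a single uniform variable and
    # shrinks toward 0 roughly like -1.2/n.
    print(f"n = {n:3d}   excess kurtosis: {stats.kurtosis(z):+.3f}")
```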

As a consequence of this theorem, we find many Gaussians in nature. Under some circumstances, when the variance of those random variables is not finite, the attractor is not a Gaussian but a Lévy distribution. Both the Gaussian and Lévy distributions are related to independent random variables. However, if the variables are strongly correlated, the attractor is a q-Gaussian, which also has a tail described by a power law, but whose central and intermediate regions are very different from those of the Lévy distribution.
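For concreteness, the q-Gaussian generalizes the ordinary Gaussian through the same index q; for q > 1 its tail is a power law, and the Gaussian is recovered as q → 1:

```latex
G_q(x) \propto \left[1 - (1-q)\,\beta x^2\right]_{+}^{\frac{1}{1-q}},
\qquad
G_q(x) \sim |x|^{-\frac{2}{q-1}} \ \ (q > 1,\ |x| \to \infty),
\qquad
\lim_{q \to 1} G_q(x) = e^{-\beta x^2}.
```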

q-Gaussian and Lévy distributions have different bodies, although both end with a power law. The CLT does not tell you merely that the attractor ends like a Gaussian, but that it is a Gaussian everywhere. The generalized central limit theorems show that the attractors are q-Gaussians and that, like Gaussians, they are easily found in nature. The difference is that Gaussians arise when the random variables are independent, while q-Gaussians arise when the random variables are strongly correlated in a specific way, related to some kind of hierarchical structure.

q-Gaussians and q-exponentials can be found in the solar wind, in cold atoms, in granular matter, and also in high-energy collisions at CERN. The ubiquity of these distributions across so many different systems is consistent with this central-limit picture, and the observations from the LHC experiments fit quite well with this scenario.

What do you mean by referring to a hierarchical structure?

I will try to answer your question through an example. Suppose that you have a system of five hundred particles and you know the probabilities of the different states of those particles. Suppose your detector can see only 499 of those 500, while those 499 still interact with all 500 particles. Using probability theory you can trace over the variable that you cannot see and calculate the marginal probability: you sum over the unseen random variable and obtain a result that you can compare with the experimental result from your detector.
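A toy version of this tracing-over step (purely illustrative, with made-up numbers) looks like this:

```python
# Toy illustration of marginalizing over an unobserved particle:
# given a joint distribution over three binary degrees of freedom,
# summing over the hidden one yields the marginal distribution
# that a detector blind to that variable would measure.
import numpy as np

rng = np.random.default_rng(2)
joint = rng.random((2, 2, 2))   # unnormalized joint probabilities
joint /= joint.sum()            # normalize to a proper distribution

marginal = joint.sum(axis=2)    # trace over the variable we cannot see
print(marginal)                 # what the detector effectively sees
print(marginal.sum())           # still normalized (sums to 1)
```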

Now suppose that you have another system of 499 particles, so that your detector sees them all. If the distribution that appears in this system has the same form as the marginal distribution of the system of 500 particles, where one particle has been traced out, you have a kind of hierarchical structure. This is what happens in renormalization-group approaches, where you eliminate degrees of freedom and the shape remains essentially the same. This is the type of thing I would expect to happen in these experiments at CERN.

At the workshop there was an interesting discussion related to the name of these distributions: several collaborations and other facilities honor me by calling them Tsallis statistics, while I simply call them q-exponentials or q-distributions. What is important is that you should not confuse the q-distribution with the Lévy distribution, and you should avoid using the term "Lévy-Tsallis", because such a function does not exist.