The global minima of the communicative energy of natural communication systems
Until recently, models of communication have explicitly or implicitly assumed that the goal of a communication system is simply to maximize the information transfer between signals and 'meanings'. More recently, it has been argued that a natural communication system must not only maximize this quantity but also minimize the entropy of signals, a measure of the cognitive cost of using a word. The interplay between these two factors, i.e. the maximization of information transfer and the minimization of signal entropy, has previously been addressed using a Monte Carlo minimization procedure at zero temperature. Here we derive analytically the globally optimal communication systems that result from the interaction between these two factors. We discuss the implications of our results for previous studies within this framework. In particular, we prove that the emergence of Zipf's law via a zero-temperature Monte Carlo technique in previous studies indicates that the system had not reached the global optimum.
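The combined objective described above is commonly written in this framework as a communicative energy of the form Ω(λ) = -λ I(S,R) + (1-λ) H(S), where I(S,R) is the mutual information between signals and meanings, H(S) is the signal entropy, and λ ∈ [0,1] weights the two pressures. The sketch below is only an illustration of how such an energy can be evaluated and minimized with a zero-temperature Monte Carlo procedure over a binary signal-meaning matrix; the normalization (meanings with at least one link treated as equiprobable), the parameter values, and the function names are assumptions made for the example, not the exact setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def energy(A, lam):
    """Communicative energy Omega(lambda) = -lambda * I(S,R) + (1 - lambda) * H(S)
    for a binary signal-meaning matrix A (n signals x m meanings).
    Assumption: meanings with at least one link are equiprobable;
    fully unlinked meanings carry zero probability."""
    mu = A.sum(axis=0)                      # number of signals linked to each meaning
    linked = mu > 0
    m_linked = linked.sum()
    if m_linked == 0:
        return 0.0                          # degenerate case: no links at all

    # Joint distribution p(s_i, r_j) = a_ij / (m_linked * mu_j) on linked meanings.
    p_joint = np.zeros_like(A, dtype=float)
    p_joint[:, linked] = A[:, linked] / (m_linked * mu[linked])

    def H(p):
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    p_s = p_joint.sum(axis=1)               # signal probabilities p(s_i)
    p_r = p_joint.sum(axis=0)               # meaning probabilities p(r_j)
    I_SR = H(p_s) + H(p_r) - H(p_joint.ravel())   # mutual information I(S,R)
    return -lam * I_SR + (1 - lam) * H(p_s)


def zero_temperature_mc(n=20, m=20, lam=0.5, steps=20000):
    """Zero-temperature Monte Carlo minimization: flip one random link per step
    and keep the flip only if the energy does not increase."""
    A = (rng.random((n, m)) < 0.5).astype(int)
    E = energy(A, lam)
    for _ in range(steps):
        i, j = rng.integers(n), rng.integers(m)
        A[i, j] ^= 1                        # tentative flip of one signal-meaning link
        E_new = energy(A, lam)
        if E_new <= E:
            E = E_new                       # accept the move
        else:
            A[i, j] ^= 1                    # reject: undo the flip
    return A, E


A_final, E_final = zero_temperature_mc()
print("final energy:", E_final)
```

Because the procedure only ever accepts moves that do not raise the energy, it can stall in a local minimum of Ω(λ); this is exactly the behaviour the abstract refers to when noting that Zipf's-law configurations obtained this way need not coincide with the analytically derived global optima.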