As a physicist, I hope that we will eventually develop a general mathematical theory of complex systems. But complex systems don’t fit into traditional statistical physics: they are not in equilibrium, and they are full of correlations and strong interactions. Complex systems can also increase their complexity spontaneously over time. (Darwin’s theory of evolution, for example, explains one way this can happen.) This increase can be thought of as an increase in information content, since the more complex a system is, the more bits of information are needed to describe it. That is what Shannon entropy measures. But information theory doesn’t tell us how to relate information to other important quantities like energy. For that we need thermodynamic entropy instead. Chris Adami has an interesting discussion on his blog about information entropy, thermodynamic entropy, and how they’re related.
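To make the "bits needed to describe it" idea concrete, here is a minimal sketch of Shannon entropy for a discrete probability distribution. The function name and the example distributions are mine, purely for illustration; the formula itself is the standard H = −Σᵢ pᵢ log₂ pᵢ:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    Terms with p = 0 are skipped, since lim p->0 of p*log2(p) is 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximally uncertain, takes a full bit to describe each toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin: more predictable, so fewer bits are needed on average.
print(shannon_entropy([0.9, 0.1]))   # ~0.469

# A certain outcome: no uncertainty, zero bits.
print(shannon_entropy([1.0]))        # 0.0
```

On this view, a more complex system corresponds to a distribution over more states, or a less sharply peaked one, and hence a larger H. What this sketch deliberately leaves out is exactly the gap noted above: nothing in it connects H to energy or temperature; that connection is the business of thermodynamic entropy.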