Information and Entropy

As a physicist, I hope that we will eventually have a general mathematical theory of complex systems. But complex systems don’t fit into traditional statistical physics: they are not in equilibrium, and they are full of correlations and strong interactions. Complex systems can also increase their complexity spontaneously over time (Darwin’s theory of evolution explains one way this can happen). This increase can be thought of as an increase in information content, since the more complex a system is, the more bits of information are needed to describe it; that quantity is measured by Shannon entropy. But information theory doesn’t tell us how to relate this to other important quantities, such as energy. For that we need thermodynamic entropy instead. Chris Adami has an interesting discussion on his blog of information entropy, thermodynamic entropy, and how they’re related.
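The "bits needed to describe it" idea can be made concrete with a minimal sketch of the Shannon entropy formula, H = −Σ pᵢ log₂ pᵢ. (This snippet is my illustration, not something from the post; the function name is made up.)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so fewer bits are needed on average.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

The general point survives the toy example: the more possible states a system can be in, and the more evenly spread its probabilities, the more bits it takes to pin down which state it is actually in.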

