Complexity is a measure of how much information a system "contains" and of how that information is structured. One might simply sum the Shannon entropies of the individual variables and call the result the total information in the system. However, because variables can be correlated, this sum overstates the information the system actually holds, and the excess reflects structure. Structure means the system can "do more" and, potentially, perform new functions. Structure is present everywhere in Nature.
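One common way to quantify this excess is the gap between the summed marginal entropies and the joint entropy (sometimes called the total correlation). A minimal sketch in Python, using a hypothetical toy pair of perfectly correlated bits (the variable names and data are illustrative, not from the text):

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical toy system: y simply copies x, so the two variables
# are maximally correlated.
x = [0, 1, 0, 1, 0, 1, 0, 1]
y = list(x)

sum_of_marginals = entropy(x) + entropy(y)   # 1 bit + 1 bit = 2 bits
joint = entropy(list(zip(x, y)))             # only 1 bit: y adds nothing new
structure = sum_of_marginals - joint         # total correlation = 1 bit
print(sum_of_marginals, joint, structure)    # → 2.0 1.0 1.0
```

When the variables are independent, the joint entropy equals the sum of the marginals and the gap vanishes; any correlation makes the gap positive, matching the intuition that correlated variables "give rise to structure."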