Entropy, Complexity and Spatial Information

Michael Batty, Robin Morphet, Paolo Masucci, and Kiril Stanilov

We pose the central problem of defining a measure of complexity for spatial systems in general and for city systems in particular. The measures we adopt are based on Shannon's (1948) definition of information. We introduce this measure and argue that increasing information is equivalent to increasing complexity. For spatial distributions, we show that this involves a trade-off between the density of the distribution and the number of events that characterize it: as cities get bigger and are characterized by more events – more places or locations – information increases, all other things being equal. But sometimes the distribution changes at a faster rate than the number of events, and information can therefore decrease even as a city grows.
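The trade-off between the number of events and the shape of the distribution can be illustrated numerically with Shannon's entropy. The sketch below (an illustrative assumption, not from the paper; the probabilities are invented) shows that a uniform distribution over more places always has higher entropy, but a city that doubles its number of places while concentrating activity in one of them can end up with less information than before:

```python
import math

def shannon_entropy(p):
    """Shannon information H = -sum p_i log p_i (natural log) of a distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Uniform distribution over n places: H = log n, so information
# grows with the number of events, all else being equal.
uniform_4 = [0.25] * 4    # H = log 4 ~ 1.386
uniform_8 = [0.125] * 8   # H = log 8 ~ 2.079

# Hypothetical city that grows from 4 to 8 places but concentrates
# almost all activity in one location: entropy falls below log 4,
# so information decreases even though the city has more events.
concentrated_8 = [0.93] + [0.01] * 7

print(shannon_entropy(uniform_4))       # ~ 1.386
print(shannon_entropy(uniform_8))       # ~ 2.079
print(shannon_entropy(concentrated_8))  # ~ 0.390, lower despite more places
```

The concentrated case shows why the distribution's density matters as much as the event count: the entropy gain from doubling the number of places is more than offset by the loss of evenness.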