For a long time, researchers across disciplines have avoided the universal measures of algorithmic information theory (beyond the traditional Shannon entropy, which is computable but limited): measures such as Kolmogorov-Chaitin complexity, Solomonoff-Levin universal induction, and Bennett's logical depth, among other related measures, on the grounds that they are uncomputable.
These measures are, however, upper or lower semi-computable and can therefore be approximated from above or below. For example, lossless compression algorithms yield upper bounds on Kolmogorov-Chaitin complexity (a string that compresses is certified non-random), and such approximations have proven successful in many areas. Nevertheless, compression algorithms fail to compress short strings and so are not an option for approximating their Kolmogorov complexity. This online calculator provides a means of approximating the complexity of short binary strings, for which no other method has existed until now, by exploiting the formal connections among these measures and combining several concepts and results from theoretical computer science.
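The two compression claims above can be sketched concretely. The snippet below is a minimal illustration in Python, using the standard-library `zlib` compressor as one possible stand-in for a general lossless compressor (the helper name `compressed_length` is our own, not part of the calculator): compressed length upper-bounds Kolmogorov complexity well enough to separate a regular long string from a random one, but for short strings the compressor's fixed format overhead swamps any structure.

```python
import os
import zlib

def compressed_length(s: bytes) -> int:
    """Length in bytes of the zlib-compressed string: up to format
    overhead, an upper bound on the string's Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

# Long strings: compression cleanly separates regular from random.
regular = b"01" * 500        # highly regular, 1000 bytes
random_ = os.urandom(1000)   # incompressible with overwhelming probability

# The regular string compresses to a few bytes; the random one does not
# compress at all (its "compressed" form is slightly longer than 1000 bytes).
print(compressed_length(regular), compressed_length(random_))

# Short strings: the header and checksum overhead dominates, so the
# compressed lengths are both larger than the strings themselves and
# nearly identical, revealing nothing about their relative complexity.
short_regular = b"0101"
short_random = bytes([0, 255, 37, 129])
print(compressed_length(short_regular), compressed_length(short_random))
```

This is exactly the gap the calculator addresses: for strings of a few bits, compressed size is dominated by constant overhead, so a different route to approximating complexity is needed.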
Via Complexity Digest