Kolmogorov AI Framework | Part 1
Shannon Entropy Concept

In 1948, the engineer and mathematician Claude Shannon published a paper foundational to computer science and, later, to artificial intelligence: A Mathematical Theory of Communication. This paper defines an idea central to the training of today's algorithms: information entropy.

$$ H = -\sum_{i = 1}^{n} p_i \log_2(p_i) $$

where $p_i$ is the probability of the $i$-th symbol emitted by the source. This formula quantifies how random or organized a data source is, such as a text-generating program. The higher the entropy, the more random the source; the lower the entropy, the more the data consists of recognizable patterns that let us predict the next words. ...
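To make the formula concrete, here is a minimal Python sketch that estimates the entropy of a string from its character frequencies. The function name `shannon_entropy` and the choice of a character-level probability estimate are illustrative assumptions, not something from the original text:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy of the character distribution of `text`, in bits."""
    counts = Counter(text)          # frequency of each symbol
    total = len(text)
    # H = -sum(p_i * log2(p_i)) over observed symbols
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive source has low entropy; a varied one has higher entropy.
print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: fully predictable
print(shannon_entropy("abcdefgh"))  # 3.0 bits: uniform over 8 symbols
```

The two example strings illustrate the extremes described above: a single repeated symbol is perfectly predictable (zero entropy), while a uniform distribution over 8 symbols reaches the maximum of $\log_2(8) = 3$ bits.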