Entropy

January 22, 2016

The Shannon entropy of a file is a measure of the information content of the file; higher entropy implies more information. Shannon entropy is computed as H = -sum(p_i * log2(p_i)), where p_i is the relative frequency of each distinct symbol i in the input (the count of symbol i divided by the total number of symbols). For example, a file consisting of a single repeated byte has entropy 0, while a file with equal counts of two distinct bytes has entropy 1 bit per symbol. Shannon entropy is just one facet of the study of information theory, which is fundamental to computer science.

Your task is to compute the Shannon entropy of a file. When you are finished, you are welcome to read or run a suggested solution, or to post your own solution or discuss the exercise in the comments below.
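If you want a concrete starting point before looking at the suggested solution, here is a minimal sketch in Python. It is one way to do it, not the solution linked from this exercise, and it assumes each byte of the file is a symbol (the exercise does not specify the symbol alphabet):

import math
import sys
from collections import Counter

def shannon_entropy(path):
    """Compute the Shannon entropy of a file, in bits per symbol,
    treating each byte as a symbol."""
    with open(path, "rb") as f:
        data = f.read()
    if not data:
        return 0.0  # an empty file carries no information
    total = len(data)
    counts = Counter(data)  # count of each distinct byte value
    # H = -sum(p_i * log2(p_i)) over the distinct symbols;
    # symbols with count 0 never appear in counts, so 0*log2(0) is avoided
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

if __name__ == "__main__":
    print(shannon_entropy(sys.argv[1]))

As a sanity check, a file of identical bytes yields an entropy of 0, while uniformly random bytes approach the maximum of 8 bits per byte.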

