Calculate the Shannon entropy H of a given input string.
Given a discrete random variable $X$ that is a string of $N$ "symbols" (total characters) consisting of $n$ different characters ($n = 2$ for binary), the Shannon entropy of $X$ in bits per symbol is:
$H_2(X) = -\sum_{i=1}^n \frac{count_i}{N} \log_2 \left(\frac{count_i}{N}\right)$
where $count_i$ is the number of occurrences of the $i$-th distinct character.
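One possible sketch of a solution, directly following the formula above: tally each character's occurrences, then sum $-\frac{count_i}{N}\log_2\left(\frac{count_i}{N}\right)$ over the distinct characters. The function name `entropy` matches the tests below; the implementation details are one choice among many.

```javascript
// Sketch: Shannon entropy in bits per symbol.
function entropy(str) {
  // Count occurrences of each distinct character.
  const counts = {};
  for (const ch of str) counts[ch] = (counts[ch] || 0) + 1;

  const N = str.length;
  // Sum -p * log2(p) over the distinct characters, where p = count / N.
  return Object.values(counts).reduce(
    (sum, count) => sum - (count / N) * Math.log2(count / N),
    0
  );
}

console.log(entropy("1223334444")); // 1.8464393446710154
```

For example, in `"1223334444"` the probabilities are 0.1, 0.2, 0.3, and 0.4, giving $H_2 \approx 1.846$ bits per symbol.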
```yml
tests:
  - text: entropy is a function.
    testString: 'assert(typeof entropy === "function", "entropy is a function.");'
  - text: entropy("0") should return 0
    testString: 'assert.equal(entropy("0"), 0, "entropy(\"0\") should return 0");'
  - text: entropy("01") should return 1
    testString: 'assert.equal(entropy("01"), 1, "entropy(\"01\") should return 1");'
  - text: entropy("0123") should return 2
    testString: 'assert.equal(entropy("0123"), 2, "entropy(\"0123\") should return 2");'
  - text: entropy("01234567") should return 3
    testString: 'assert.equal(entropy("01234567"), 3, "entropy(\"01234567\") should return 3");'
  - text: entropy("0123456789abcdef") should return 4
    testString: 'assert.equal(entropy("0123456789abcdef"), 4, "entropy(\"0123456789abcdef\") should return 4");'
  - text: entropy("1223334444") should return 1.8464393446710154
    testString: 'assert.equal(entropy("1223334444"), 1.8464393446710154, "entropy(\"1223334444\") should return 1.8464393446710154");'
```