chore(i18n,learn): processed translations

This commit is contained in:
Crowdin Bot
2021-02-06 04:42:36 +00:00
committed by Mrugesh Mohapatra
parent 15047f2d90
commit e5c44a3ae5
3274 changed files with 172122 additions and 14164 deletions


@@ -1,62 +1,60 @@
---
id: 599d15309e88c813a40baf58
title: Entropy
challengeType: 5
videoUrl: ''
forumTopicId: 302254
dashedName: entropy
---
# --description--
Task:

Calculate the Shannon entropy $H$ of a given input string.

Given the discrete random variable $X$ that is a string of $N$ "symbols" (total characters) consisting of $n$ different characters (n=2 for binary), the Shannon entropy of $X$ in bits/symbol is:
$H_2(X) = -\\sum\_{i=1}^n \\frac{count_i}{N} \\log_2 \\left(\\frac{count_i}{N}\\right)$
where $count_i$ is the count of character $n_i$.
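The formula above translates directly into code: tally each character's count, then accumulate $-\frac{count_i}{N} \log_2 \frac{count_i}{N}$ over the distinct characters. A minimal sketch of one possible approach (not an official solution):

```js
// Shannon entropy in bits/symbol: count each character's frequency,
// then sum -p * log2(p) over the distinct characters.
function entropy(str) {
  const counts = {};
  for (const ch of str) {
    counts[ch] = (counts[ch] || 0) + 1;
  }
  return Object.values(counts).reduce((sum, count) => {
    const p = count / str.length; // probability of this character
    return sum - p * Math.log2(p);
  }, 0);
}
```

For example, `entropy('1223334444')` works over the four probabilities 0.1, 0.2, 0.3, and 0.4, giving roughly 1.8464 bits/symbol.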
# --hints--
`entropy` should be a function.
```js
assert(typeof entropy === 'function');
```
`entropy("0")` should return `0`
```js
assert.equal(entropy('0'), 0);
```
`entropy("01")` should return `1`
```js
assert.equal(entropy('01'), 1);
```
`entropy("0123")` should return `2`
```js
assert.equal(entropy('0123'), 2);
```
`entropy("01234567")` should return `3`
```js
assert.equal(entropy('01234567'), 3);
```
`entropy("0123456789abcdef")` should return `4`
```js
assert.equal(entropy('0123456789abcdef'), 4);
```
`entropy("1223334444")` should return `1.8464393446710154`
```js
assert.equal(entropy('1223334444'), 1.8464393446710154);
```