Before we delve into the world of quantum information, we will spend some time understanding what is meant by classical information theory. While this is in itself a vast topic, and it is impossible to do it full justice within the scope of the present course, we will limit ourselves to some basic definitions and properties of information and its manipulation. In particular, our treatment of information theory will be limited to Shannon entropy and its quantum analogue.
i) Shannon entropy
The cornerstone of classical information theory is Shannon entropy, which captures
the disorder or uncertainty in a random variable $X$. Alternatively, this entropy also
captures the amount of information contained in the variable $X$. This complementary
view connects the uncertainty of not knowing a quantity with the information
gained when we learn the value of $X$. In other words, Shannon entropy expresses a duality between
how much we know (“information”) and how much we don’t know (“uncertainty”).
If $X$ takes values $\{x_1, x_2, \ldots, x_n\}$ with probabilities $\{p_1, p_2, \ldots, p_n\}$, the Shannon entropy is then defined as:
$$H(X) = -\sum_i p_i \log_2 p_i .$$
This highlights two important points about information: i) its amount depends not on
the actual values of the variable but only on their probabilities; it does not matter if we
interchange $p_1$ with $p_2$, and ii) the log function ensures that information is additive for
two independent events, i.e., $f(pq) = f(p) + f(q)$. The summation over the
probabilities simply captures the average information over the whole distribution.
We take the log to base 2, to express classical information
in terms of bits. Also, we adopt the convention $0 \log_2 0 = 0$, which means that $p = 0$ events are
not included in the average.
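The convention $0 \log_2 0 = 0$ and the two properties above are easy to verify numerically. Below is a minimal Python sketch (the function name shannon_entropy is our own choice, not standard notation) that computes $H(X)$ from a list of probabilities and checks permutation invariance and additivity for independent events.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i log2 p_i, in bits.

    Terms with p = 0 are skipped, implementing the convention
    0 log2 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The entropy depends only on the probabilities, not on which value
# carries which probability: interchanging p_1 and p_2 changes nothing.
print(shannon_entropy([0.5, 0.25, 0.25]))   # 1.5
print(shannon_entropy([0.25, 0.5, 0.25]))   # 1.5

# Additivity for independent events: the entropy of the joint
# distribution {p_i * q_j} equals H({p_i}) + H({q_j}).
p, q = [0.5, 0.5], [0.25, 0.75]
joint = [pi * qj for pi in p for qj in q]
print(shannon_entropy(joint), shannon_entropy(p) + shannon_entropy(q))
```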
Operationally, Shannon entropy quantifies the resources needed to store information.
Suppose you have a source that generates a string of values $x_1, x_2, \ldots, x_n$ drawn from
independent, identically distributed random variables $X_1, X_2, \ldots, X_n$. The question now is: what is the minimum
resource (number of bits) required to store or communicate this information? The
answer is enshrined in Shannon’s noiseless coding theorem and is equal to $H(X)$
bits per symbol, where $H(X)$ is the Shannon entropy.
Example: Let us consider a farm that stocks these food items: bread, eggs, chicken
and fish, with stock sizes proportional to 1/2, 1/4, 1/8, and 1/8.
Naively, we need two bits to store this information: 00, 01, 10, 11, i.e., bread (00), eggs
(01), chicken (10), fish (11). Each message then requires two bits.
But these items do not all have the same probability, so we can compress the data:
say, bread (0), eggs (10), chicken (110), fish (111).
Average length: $1/2 \times 1 + 1/4 \times 2 + 1/8 \times 3 + 1/8 \times 3 = 7/4$
Shannon entropy: $-\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{4}\log_2\frac{1}{4} - \frac{1}{8}\log_2\frac{1}{8} - \frac{1}{8}\log_2\frac{1}{8} = 7/4$
Therefore, on average each message requires $H(X) = 7/4 < 2$ bits.
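As a sanity check, here is a small Python sketch (reusing the shannon_entropy helper from above; the code table is exactly the one in the example) that encodes a long random stream of items with this prefix code and compares the empirical bits-per-symbol with $H(X) = 7/4$.

```python
import random

# Prefix code and source probabilities from the example above.
code = {"bread": "0", "eggs": "10", "chicken": "110", "fish": "111"}
probs = {"bread": 0.5, "eggs": 0.25, "chicken": 0.125, "fish": 0.125}

# Draw a long message from the source and encode it.
items = list(probs)
message = random.choices(items, weights=[probs[i] for i in items], k=100_000)
encoded = "".join(code[item] for item in message)

# Empirical bits per symbol should be close to H(X) = 7/4 = 1.75.
print(len(encoded) / len(message))

# Because the code is prefix-free, decoding is unambiguous: read bits
# until they match a codeword, emit the item, and continue.
decode = {v: k for k, v in code.items()}
decoded, buf = [], ""
for bit in encoded:
    buf += bit
    if buf in decode:
        decoded.append(decode[buf])
        buf = ""
assert decoded == message
```

Note the design of the code: the most probable item gets the shortest codeword, and no codeword is a prefix of another, which is what makes the variable-length stream decodable without separators.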
Suppose you have an $n$-bit message: 000111…..110, where the bit 0 occurs with
probability $p$ and the bit 1 with probability $1 - p$. How many bits are required to
represent this message?
In the limit of large $n$, the minimum resource needed to express a message containing
$n$ bits is $nH(p)$, which equals $n$ only for $p = 1/2$.
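To see how much compression this allows, the following sketch (again reusing the shannon_entropy helper defined earlier) tabulates the binary entropy $H(p) = -p \log_2 p - (1-p)\log_2(1-p)$ for a few values of $p$; only at $p = 1/2$ does each biased bit cost a full bit.

```python
# Binary entropy H(p): minimum bits per symbol for a biased bit.
for p in [0.5, 0.25, 0.1, 0.01]:
    h = shannon_entropy([p, 1 - p])
    print(f"p = {p:<5} H(p) = {h:.3f}  ->  ~{1000 * h:.0f} bits per 1000 bits")
```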
The questions above concern redundancy: they ask how fewer resources can
be used, on average, to carry a message. Shannon connected these questions to the
idea of entropy, which then forms the cornerstone of information theory. Much of the major
work in classical information theory is about the manipulation of Shannon entropy.
Exercise: Show that you cannot do better than this without losing distinguishability.