Before we delve into the world of quantum information, we will spend some time understanding what we mean by classical information theory. While this is in itself a vast topic, and it is impossible to do it full justice within the scope of our present course, we will limit ourselves to some basic definitions and properties of information and its manipulation. Our treatment of information theory will be limited to Shannon entropy and its quantum analogue.
i) Shannon entropy
The cornerstone of classical information theory is Shannon entropy, which captures the disorder or uncertainty in a random variable X. Equivalently, the entropy captures the amount of information contained in the variable X. This complementary view connects the uncertainty we have before learning a quantity with the information we gain once we learn it. In other words, Shannon entropy expresses a duality between how much we know (“information”) and how much we do not know (“uncertainty”).
If $X$ takes values $\{x_1, x_2, \dots, x_n\}$ with probabilities $\{p_1, p_2, \dots, p_n\}$, the Shannon entropy is then defined as:
$$H(X) = -\sum_i p_i \log_2 p_i .$$
This highlights two important points about information: i) its amount does not depend on the actual values of the variable but only on their probabilities; it does not matter if we interchange $p_1$ with $p_2$; and ii) the log function ensures that information is additive for two independent events, i.e., $f(pq) = f(p) + f(q)$. The summation over the probabilities simply captures the average information over all outcomes. We take the log to base 2, so that classical information is measured in bits. We also adopt the convention $0 \log_2 0 = 0$, which means that $p = 0$ events are not included in the average.
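As an illustration, the definition can be evaluated with a short Python sketch; the function name shannon_entropy and the example distributions below are our own choices, used only for demonstration.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i log2(p_i) of a discrete distribution.

    Terms with p = 0 are skipped, following the convention 0 log2 0 = 0.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain (1 bit); a certain outcome carries no information.
print(shannon_entropy([0.5, 0.5]))                 # 1.0
print(shannon_entropy([1.0, 0.0]))                 # 0.0
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```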
Operationally, Shannon entropy quantifies the resources needed to store information. Suppose you have a source that generates a string of symbols $x_1, x_2, \dots, x_n$ drawn from independent, identically distributed random variables $X_1, X_2, \dots, X_n$. The question now is: what is the minimum resource (number of bits) required to store or communicate this information? The answer is enshrined in Shannon’s noiseless coding theorem and is equal to $H(X)$ bits per symbol, where $H(X)$ is the Shannon entropy of the common distribution of the $X_i$.
Example: Let us consider a farm that stocks these food items: bread, eggs, chicken and fish, with stock sizes (and hence message probabilities) proportional to 1/2, 1/4, 1/8, and 1/8.
Naively, we need two bits to store this information: bread (00), eggs (01), chicken (10), fish (11). Each message then requires two bits.
But these items do not all have the same probability, so we can compress this data: say, bread (0), eggs (10), chicken (110), fish (111).
Average length: $\frac{1}{2}\cdot 1 + \frac{1}{4}\cdot 2 + \frac{1}{8}\cdot 3 + \frac{1}{8}\cdot 3 = 7/4$.
Shannon entropy: $-\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{4}\log_2\frac{1}{4} - \frac{1}{8}\log_2\frac{1}{8} - \frac{1}{8}\log_2\frac{1}{8} = 7/4$.
Therefore, on average each message will require $H(X) = 7/4 < 2$ bits.
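The arithmetic above can be checked with a short Python sketch; the dictionary layout and variable names below are ours, purely for illustration.

```python
import math

# Prefix code from the example above: item -> (probability, codeword)
code = {
    "bread":   (1/2, "0"),
    "eggs":    (1/4, "10"),
    "chicken": (1/8, "110"),
    "fish":    (1/8, "111"),
}

avg_length = sum(p * len(word) for p, word in code.values())
entropy = -sum(p * math.log2(p) for p, _ in code.values())

print(avg_length)  # 1.75 bits per message on average
print(entropy)     # 1.75 bits, so this code exactly meets the Shannon bound
```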
Suppose you have an $n$-bit message, 000111…..110, in which the bit 0 occurs with probability $p$ and the bit 1 with probability $1 - p$. Now, how many bits are required to represent this message?
In the limit of large $n$, the minimum resource needed to express such a message is $nH(p)$ bits, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is the binary entropy; this equals $n$ only for $p = 1/2$.
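As a rough illustration, the binary entropy $H(p)$ and the resulting compressed size $nH(p)$ can be computed as follows; the message length n = 1000 and the probabilities below are arbitrary choices of ours.

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) = -p log2(p) - (1-p) log2(1-p), with 0 log2 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n = 1000  # assumed message length, for illustration only
for p in (0.5, 0.9, 0.99):
    # Roughly 1000, 469 and 81 bits: the more biased the source, the fewer bits needed.
    print(p, n * binary_entropy(p))
```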
The questions above concern redundancy: they ask how fewer resources can be used, on average, to carry a message. Shannon connected these questions to the idea of entropy, which then forms the cornerstone of information theory. Much of the major work in classical information theory is about the manipulation of Shannon entropy.
Exercise: Show that you cannot do better than this without losing distinguishability.