TEST BANK FOR Information Theory, Coding and Cryptography, 3rd Edition, by Ranjan Bose (Solution Manual)
SOLUTIONS FOR CHAPTER 1

Q.1.1 A DMS has source probabilities {0.30, 0.25, 0.20, 0.15, 0.10}. Its entropy (all logarithms base 2) is

H(X) = Σ_i p_i log(1/p_i) = 0.30 log(1/0.30) + 0.25 log(1/0.25) + 0.20 log(1/0.20) + 0.15 log(1/0.15) + 0.10 log(1/0.10) = 2.228 bits.

Q.1.2 Define

D(p||q) = Σ_i p_i log(p_i/q_i)    (1)

where p_i and q_i are probability distributions of the discrete source X. Using the identity ln x ≤ x − 1 (applied to x = q_i/p_i),

−D(p||q) = Σ_i p_i log(q_i/p_i) ≤ Σ_i p_i (q_i/p_i − 1) = Σ_i (q_i − p_i) = 0,

and therefore D(p||q) ≥ 0. Now put q_i = 1/n in (1), where n is the cardinality of the discrete source:

D(p||q) = Σ_i p_i log p_i + Σ_i p_i log n = −H(X) + log n ≥ 0,

so H(X) ≤ log n, with equality H(X) = log n for the uniform probability distribution. Hence the entropy of a discrete source is maximum when the output symbols are equally probable. The quantity D(p||q) is called the Kullback-Leibler distance.

Q.1.3 The plots are given below. (Plots not reproduced in this converted copy.)
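The computations in Q.1.1 and Q.1.2 can be checked numerically. The sketch below (function names `entropy` and `kl_divergence` are illustrative, not from the text) evaluates H(X) for the given distribution in base-2 logs and verifies that D(p||u) = log n − H(X) ≥ 0 when u is uniform:

```python
import math

def entropy(p):
    """Shannon entropy in bits: H(X) = sum_i p_i * log2(1/p_i)."""
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p||q) = sum_i p_i * log2(p_i/q_i)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Source distribution from Q.1.1
p = [0.30, 0.25, 0.20, 0.15, 0.10]
print(round(entropy(p), 3))  # 2.228 bits

# Q.1.2 with q_i = 1/n: D(p||u) = log n - H(X), and D >= 0 forces H(X) <= log n
n = len(p)
u = [1.0 / n] * n
d = kl_divergence(p, u)
print(d >= 0)  # True: KL distance is non-negative
print(abs(d - (math.log2(n) - entropy(p))) < 1e-12)  # True: D(p||u) = log n - H(X)
```

Replacing `p` with the uniform distribution `u` makes `d` exactly zero, confirming that H(X) attains its maximum log n for equally probable symbols.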