2017-2018 Graduate Catalog [ARCHIVED CATALOG]


ENEE 622 - Information Theory

[3]
Shannon’s information measures: entropy, differential entropy, information divergence, mutual information, and their basic properties. Entropy rates, asymptotic equipartition property, weak and strong typicality, joint typicality, Shannon’s source coding theorem and its converse, prefix-free and uniquely decodable source codes, Huffman and Shannon codes, universal source coding, source coding with a fidelity criterion, the rate-distortion function and its achievability, channel capacity and its computation, Shannon’s channel coding theorem, strong coding theorem, error exponents, Fano’s inequality and the converse to the coding theorem, feedback capacity, joint source-channel coding, discrete-time additive Gaussian channels, the covering lemma, continuous-time additive Gaussian channels, parallel additive Gaussian channels and waterfilling. Additional topics: narrowband time-varying channels, fading channels, side information, wideband channels, network coding, information theory in relation to statistics and geometry.
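As a small illustration of two topics listed above (entropy and channel capacity), the sketch below computes the Shannon entropy of a discrete distribution and, from it, the capacity of a binary symmetric channel, C = 1 - H(p). The function names are illustrative, not part of the course materials.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    Terms with zero probability contribute nothing (0 log 0 := 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps.

    C = 1 - H(eps), where H is the binary entropy function; capacity is
    achieved by a uniform input distribution.
    """
    return 1.0 - entropy([eps, 1.0 - eps])
```

For example, a fair coin has entropy 1 bit, and a channel that flips its input half the time (eps = 0.5) carries no information, so its capacity is 0.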
Prerequisite: Strong grasp of basic probability theory.


