Jun 18, 2025
ENEE 622 - Information Theory

Credits: 3
Course ID: 053937
Prerequisite: Strong grasp of basic probability theory.
Components: Lecture
Grading Method: A-F, Audit

Description: Shannon's information measures: entropy, differential entropy, information divergence, mutual information, and their basic properties. Entropy rates; asymptotic equipartition property; weak, strong, and joint typicality; Shannon's source coding theorem and its converse; prefix-free and uniquely decodable source codes; Huffman and Shannon codes; universal source coding; source coding with a fidelity criterion; the rate-distortion function and its achievability; channel capacity and its computation; Shannon's channel coding theorem; strong coding theorem; error exponents; Fano's inequality and the converse to the coding theorem; feedback capacity; joint source-channel coding; discrete-time additive Gaussian channels; the covering lemma; continuous-time additive Gaussian channels; parallel additive Gaussian channels and waterfilling.

Additional topics: narrow-band time-varying channels, fading channels, side information, wideband channels, network coding, and information theory in relation to statistics and geometry.
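As a minimal illustration of the first listed topic (not part of the catalog entry itself), the Shannon entropy of a discrete distribution, H(X) = -Σ p(x) log₂ p(x), can be sketched in a few lines of Python; the function name `entropy` is our own choice for this example:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum of p * log2(p).

    Zero-probability outcomes contribute nothing (0 * log 0 := 0),
    so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # -> 1.0

# A biased coin carries less: roughly 0.469 bits.
print(entropy([0.9, 0.1]))
```

The biased-coin value being well below 1 bit is the intuition behind source coding: less-uncertain sources are more compressible.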