1 An Overview
1.1 A Framework for Digital Communications
1.1.1 Sources, Channels, and Limits to Communication
1.1.2 Operations in the Digital Transmission Pathway
1.1.3 Modulation and Coding
1.2 Historical Notes
1.3 Outline of Book
Bibliography
2 Fundamentals of Probability and Information Theory
2.1 Probability
2.1.1 Conditional Probability
2.1.2 Independence
2.2 Random Variables: Discrete and Continuous
2.2.1 Discrete Random Variables
2.2.2 Continuous Random Variables
2.2.3 Multidimensional Random Variables or Random Vectors
2.2.4 Conditional Distributions and Densities
2.2.5 Independence of Random Variables
2.2.6 Transformations of Random Variables
2.3 Expectations and Moments
2.3.1 First and Second Moments
2.3.2 Correlation and Covariance
2.3.3 Characteristic Functions
2.4 Probability Bounds and Limit Theorems
2.4.1 Bounds Based on First and Second Moments
2.4.2 Chernoff Bounds
2.4.3 Sequences, Sums, and Laws of Large Numbers
2.4.4 Central Limit Theorem
2.5 Stochastic Processes
2.5.1 Wide-Sense Stationarity, Autocorrelation Function, and Power Spectral Density
2.5.2 Stochastic Processes in Linear Systems
2.5.3 Time Averages versus Ensemble Averages
2.5.4 Karhunen-Loeve Series Representation for Random Processes
2.5.5 Markov Models
2.6 Statistical Decision Theory
2.6.1 Minimum Probability of Error Policies
2.6.2 Irrelevant Data and Sufficient Statistics
2.7 Concepts of Information Theory for Discrete Alphabets
2.7.1 Entropy for Discrete Random Variables
2.7.2 Joint and Conditional Entropy
2.7.3 Mutual Information
2.7.4 Discrete Channels and Channel Capacity
2.7.5 Sequence Transmission
2.7.6 Converse to the Noisy Channel Coding Theorem
2.8 Coding of Discrete Information Sources
2.8.1 Block Source Codes
2.8.2 Block-to-Variable-length Encoding
2.8.3 Extensions to Discrete Markov Sources
2.9 Information Theory for Continuous Random Variables and Processes
2.9.1 Scalar Variable Case
2.9.2 Vector Gaussian Channel Case
2.9.3 Waveform Channel Case
3 Modulation and Detection
3.1 A Transmission Model
3.1.1 Digital Modulation
3.1.2 Channel Filtering
3.1.3 Channel Gain and Fading
3.1.4 Noise Model
3.1.5 Model Limitations
3.2 Signal Spaces
3.2.1 Orthonormal Basis Sets
3.2.2 M-ary Signal Constellations
3.3 Single-symbol Detection of Known Signals in AWGN
3.3.1 Error Performance of General Binary Signals in AWGN
3.3.2 Performance Bounds for M-ary Signaling
3.3.3 Detection of M-ary Orthogonal, Biorthogonal, and Simplex Modulation
3.3.4 Detection of M-ary Phase Shift Keying
3.3.5 M-ary Amplitude Modulation and Quadrature Amplitude Modulation
3.3.6 Multidimensional Lattice-symbol Transmission on Nonideal Channels
3.3.7 Summary of Energy and Spectrum Efficiency of Modulation Techniques
3.3.8 Extension to Single-symbol Transmission on Nonideal Channels
3.4 Noncoherent Demodulation of Carrier-modulated Signals
3.4.1 Structure of Optimal Noncoherent Demodulator
3.4.2 Performance Analysis for Noncoherent Demodulation of Binary Orthogonal Signals
3.4.3 Performance Analysis of Noncoherent Detection of M-ary Orthogonal Signals
3.5 Phase Comparison or Differentially Coherent Demodulation of PSK
3.5.1 Structure of Optimal Demodulator
3.5.2 Performance Evaluation for M-DPSK
3.6 Performance on the Slow, Nonselective Rayleigh Fading Channel
3.6.1 Binary Signaling with Rayleigh Fading
3.6.2 M-ary Orthogonal Signaling with Noncoherent Detection
3.6.3 M-ary PSK and DPSK
3.7 Power Spectra of Digitally Modulated Signals
3.7.1 Overview of Power Spectrum and Some Cautions
3.7.2 Power Spectrum for General Memoryless Modulation
3.7.3 Baseband Pulse-amplitude Signaling
3.7.4 Spectra for M-PSK and M-QAM Modulation
3.7.5 Asymptotic Behavior of Power Spectrum; Role of Dimensionality
3.7.6 Power Spectrum for Markov-input Modulation
3.8 Spread-spectrum Modulation
3.8.1 Direct Sequence Spread Spectrum
3.8.2 Frequency-hopping Spread Spectrum
4 Channel Coding and Its Potential
4.1 A Taxonomy of Codes
4.2 Introduction to Block Coding and Optimal Decoding
4.3 Two-codeword Error Probability and R0
4.3.1 Ensemble Average Performance for Two-codeword Codes
4.3.2 Extension to Discrete-input, Continuous-output Channels
4.3.3 Generalizations
4.4 Probability of Error with Many Codewords and the Channel Coding Theorem
4.4.1 Code Ensembles and a Simple Ensemble Bound on Performance
4.4.2 Generalized Upper Bound for a Specific Code with Many Codewords
4.4.3 Properties of the Error Exponent and a Coding Theorem
4.4.4 Summary of Coding Potential for Block Codes on DMCs
4.4.5 Remarks for Trellis Codes
4.5 Implications of R0 and C for Binary Signaling on AWGN Channels
4.5.1 R0 and C Considerations for Binary Signaling, AWGN Channel, and Hard Decisions
4.5.2 Binary Signaling, Unquantized Demodulation
4.5.3 Binary Signaling with Soft-quantized Demodulation
4.5.4 Summary for Binary Transmission, AWGN Channels
4.5.5 R0 and C with M-ary Modulation, AWGN Channels
4.6 Capacity and R0 for the Rayleigh Fading Channel
4.6.1 Coding Potential for Binary Signaling on the Rayleigh Channel
4.6.2 M-ary Noncoherent Transmission on the Rayleigh Channel
4.6.3 Channel Capacity for Bandwidth-efficient Modulation on the Rayleigh Channel
4.7 Further Studies on Coding Potential
4.7.1 Photon Counting Optical Communication
4.7.2 Block Interference Channels
5 Block Coding
5.0 The Binary Hamming Code
5.1 Algebra of Finite Fields
5.1.1 Polynomials over Fields and Extension Fields
5.1.2 Computation in Finite Fields
5.1.3 Discrete Fourier Transforms over Finite Fields
5.2 Linear Block Codes
5.2.1 Structure of Linear Codes over GF
5.2.2 Distance Properties of Linear Codes and Error Protection Properties
5.2.3 Decoding of Linear Block Codes
5.2.4 Performance Measure for Algebraic Decoding
5.2.5 Hamming Codes over
5.2.6 Reed-Muller Codes
5.3 Bounds on Minimum Hamming Distance for Block Codes
5.3.1 Hamming Bound
5.3.2 Singleton Bound
5.3.3 Plotkin Bound
5.3.4 Gilbert Bound
5.3.5 Varshamov Bound
5.3.6 Asymptotic Forms of the Varshamov-Gilbert and Hamming Bounds
5.3.7 Channel Capacity and the Coding Theorem Revisited
5.4 Cyclic Codes
5.4.1 Structure of Cyclic Codes
5.4.2 Encoding of Cyclic Codes
5.4.3 BCH Codes
5.4.4 Cyclic Hamming Codes
5.4.5 Reed-Solomon Codes
5.5 Decoding of Cyclic Codes
5.5.1 General-purpose Decoding of Cyclic Codes over
5.5.2 Algebraic Decoding of BCH Codes and RS Codes
5.5.3 Errors-and-Erasures Decoding
5.5.4 ML and Near-ML Decoding
5.6 Modifying Block Codes
5.6.1 Extending and Puncturing
5.6.2 Expurgation and Augmentation
5.6.3 Lengthening and Shortening
5.7 Error Detection with Cyclic Codes
5.8 Layered Codes: Product Codes and Concatenated Codes
5.8.1 Product Codes
5.8.2 Concatenated Codes
5.9 Interleaving for Channels with Memory
5.9.1 Block Interleaving
5.9.2 Convolutional Interleaving
5.10 Performance Evaluation for Block Codes
5.10.1 AWGN Channel, Hard-decision Decoding
5.10.2 Soft-decision Decoding, AWGN Channel
5.10.3 Hard-decision Decoding, Rayleigh Channel
5.10.4 Soft-decision Decoding, Rayleigh Channel
5.11 Power Spectrum of Conventional Block Coded Modulation
5.12 Block Coding for Band-limited Channels
5.12.1 Multilevel Coding
5.12.2 Simple LSB Coding and Hard-decision Decoding
5.12.3 Multilevel Codes for Fading Channels
6 Trellis Codes
6.1 Description of Convolutional Codes
6.1.1 Binary Convolutional Codes
6.1.2 Nonbinary Convolutional Codes
6.1.3 Parity Check Matrices
6.1.4 Inverse Circuits
6.1.5 State Diagrams and Trellises
6.2 Hamming Distance Measures for Convolutional Codes; Various Good Codes
6.2.1 Distance Definitions
6.2.2 Bounds on Free Distance
6.2.3 Optimal Free Distance Codes
6.2.4 Punctured Convolutional Codes
6.2.5 Optimal Distance Profile Codes
6.3 Maximum Likelihood Decoding of Convolutional Codes
6.3.1 Maximum Likelihood Sequence Decoding
6.3.2 Implementation Issues
6.4 Error Probability with Maximum Likelihood Decoding of Convolutional Codes
6.4.1 Performance of Binary Convolutional Codes on Nonfading Channels
6.4.2 Generalization to Bhattacharyya Expression
6.4.3 Nonbinary Convolutional Codes and Noncoherent Detection
6.4.4 Fading Channel Performance
6.5 Other Decoding Procedures: Sequential Decoding and Feedback Decoding
6.5.1 Sequential Decoding
6.5.2 Feedback Decoding
6.6 Trellis Coding with Expanded Signal Sets for Band-limited Channels
6.6.1 Set Partitioning
6.6.2 Hand Design of Codes
6.6.3 Trellis Codes for Fading Channels
6.7 Continuous-phase Modulation
6.7.1 Signal Description
6.7.2 State Representation
6.7.3 Modular Implementations
6.7.4 Description of CPM as Memoryless Modulation Preceded by Coding
6.7.5 Power Spectrum of CPM Modulation
6.7.6 Coherent Decoding of CPM
6.7.7 Related Topics in CPM
Appendix 6A1: Numerical Evaluation of Transfer Function Bounds
Bibliography
Exercises
Index