Noiseless coding of correlated information sources
Slepian et al., 2003 (Google Patents record)
- Document ID: 10238914131422448387
- Authors: Slepian, D.; Wolf, J.
- Publication year: 2003
- Publication venue: IEEE Transactions on Information Theory
- Snippet: Correlated information sequences ⋯, X_{-1}, X_0, X_1, ⋯ and ⋯, Y_{-1}, Y_0, Y_1, ⋯ are generated by repeated independent drawings of a pair of discrete random variables X, Y from a given bivariate distribution P_{XY}(x, y). We determine the minimum number of bits …
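The snippet describes the classical Slepian-Wolf setting: two correlated i.i.d. sources compressed separately and decoded jointly. As an illustration only (not part of the patent record; the function names are my own), a minimal sketch of the entropy quantities that bound the achievable rate region, R_X ≥ H(X|Y), R_Y ≥ H(Y|X), R_X + R_Y ≥ H(X, Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def slepian_wolf_bounds(P_XY):
    """Given a joint pmf P_XY[i, j] = P(X=i, Y=j), return the three entropy
    quantities that define the Slepian-Wolf rate region."""
    H_XY = entropy(P_XY)                 # joint entropy H(X, Y)
    H_X = entropy(P_XY.sum(axis=1))      # marginal entropy H(X)
    H_Y = entropy(P_XY.sum(axis=0))      # marginal entropy H(Y)
    return {
        "H(X|Y)": H_XY - H_Y,            # chain rule: H(X|Y) = H(X,Y) - H(Y)
        "H(Y|X)": H_XY - H_X,
        "H(X,Y)": H_XY,
    }

# Example: X is a uniform bit, Y is X flipped with probability 0.1.
P = np.array([[0.45, 0.05],
              [0.05, 0.45]])
bounds = slepian_wolf_bounds(P)
```

For this example the conditional rates both equal the binary entropy h(0.1) ≈ 0.469 bits, well below the 1 bit each encoder would need in isolation, which is the point of the theorem.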
Classifications
- H—ELECTRICITY
  - H03—BASIC ELECTRONIC CIRCUITRY
    - H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
      - H03M7/00—Conversion of a code where information is represented by a given sequence or number of digits to a code where the same information or similar information or a subset of information is represented by a different sequence or number of digits
        - H03M7/30—Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
          - H03M7/40—Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L25/00—Baseband systems
        - H04L25/38—Synchronous or start-stop systems, e.g. for Baudot code
          - H04L25/40—Transmitting circuits; Receiving circuits
Similar Documents
| Publication | Title |
|---|---|
| Slepian et al. | Noiseless coding of correlated information sources |
| Shannon | A mathematical theory of communication |
| Leung-Yan-Cheong et al. | The Gaussian wire-tap channel |
| Gray | Source coding theory |
| Verdu | Fifty years of Shannon theory |
| Pierce | The early days of information theory |
| Gastpar | The Wyner-Ziv problem with multiple sources |
| Wozencraft et al. | Modulation and demodulation for probabilistic coding |
| Gray et al. | Source coding theorems without the ergodic assumption |
| Orlitsky et al. | Speaking of infinity [iid strings] |
| Berger | Information rates of Wiener processes |
| Gray | Time-invariant trellis encoding of ergodic discrete-time sources with a fidelity criterion |
| Shields | The interactions between ergodic theory and information theory |
| Gray et al. | Nonblock source coding with a fidelity criterion |
| Telatar | Multi-access communications with decision feedback decoding |
| Wu et al. | New upper bounds on the capacity of primitive diamond relay channels |
| Bondaschi et al. | A revisitation of low-rate bounds on the reliability function of discrete memoryless channels for list decoding |
| Gibson | Information theory and rate distortion theory for communications and compression |
| Savari | Variable-to-fixed length codes and the conservation of entropy |
| Kazakos | Robust noiseless source coding through a game theoretical approach |
| Baer | Optimal prefix codes for infinite alphabets with nonlinear costs |
| Farkas et al. | Controlled asynchronism improves error exponent |
| Moharir | Generalized PN sequences (corresp.) |
| Liu et al. | A Rate-Distortion Analysis for Composite Sources Under Subsource-Dependent Fidelity Criteria |