Title: Lossy data compression: nonasymptotic fundamental limits
Authors: Kostina, Victoria
Advisors: Verdú, Sergio
Contributors: Electrical Engineering Department
Keywords: finite blocklength regime
fundamental limits
Gaussian approximation
information theory
joint source-channel coding
lossy compression
Subjects: Electrical engineering
Issue Date: 2013
Publisher: Princeton, NJ : Princeton University
Abstract: The basic tradeoff in lossy compression is between the compression ratio (rate) and the fidelity of reproduction of the object that is compressed. Traditional (asymptotic) information theory describes the optimum tradeoff between rate and fidelity achievable in the limit of infinite length of the source block to be compressed. A perennial question in information theory is how relevant those asymptotic fundamental limits are when the communication system is forced to operate at a given fixed blocklength. The finite blocklength (delay) constraint is inherent to all communication scenarios: in many systems of current interest, such as real-time multimedia communication, delays are strictly constrained, while in packetized data communication, packets are frequently on the order of 1000 bits.

Motivated by the critical practical interest in non-asymptotic information-theoretic limits, this thesis studies the optimum rate-fidelity tradeoffs in lossy source coding and joint source-channel coding at a given fixed blocklength. While computable formulas for the asymptotic fundamental limits are available for a wide class of channels and sources, the luxury of computing the non-asymptotic fundamental limit of interest exactly (in polynomial time) is rarely affordable; one can at most hope to obtain bounds and approximations to it. The main findings of this thesis include tight bounds on the non-asymptotic fundamental limits in lossy data compression and transmission, valid for general sources without any assumptions of ergodicity or memorylessness. Moreover, in the stationary memoryless case this thesis derives a simple formula approximating the non-asymptotic fundamental limits that involves only two parameters of the source. This thesis considers scenarios where one must put aside traditional asymptotic thinking.
A striking observation of Shannon is that separate design of source and channel codes achieves the asymptotic fundamental limit of joint source-channel coding. At finite blocklengths, however, joint source-channel code design brings a considerable performance advantage over separate design. Furthermore, in some cases uncoded transmission is the best known strategy in the non-asymptotic regime, even though it is suboptimal asymptotically. This thesis also treats the lossy compression problem in which the compressor observes the source through a noisy channel; that problem is asymptotically equivalent to a certain conventional lossy source coding problem, but its non-asymptotic fidelity-rate tradeoff is quite different.
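The two-parameter approximation mentioned in the abstract is the Gaussian approximation R(n, d, ε) ≈ R(d) + sqrt(V(d)/n) Q⁻¹(ε), where R(d) is the rate-distortion function and V(d) the rate-dispersion function. As an illustrative sketch (not part of this record), the Python below evaluates it for a biased binary memoryless source with Hamming distortion, assuming the standard expressions R(d) = h(p) − h(d) and V(d) equal to the source varentropy; the bias p = 0.11, target distortion d = 0.05, and excess-distortion probability ε = 10⁻³ are arbitrary choices for illustration.

```python
import math
from statistics import NormalDist

def h(x):
    """Binary entropy function in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def gaussian_approx_rate(n, d, eps, p):
    """Gaussian approximation R(n,d,eps) ~ R(d) + sqrt(V(d)/n) * Qinv(eps)
    for a binary memoryless source with bias p and Hamming distortion
    (assumed valid for d < p <= 1/2)."""
    rd = h(p) - h(d)                                 # rate-distortion function, bits/symbol
    vd = p * (1 - p) * math.log2((1 - p) / p) ** 2   # rate-dispersion (varentropy of the source)
    qinv = NormalDist().inv_cdf(1 - eps)             # Q^{-1}(eps), inverse Gaussian tail
    return rd + math.sqrt(vd / n) * qinv

# Rate penalty over the asymptotic limit shrinks like 1/sqrt(n):
for n in (100, 1000, 10000):
    print(n, round(gaussian_approx_rate(n, 0.05, 1e-3, 0.11), 4))
```

As n grows, the approximation converges to R(d) = h(0.11) − h(0.05) ≈ 0.21 bits/symbol, while at short blocklengths the required rate is visibly higher, which is the qualitative point of the finite-blocklength analysis above.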
URI: http://arks.princeton.edu/ark:/88435/dsp01rr171x33m
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog.
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Electrical Engineering

Files in This Item:
File: Kostina_princeton_0181D_10756.pdf (2.35 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.