Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/99999/fk4n88qc0b
Full metadata record
DC Field | Value
dc.contributor.advisor | Poor, Vincent H.
dc.contributor.author | Yagli, Semih
dc.contributor.other | Electrical Engineering Department
dc.date.accessioned | 2021-10-04T13:24:45Z
dc.date.available | 2021-10-04T13:24:45Z
dc.date.created | 2021-01-01
dc.date.issued | 2021
dc.identifier.uri | http://arks.princeton.edu/ark:/99999/fk4n88qc0b
dc.description.abstract | We study three distinct and important problems at the intersection of information and estimation theory. The first problem we tackle is known in the literature as the amplitude-constrained Gaussian channel. It is well known that for a peak-power-constrained Gaussian channel the capacity-achieving input distribution is discrete with finitely many mass points. However, owing to an unfortunate shortcoming of the prior proof technique, a bound on the number of mass points in the capacity-achieving input distribution was not previously accessible. Here, we provide an alternative proof of the finiteness of the number of mass points of the capacity-achieving input distribution while producing the first firm upper bound on the number of mass points. We also generalize this novel proof technique to multi-dimensional settings as well as to the amplitude-constrained Gaussian mean estimation problem. The second problem we resolve is in the realm of channel resolvability and error exponents. Using simple but non-trivial techniques, we establish the exact exponents for the soft-covering phenomenon of a memoryless channel under the total variation metric when random i.i.d. and random constant-composition channel codes are used. Moreover, we provide alternative representations of these exponents in terms of $\alpha$-mutual information, relating the two seemingly unrelated mathematical concepts in a very pleasing manner. Lastly, we turn our attention to universal lossless compression. We characterize the redundancy for universal lossless compression of discrete memoryless sources in Campbell's setting as a minimax Rényi divergence, which we show to be equal to the maximal $\alpha$-mutual information via a generalized redundancy-capacity theorem. We place particular emphasis on the analysis of the asymptotics of the minimax Rényi divergence, which we determine up to a term vanishing in the blocklength, building a bridge between the asymptotics of minimax regret and minimax redundancy.
dc.format.mimetype | application/pdf
dc.language.iso | en
dc.publisher | Princeton, NJ : Princeton University
dc.relation.isformatof | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu
dc.subject | Channel Capacity
dc.subject | Data Science
dc.subject | Error Exponents
dc.subject | Parameter Estimation
dc.subject | Private Communication
dc.subject | Universal Lossless Compression
dc.subject.classification | Engineering
dc.subject.classification | Computer science
dc.subject.classification | Communication
dc.title | Topics in Information and Estimation Theory: Parameter Estimation, Lossless Compression, Constrained Channels, and Error Exponents
dc.type | Academic dissertations (Ph.D.)
pu.date.classyear | 2021
pu.department | Electrical Engineering
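
For context, the two quantities named in the abstract above, the Rényi divergence and the $\alpha$-mutual information, have the following standard definitions for discrete alphabets; these formulas (following Rényi and Sibson/Verdú) are supplied here as background and are not quoted from the dissertation itself. The Rényi divergence of order $\alpha$ between distributions $P$ and $Q$ is
$$D_\alpha(P \,\|\, Q) = \frac{1}{\alpha-1}\log\sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha}, \qquad \alpha\in(0,1)\cup(1,\infty),$$
and Sibson's $\alpha$-mutual information, the quantity the abstract relates to both the soft-covering exponents and the generalized redundancy-capacity theorem, admits the characterization
$$I_\alpha(X;Y) = \min_{Q_Y} D_\alpha\!\big(P_{XY} \,\big\|\, P_X \times Q_Y\big) = \frac{\alpha}{\alpha-1}\log\sum_{y}\Big(\sum_{x} P_X(x)\, P_{Y|X}(y\mid x)^{\alpha}\Big)^{1/\alpha}.$$
Both recover the Kullback-Leibler divergence and Shannon's mutual information, respectively, in the limit $\alpha\to 1$.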
Appears in Collections: Electrical Engineering

Files in This Item:
File | Size | Format
Yagli_princeton_0181D_13633.pdf | 1.47 MB | Adobe PDF

