Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/99999/fk4n88qc0b
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Poor, Vincent H. | |
dc.contributor.author | Yagli, Semih | |
dc.contributor.other | Electrical Engineering Department | |
dc.date.accessioned | 2021-10-04T13:24:45Z | - |
dc.date.available | 2021-10-04T13:24:45Z | - |
dc.date.created | 2021-01-01 | |
dc.date.issued | 2021 | |
dc.identifier.uri | http://arks.princeton.edu/ark:/99999/fk4n88qc0b | - |
dc.description.abstract | We study three distinct and important problems in the intersection of information and estimation theory. The first problem we tackle is known in the literature as the amplitude constrained Gaussian channel. It is well known that for a peak-power constrained Gaussian channel the capacity-achieving input distribution is discrete with finitely many mass points. However, owing to an unfortunate shortcoming of the prior proof technique, a bound on the number of mass points in the capacity-achieving input distribution was not previously accessible. Here, we provide an alternative proof of the finiteness of the number of mass points of the capacity-achieving input distribution while producing the first firm upper bound on the number of mass points. We also generalize this novel proof technique to multi-dimensional settings as well as to the amplitude constrained Gaussian mean estimation problem. The second problem we resolve is in the realm of channel resolvability and error exponents. Using simple but non-trivial techniques, we establish the exact exponents for the soft-covering phenomenon of a memoryless channel under the total variation metric when random i.i.d. and random constant-composition channel codes are used. Moreover, we provide alternative representations of these exponents in terms of $\alpha$-mutual information, relating the two seemingly unrelated mathematical concepts in a very pleasing manner. Lastly, we turn our attention to universal lossless compression. We characterize the redundancy for universal lossless compression of discrete memoryless sources in Campbell's setting as a minimax Rényi divergence, which we show to be equal to the maximal $\alpha$-mutual information via a generalized redundancy-capacity theorem. We place particular emphasis on the analysis of the asymptotics of the minimax Rényi divergence, which we determine up to a term vanishing in the blocklength, building a bridge between the asymptotics of minimax regret and minimax redundancy. | |
dc.format.mimetype | application/pdf | |
dc.language.iso | en | |
dc.publisher | Princeton, NJ : Princeton University | |
dc.relation.isformatof | The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: http://catalog.princeton.edu | |
dc.subject | Channel Capacity | |
dc.subject | Data Science | |
dc.subject | Error Exponents | |
dc.subject | Parameter Estimation | |
dc.subject | Private Communication | |
dc.subject | Universal Lossless Compression | |
dc.subject.classification | Engineering | |
dc.subject.classification | Computer science | |
dc.subject.classification | Communication | |
dc.title | Topics in Information and Estimation Theory: Parameter Estimation, Lossless Compression, Constrained Channels, and Error Exponents | |
dc.type | Academic dissertations (Ph.D.) | |
pu.date.classyear | 2021 | |
pu.department | Electrical Engineering | |
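For readers of the abstract above: the quantities "Rényi divergence" and "$\alpha$-mutual information" are named there but not defined. The short LaTeX block below records the standard definitions these terms usually carry for discrete distributions (with the $\alpha$-mutual information in the form commonly attributed to Sibson); this is a reference sketch of the usual conventions, not an excerpt from the dissertation itself.

```latex
% Standard definitions (usual conventions, not quoted from the dissertation).

% Renyi divergence of order \alpha between distributions P and Q on a finite alphabet:
\[
  D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
  \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
  \qquad \alpha \in (0,1) \cup (1,\infty),
\]
% with the Kullback--Leibler divergence recovered as \alpha \to 1 by continuity.

% \alpha-mutual information (Sibson's convention) for input distribution P_X and channel P_{Y|X}:
\[
  I_\alpha(X;Y) \;=\; \min_{Q_Y}\, D_\alpha\!\bigl(P_{XY} \,\big\|\, P_X \times Q_Y\bigr).
\]
% In this notation, the redundancy-capacity statement in the abstract equates a minimax
% Renyi divergence with the maximal \alpha-mutual information, \max_{P_X} I_\alpha(X;Y).
```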
Appears in Collections: | Electrical Engineering |
Files in This Item:
File | Size | Format |
---|---|---|
Yagli_princeton_0181D_13633.pdf | 1.47 MB | Adobe PDF |
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.