Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp016t053k12q
Title: Topics in Information and Estimation Theory: Parameter Estimation, Lossless Compression, Constrained Channels, and Error Exponents
Authors: Yagli, Semih
Advisors: Poor, Vincent H.
Contributors: Electrical Engineering Department
Keywords: Channel Capacity; Data Science; Error Exponents; Parameter Estimation; Private Communication; Universal Lossless Compression
Subjects: Engineering; Computer science; Communication
Issue Date: 2021
Publisher: Princeton, NJ : Princeton University
Abstract: We study three distinct and important problems at the intersection of information and estimation theory. The first problem we tackle is known in the literature as the amplitude-constrained Gaussian channel. It is well known that for a peak-power constrained Gaussian channel the capacity-achieving input distribution is discrete with finitely many mass points. However, owing to a shortcoming of the prior proof technique, a bound on the number of mass points in the capacity-achieving input distribution was not previously accessible. Here, we provide an alternative proof of the finiteness of the number of mass points of the capacity-achieving input distribution while producing the first firm upper bound on the number of mass points. We also generalize this novel proof technique to multi-dimensional settings as well as to the amplitude-constrained Gaussian mean estimation problem. The second problem we resolve is in the realm of channel resolvability and error exponents. Using simple but non-trivial techniques, we establish the exact exponents for the soft-covering phenomenon of a memoryless channel under the total variation metric when random i.i.d. and random constant-composition channel codes are used. Moreover, we provide alternative representations of these exponents in terms of $\alpha$-mutual information, connecting these two seemingly unrelated mathematical concepts in a very pleasing manner. Lastly, we turn our attention to universal lossless compression. We characterize the redundancy for universal lossless compression of discrete memoryless sources in Campbell's setting as a minimax Rényi divergence, which we show to be equal to the maximal $\alpha$-mutual information via a generalized redundancy-capacity theorem. We place particular emphasis on the analysis of the asymptotics of minimax Rényi divergence, which we determine up to a term vanishing in the blocklength, building a bridge between the asymptotics of minimax regret and minimax redundancy.
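For orientation, the quantities named in the abstract can be sketched with the standard definitions below; the notation is the common one in the literature rather than necessarily that of the thesis, and the unit noise variance is an illustrative normalization. The amplitude-constrained Gaussian channel problem asks for

\[
C(\mathsf{A}) \;=\; \sup_{F_X:\ |X| \le \mathsf{A}} I(X;Y), \qquad Y = X + N, \quad N \sim \mathcal{N}(0,1),
\]

whose maximizing input distribution $F_X$ is discrete with finitely many mass points. The Rényi divergence of order $\alpha$ and the $\alpha$-mutual information in Sibson's sense, which appear in the second and third problems, are (for finite alphabets and $\alpha \in (0,1) \cup (1,\infty)$)

\[
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha-1}\,\log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha}, \qquad
I_\alpha(X;Y) \;=\; \min_{Q_Y}\, D_\alpha\!\left(P_{XY} \,\big\|\, P_X \times Q_Y\right).
\]

In this notation, the generalized redundancy-capacity theorem mentioned in the abstract is a saddle-point identity: the minimax Rényi divergence over coding distributions equals the maximal $\alpha$-mutual information over priors on the source class, under the precise conditions stated in the thesis.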
URI: http://arks.princeton.edu/ark:/88435/dsp016t053k12q
Alternate format: The Mudd Manuscript Library retains one bound copy of each dissertation. Search for these copies in the library's main catalog: catalog.princeton.edu
Type of Material: Academic dissertations (Ph.D.)
Language: en
Appears in Collections: Electrical Engineering

Files in This Item:
File: Yagli_princeton_0181D_13633.pdf (1.47 MB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.