Gaussian Signalling for Covert Communications

Abstract

In this work, we examine the optimality of Gaussian signalling for covert communications under a covertness constraint given by an upper bound on either $\mathcal{D}(p_1\|p_0)$ or $\mathcal{D}(p_0\|p_1)$, which differ due to the asymmetry of the Kullback-Leibler divergence; here $p_0(y)$ and $p_1(y)$ are the likelihood functions of the observation $y$ at the warden under the null hypothesis (no covert transmission) and the alternative hypothesis (a covert transmission occurs), respectively. Considering additive white Gaussian noise at both the receiver and the warden, we prove that Gaussian signalling is optimal, in the sense of maximizing the mutual information between the transmitted and received signals, when the covertness constraint is an upper bound on $\mathcal{D}(p_1\|p_0)$. More interestingly, we also prove that Gaussian signalling is not optimal when the constraint is an upper bound on $\mathcal{D}(p_0\|p_1)$: we explicitly show that skew-normal signalling can outperform Gaussian signalling by achieving higher mutual information. Finally, we prove that, for Gaussian signalling, an upper bound on $\mathcal{D}(p_1\|p_0)$ is the tighter covertness constraint, leading to lower mutual information than the same upper bound on $\mathcal{D}(p_0\|p_1)$, by proving that $\mathcal{D}(p_0\|p_1) \leq \mathcal{D}(p_1\|p_0)$.
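As a numerical sanity check of the final inequality, note that under Gaussian signalling over AWGN the warden's observation is zero-mean Gaussian under both hypotheses, with variance $\sigma^2$ under $H_0$ and $\sigma^2 + P$ under $H_1$, so both divergences have closed forms. The sketch below assumes illustrative values for the noise variance `sigma2` and received covert power `P` (these are not taken from the paper) and verifies $\mathcal{D}(p_0\|p_1) \leq \mathcal{D}(p_1\|p_0)$:

```python
import math

def kl_gauss_zero_mean(a, b):
    """KL divergence D( N(0, a) || N(0, b) ) between zero-mean
    Gaussians with variances a and b."""
    return 0.5 * (a / b - 1.0 + math.log(b / a))

# Assumed illustrative parameters (not from the paper):
sigma2 = 1.0   # AWGN variance at the warden under H0
P = 0.5        # received covert signal power under H1

# Under Gaussian signalling the warden observes
#   H0: y ~ N(0, sigma2),   H1: y ~ N(0, sigma2 + P)
D10 = kl_gauss_zero_mean(sigma2 + P, sigma2)  # D(p1 || p0)
D01 = kl_gauss_zero_mean(sigma2, sigma2 + P)  # D(p0 || p1)

print(f"D(p1||p0) = {D10:.6f},  D(p0||p1) = {D01:.6f}")
assert D01 <= D10  # the ordering proved in the paper
```

Repeating the check over a range of $P/\sigma^2$ values gives the same ordering, consistent with the claim that the $\mathcal{D}(p_1\|p_0)$ bound is the tighter constraint for Gaussian signalling.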
