arXiv daily

Information Theory (cs.IT)

Mon, 24 Apr 2023

1.Sum-rank metric codes

Authors: Elisa Gorla, Umberto Martínez-Peñas, Flavio Salizzoni

Abstract: Sum-rank metric codes are a natural extension of both linear block codes and rank-metric codes. They have several applications in information theory, including multishot network coding and distributed storage systems. The aim of this chapter is to present the mathematical theory of sum-rank metric codes, paying special attention to the $\mathbb{F}_q$-linear case in which different sizes of matrices are allowed. We provide a comprehensive overview of the main results in the area. In particular, we discuss invariants, optimal anticodes, and MSRD codes. In the last section, we concentrate on $\mathbb{F}_{q^m}$-linear codes.
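For reference, the sum-rank weight underlying these codes has a short standard definition; the notation below is generic and not taken from the chapter itself.

```latex
% Sum-rank weight and distance for a tuple of matrix blocks
% x = (x_1, \dots, x_t), with x_i \in \mathbb{F}_q^{n_i \times m_i}
% (block sizes may differ, as in the F_q-linear case mentioned above):
\[
  \mathrm{wt}_{\mathrm{sr}}(x) = \sum_{i=1}^{t} \operatorname{rank}(x_i),
  \qquad
  d_{\mathrm{sr}}(x, y) = \mathrm{wt}_{\mathrm{sr}}(x - y).
\]
% Taking t = 1 recovers the rank metric, while taking all blocks of size
% 1 x 1 recovers the Hamming metric, which is why sum-rank metric codes
% extend both rank-metric codes and linear block codes.
```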

2.Compressed sensing with l0-norm: statistical physics analysis and algorithms for signal recovery

Authors: D. Barbier, C. Lucibello, L. Saglietti, F. Krzakala, L. Zdeborova

Abstract: Noiseless compressive sensing is a protocol that enables undersampling and later recovery of a signal without loss of information. This compression is possible because the signal is usually sufficiently sparse in a given basis. Currently, the algorithm offering the best tradeoff between compression rate, robustness, and speed for compressive sensing is the LASSO (l1-norm bias) algorithm. However, many studies have pointed out that implementing lp-norm biases, with p smaller than one, could give better performance while sacrificing convexity. In this work, we focus specifically on the extreme case of l0-based reconstruction, a task that is complicated by the discontinuity of the loss. In the first part of the paper, we describe via statistical physics methods, and in particular the replica method, how the solutions to this optimization problem are arranged in a clustered structure. We observe two distinct regimes: one at low compression rate where the signal can be recovered exactly, and one at high compression rate where the signal cannot be recovered accurately. In the second part, building on these results, we present two message-passing algorithms for the l0-norm optimization problem. The proposed algorithms recover the signal at compression rates higher than those achieved by LASSO while remaining computationally efficient.
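To make the l0 objective concrete, here is a minimal NumPy sketch of iterative hard thresholding, a standard baseline for l0-constrained recovery. It is not the message-passing algorithms proposed in the paper, and the step-size choice is only an illustrative assumption.

```python
import numpy as np

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v and zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

def iht(y, A, k, step=None, n_iter=200):
    """Iterative hard thresholding for  min ||y - A x||_2^2  s.t.  ||x||_0 <= k.

    A standard l0-constrained baseline, included only to make the l0 objective
    concrete; it is NOT the message-passing scheme proposed in the paper.
    The gradient step size below is a conservative illustrative choice.
    """
    m, n = A.shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / ||A||_2^2
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                 # gradient of the least-squares term
        x = hard_threshold(x - step * grad, k)   # project onto the l0 "ball"
    return x
```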

3.How Costly Was That (In)Decision?

Authors: Peng Zou, Ali Maatouk, Jin Zhang, Suresh Subramaniam

Abstract: In this paper, we introduce a new metric, named Penalty upon Decision (PuD), for measuring the impact of communication delays and state changes at the source on a remote decision maker. Specifically, the metric quantifies the performance degradation at the decision maker's side due to delayed, erroneous, and (possibly) missed decisions. We clarify the rationale for the metric and derive closed-form expressions for its average in the M/GI/1 and M/GI/1/1-with-blocking settings. Numerical results are then presented to support our expressions and to compare the infinite- and zero-buffer regimes. Interestingly, comparing these two settings sheds light on a buffer-length design question that is essential for minimizing the average PuD.

4.Rectangular Rotational Invariant Estimator for General Additive Noise Matrices

Authors: Farzad Pourkamali, Nicolas Macris

Abstract: We propose a rectangular rotationally invariant estimator to recover a real matrix from noisy matrix observations corrupted by an arbitrary additive rotationally invariant perturbation, in the large dimension limit. Using the Bayes-optimality of this estimator, we derive the asymptotic minimum mean squared error (MMSE). For the particular case of Gaussian noise, we find an explicit expression for the MMSE in terms of the limiting singular value distribution of the observation matrix. Moreover, we prove a formula linking the asymptotic mutual information and the limit of the log-spherical integral of rectangular matrices. We also provide numerical checks for our results, which match our theoretical predictions and known Bayesian inference results.
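As background, an estimator of this class keeps the singular vectors of the observation and acts only on its singular values; the specific Bayes-optimal shrinkage function is derived in the paper and is not reproduced here.

```latex
% General form of a (rectangular) rotationally invariant estimator:
% given the SVD of the observation, Y = \sum_i \sigma_i u_i v_i^\top,
\[
  \widehat{X}(Y) \;=\; \sum_{i} g(\sigma_i)\, u_i v_i^\top ,
\]
% for some scalar shrinkage function g. Equivalently,
% \widehat{X}(U Y V^\top) = U\,\widehat{X}(Y)\,V^\top for all orthogonal U, V,
% which is the rotational invariance property. The Bayes-optimal choice of g,
% and the resulting MMSE, is what the paper characterizes.
```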

5.Inference in Linear Observations with Multiple Signal Sources: Analysis of Approximate Message Passing and Applications to Unsourced Random Access in Cell-Free Systems

Authors: Burak Çakmak, Eleni Gkiouzepi, Manfred Opper, Giuseppe Caire

Abstract: We consider a multiple measurement vector (MMV) compressed sensing problem with multiple signal sources. The observation model is motivated by the application of {\em unsourced random access} in wireless cell-free MIMO (multiple-input multiple-output) networks. We present a novel and rigorous high-dimensional analysis of the AMP (approximate message passing) algorithm devised for the model. As the system dimensions, all of the same order, say $\mathcal O(L)$, tend to infinity, we show that the empirical dynamical order parameters -- describing the dynamics of the AMP -- converge to deterministic limits (described by a state-evolution equation) at rate $\mathcal O(L^{-\frac 1 2})$. Furthermore, we show the asymptotic consistency of the AMP analysis with the replica-symmetric calculation of the static problem. In addition, we discuss some interesting aspects of unsourced random access (or initial access) in cell-free systems, which is the application motivating the algorithm.
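For orientation, the sketch below shows the textbook single-signal AMP iteration with a soft-threshold denoiser, i.e. the Onsager-corrected structure whose order parameters a state-evolution analysis tracks. It is a simplified stand-in, not the authors' MMV / cell-free variant, and the fixed threshold is an illustrative assumption.

```python
import numpy as np

def soft_threshold(v, theta):
    """Element-wise soft thresholding: the scalar denoiser of basic AMP."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, theta=0.1, n_iter=30):
    """Textbook AMP for y = A x + noise with a soft-threshold denoiser.

    Simplified single-signal sketch (Donoho-Maleki-Montanari form), shown only
    to illustrate the Onsager-corrected iteration tracked by state evolution;
    NOT the paper's MMV / cell-free algorithm. The fixed threshold `theta`
    is an illustrative assumption.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        r = x + A.T @ z                      # pseudo-data passed to the denoiser
        x = soft_threshold(r, theta)         # denoising step
        onsager = np.count_nonzero(x) / m    # (1/delta) * average derivative of the denoiser
        z = y - A @ x + onsager * z          # residual with Onsager correction
    return x

# Tiny synthetic usage example (illustrative only).
rng = np.random.default_rng(0)
n, m, k = 500, 250, 20
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = amp(y, A, theta=0.1, n_iter=50)
```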