Neurons and Cognition (q-bio.NC)
Fri, 25 Aug 2023
1. Alternating Shrinking Higher-order Interactions for Sparse Neural Population Activity
Authors: Ulises Rodríguez-Domínguez, Hideaki Shimazaki
Abstract: Neurons in living organisms work cooperatively and efficiently to process incoming sensory information, often exhibiting sparse and widespread population activity that involves structured higher-order interactions. While statistical models based on continuous probability distributions exist for neurons' sparse firing rates, how the spiking activity of a large number of interacting neurons gives rise to such sparse and widespread population activity remains unknown. Here, for homogeneous (0,1) binary neurons, we provide sufficient conditions under which their spike-count population distribution converges to a sparse widespread distribution of the population spike rate in an infinitely large population of neurons. Following these conditions, we propose new models within an exponential family of distributions in which the sign and magnitude of neurons' higher-order interactions alternate and shrink as the order increases. The distributions exhibit parameter-dependent sparsity on a bounded support for the population firing rate. The theory serves as a building block for developing prior distributions and neurons' non-linearities for spike-based sparse coding.
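As a rough illustration of the kind of model the abstract describes (the authors' exact parametrization is not given there), the sketch below evaluates a homogeneous exponential-family distribution over binary spike patterns in which the k-th order interaction coefficient alternates in sign and shrinks with the order. The geometric form theta_k = (-1)^(k+1) * theta1 * alpha^(k-1) and all parameter values are assumptions chosen for illustration only.

```python
import numpy as np
from math import comb

def log_unnormalized(n, N, theta1=2.0, alpha=0.5):
    """Unnormalized log-probability of a pattern with spike count n out of
    N homogeneous binary neurons. For a homogeneous population the k-th
    order feature (sum of all k-wise spike products) reduces to comb(n, k).
    Illustrative coefficients: signs alternate, magnitudes shrink
    geometrically with order k (an assumption, not the paper's exact form)."""
    return sum(((-1) ** (k + 1)) * theta1 * alpha ** (k - 1) * comb(n, k)
               for k in range(1, N + 1))

def population_rate_pmf(N, **kw):
    """Distribution over the population spike count n = 0..N.
    The multiplicity comb(N, n) counts patterns sharing the same count."""
    logp = np.array([np.log(comb(N, n)) + log_unnormalized(n, N, **kw)
                     for n in range(N + 1)])
    p = np.exp(logp - logp.max())   # stabilize before normalizing
    return p / p.sum()

# Toy usage: inspect sparsity of the population-rate distribution.
pmf = population_rate_pmf(N=50)
print("P(silence):", pmf[0], " mean rate:", pmf @ np.arange(51) / 50)
```

Because the population is homogeneous, the full 2^N-dimensional distribution collapses to a distribution over the spike count alone, which is what makes the infinite-population limit in the abstract tractable to study.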
2. Robust Core-Periphery Constrained Transformer for Domain Adaptation
Authors: Xiaowei Yu, Lu Zhang, Dajiang Zhu, Tianming Liu
Abstract: Unsupervised domain adaptation (UDA) aims to learn transferable representations across domains. Recently, a few UDA works have successfully applied Transformer-based methods and achieved state-of-the-art (SOTA) results. However, it remains challenging when there is a large gap between the source and target domains. Inspired by humans' exceptional ability to transfer knowledge from familiar to unfamiliar domains, we apply an organizational structure that is ubiquitous in human functional brain networks, the core-periphery principle, to the design of the Transformer to improve its UDA performance. In this paper, we propose a novel brain-inspired robust core-periphery constrained transformer (RCCT) for unsupervised domain adaptation, which improves performance by a large margin on various datasets. Specifically, in RCCT, the self-attention operation across image patches is rescheduled by an adaptively learned weighted graph with the core-periphery structure (CP graph), where information exchange between image patches is controlled by the connection strength, i.e., the edge weights of the learned weighted CP graph. In addition, since the data in domain adaptation tasks can be noisy, we intentionally add perturbations to the patches in the latent space so that the learned weighted core-periphery graphs remain robust. Extensive evaluations are conducted on several widely tested UDA benchmarks. Our proposed RCCT consistently outperforms existing works, achieving 88.3% on Office-Home, 95.0% on Office-31, 90.7% on VisDA-2017, and 46.0% on DomainNet.
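As a minimal sketch of the core idea (not the authors' actual RCCT implementation), the snippet below shows self-attention over patch tokens in which information flow between patches is gated by the edge weights of a core-periphery graph. The function name cp_attention, the edgewise rescale-and-renormalize gating scheme, and the random matrix standing in for the adaptively learned CP graph are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def cp_attention(x, Wq, Wk, Wv, cp_weights):
    """Self-attention over patch tokens where communication between patches
    is gated by the edge weights of a core-periphery (CP) graph.
    `cp_weights` is an (n, n) matrix in [0, 1]: strong core-core edges pass
    information freely, weak periphery edges damp it. Rescaling the attention
    weights edgewise and renormalizing is one plausible gating scheme, not
    necessarily the one used in RCCT."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = F.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
    gated = attn * cp_weights                        # damp periphery edges
    gated = gated / gated.sum(dim=-1, keepdim=True)  # renormalize rows
    return gated @ v

# Toy usage: 16 patch tokens of dimension 32; a random matrix stands in
# for the adaptively learned weighted CP graph.
n, d = 16, 32
x = torch.randn(n, d)
Wq, Wk, Wv = (torch.randn(d, d) / d ** 0.5 for _ in range(3))
cp_graph = torch.rand(n, n)
out = cp_attention(x, Wq, Wk, Wv, cp_graph)
print(out.shape)  # torch.Size([16, 32])
```

In the paper's setting, the CP graph itself is learned and the latent patches are additionally perturbed during training to keep the learned graphs robust to noisy data; the sketch omits both of those steps.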