Sensory-memory interactions via modular structure explain errors in visual working memory

Authors

Yang, J.; Zhang, H.; Lim, S.

Abstract

Errors in stimulus estimation reveal how stimulus representations change during cognitive processing. Repulsive bias and minimum variance near the cardinal axes are well-known error patterns typically associated with visual orientation perception. Recent experiments, however, suggest that these errors continue to evolve during working memory. Here, we demonstrate that a network reproduces the correct error patterns in delayed estimation only when it fulfills two distinct functions: efficient sensory encoding and memory maintenance. No single-module network can satisfy both demands simultaneously; instead, interaction between sensory and memory modules is essential. The sensory module exhibits heterogeneous tuning with strong inhibitory modulation that reflects natural orientation statistics. While the memory module alone supports a homogeneous memory representation via continuous attractor dynamics, the full sensory-memory network forms discrete attractors, yet with moderate drift speed and nonuniform noise that reproduce the observed variance patterns. Together, our work underscores the significance of sensory-memory interaction in continuously shaping stimulus representations during working memory.
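
The abstract describes two delay-period error patterns: repulsive bias away from the cardinal axes and minimum variance near them, both evolving over time in memory. The sketch below is a purely illustrative one-dimensional drift-diffusion reduction of such delay dynamics, not the paper's sensory-memory network model. The drift term, the orientation-dependent noise profile, and all parameter values (A_DRIFT, SIGMA_MIN, SIGMA_MAX, DT, T_DELAY, N_TRIALS) are hypothetical choices made only to make the two patterns visible.

```python
import numpy as np

# --- hypothetical parameters, for illustration only (not from the paper) ---
A_DRIFT = 1.0                     # deg/s, strength of repulsion from cardinal axes
SIGMA_MIN, SIGMA_MAX = 1.0, 3.0   # deg/sqrt(s), noise near cardinals vs. obliques
DT, T_DELAY = 0.01, 6.0           # Euler-Maruyama step and delay duration (s)
N_TRIALS = 2000                   # simulated trials per probed orientation
rng = np.random.default_rng(0)

def drift(theta):
    """Repulsive drift away from the cardinal axes (0 and 90 deg);
    the stable points of this toy drift sit at the obliques (45, 135 deg)."""
    return A_DRIFT * np.sin(np.deg2rad(4.0 * theta))

def noise_sd(theta):
    """Orientation-dependent noise amplitude, smallest near the cardinals."""
    return SIGMA_MIN + (SIGMA_MAX - SIGMA_MIN) * np.sin(np.deg2rad(2.0 * theta)) ** 2

stimuli = np.arange(0.0, 180.0, 5.0)   # probed orientations (deg)
n_steps = int(T_DELAY / DT)

for s in stimuli:
    theta = np.full(N_TRIALS, s)
    for _ in range(n_steps):           # Euler-Maruyama integration over the delay
        theta += drift(theta) * DT \
                 + noise_sd(theta) * np.sqrt(DT) * rng.standard_normal(N_TRIALS)
        theta %= 180.0                 # orientation is periodic with period 180 deg
    # Signed estimation error wrapped to (-90, 90]; errors stay small enough here
    # that plain (non-circular) mean and std are adequate for illustration.
    err = (theta - s + 90.0) % 180.0 - 90.0
    print(f"stim {s:6.1f} deg  bias {err.mean():+6.2f} deg  sd {err.std():5.2f} deg")
```

Running the sketch prints a bias that is near zero at 0, 45, 90, and 135 deg and pushes estimates away from the cardinals in between, together with a standard deviation that is lowest near the cardinals and grows toward the obliques, qualitatively matching the error patterns the abstract attributes to the interacting sensory and memory modules.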
