Invariant Representations with Stochastically Quantized Neural Networks

This paper is a preprint and has not been certified by peer review.

Authors

Mattia Cerrato, Marius Köppel, Roberto Esposito, Stefan Kramer

Abstract

Representation learning algorithms offer the opportunity to learn invariant representations of the input data with respect to nuisance factors. Many authors have leveraged such strategies to learn fair representations, i.e., vectors from which information about sensitive attributes has been removed. These methods are attractive as they may be interpreted as minimizing the mutual information between a neural layer's activations and a sensitive attribute. However, the theoretical grounding of such methods relies either on the computation of infinitely accurate adversaries or on minimizing a variational upper bound of a mutual information estimate. In this paper, we propose a methodology for the direct computation of the mutual information between a neural layer and a sensitive attribute. We employ stochastically activated binary neural networks, which let us treat neurons as random variables. We are then able to compute (not bound) the mutual information between a layer and a sensitive attribute and to use this quantity as a regularization factor during gradient descent. We show that this method compares favorably with the state of the art in fair representation learning and that the learned representations display a higher level of invariance compared to full-precision neural networks.
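To make the regularizer concrete, below is a minimal PyTorch sketch of how one might compute the exact mutual information I(Z; S) between a small layer of independent stochastic binary neurons Z and a binary sensitive attribute S over a mini-batch. This is an illustration under our own assumptions (in particular, that the layer is small enough to enumerate all 2^k binary states); the function name layer_mi and every implementation detail are ours, not code from the paper.

    import itertools
    import torch

    def layer_mi(probs, s):
        # probs: (batch, k) Bernoulli firing probabilities p(z_i = 1 | x),
        #        e.g. torch.sigmoid of the layer's pre-activations.
        # s:     (batch,) binary sensitive attribute for each example.
        batch, k = probs.shape
        # Enumerate every joint binary state of the layer (k must be small).
        states = torch.tensor(list(itertools.product([0, 1], repeat=k)),
                              dtype=probs.dtype)                  # (2^k, k)
        # p(z | x) for every state, assuming neurons fire independently.
        p = probs.unsqueeze(1)                                    # (batch, 1, k)
        pz_given_x = torch.prod(p * states + (1 - p) * (1 - states), dim=2)
        pz = pz_given_x.mean(dim=0)                               # marginal p(z)
        mi = torch.zeros(())
        for sv in (0, 1):
            mask = (s == sv)
            ps = mask.float().mean()                              # p(S = sv)
            pz_s = pz_given_x[mask].mean(dim=0)                   # p(z | S = sv)
            mi = mi + ps * torch.sum(
                pz_s * torch.log((pz_s + 1e-12) / (pz + 1e-12)))
        return mi  # I(Z; S) = E_s KL(p(z|s) || p(z)), differentiable in probs

During training one would then minimize task_loss + lambda * layer_mi(probs, s), so that gradient descent trades predictive accuracy against invariance to S.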

2 comments

scicastboard

Dear Dr. Köppel -- Thank you for sharing your interesting work. Quick question on your paper from our Board (by a non-expert in ML, but an expert in quantum physics): what does the word "quantized" actually mean in this context? Does it refer just to reducing the precision of the network (some sort of discretization of a neuron's input/output), or does it actually have some relation to concepts in quantum physics? We would be very curious to know, as a large part of our audience are experts in quantum physics. Thank you,
ScienceCast Board

mkoeppel

Hi Scicastboard,

the word "quantized" has to be understood as reducing precision of the network. So it follows the definition of quantization in signal processing.

However, as we use stochastic quantized neurons one can certainly draw some relation to concepts in quantum physics. In the paper we did not further explore this direction since we only focused on the signal processing definition.
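Concretely, the stochastic quantization of a single neuron can be sketched as follows (a minimal PyTorch illustration under our own naming, not code from the paper):

    import torch

    def stochastic_binarize(pre_activation):
        # Squash the real-valued pre-activation to a firing probability,
        # then sample a single bit: each neuron becomes a Bernoulli
        # random variable. This is "quantization" in the signal-processing
        # sense of precision reduction; nothing quantum-mechanical.
        p = torch.sigmoid(pre_activation)   # probability in (0, 1)
        return torch.bernoulli(p)           # binary activation in {0, 1}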
