A high-throughput approach for the efficient prediction of perceived similarity of natural objects


Authors

Kaniuth, P.; Mahner, F. P.; Perkuhn, J.; Hebart, M. N.

Abstract

Perceived similarity offers a window into the mental representations underlying our ability to make sense of our visual world, yet the collection of similarity judgments quickly becomes infeasible for larger datasets, limiting their generality. To address this challenge, here we introduce a computational approach that predicts perceived similarity from neural network activations through a set of 49 interpretable dimensions learned on 1.46 million triplet odd-one-out judgments. The approach allowed us to predict separate, independently sampled similarity scores with an accuracy of up to 0.898. Combining this approach with human ratings of the same dimensions led to only small improvements, indicating that the neural network captured much of the human knowledge relevant to this task. Predicting the similarity of highly homogeneous image classes revealed that performance critically depends on the granularity of the training data. Our approach allowed us to improve the brain-behavior correspondence in a large-scale neuroimaging dataset and to visualize candidate image features humans use for making similarity judgments, thus highlighting which image parts carry behaviorally relevant information. Together, our results demonstrate that neural networks can carry information sufficient for capturing broadly sampled similarity scores, offering a pathway towards the automated collection of human similarity judgments for natural images.
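The core idea can be illustrated with a minimal sketch: given a low-dimensional, interpretable embedding for each image (here, 49 dimensions, as would be predicted from neural network activations), pairwise perceived similarity can be modeled as the dot product of embedding vectors, and triplet odd-one-out choices as a softmax over the three pairwise similarities. The example below uses a random stand-in embedding and hypothetical names (embedding, odd_one_out_probs); it assumes the dot-product/softmax formulation common in this line of work and is not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in: a non-negative 49-dimensional embedding per image,
# as would be predicted from neural network activations (n_images x 49).
n_images, n_dims = 100, 49
embedding = rng.random((n_images, n_dims))

# Pairwise perceived similarity modeled as the dot product of embeddings.
similarity = embedding @ embedding.T

def odd_one_out_probs(i, j, k, sim):
    """Softmax choice probabilities for a triplet (i, j, k).

    The item NOT belonging to the most similar pair is the predicted
    odd one out; each item's probability of being chosen as the odd
    one out is driven by the similarity of the remaining two items.
    """
    # Similarity of the pair that excludes each item, ordered (i, j, k).
    pair_sims = np.array([sim[j, k], sim[i, k], sim[i, j]])
    exp = np.exp(pair_sims - pair_sims.max())  # numerically stable softmax
    return exp / exp.sum()

probs = odd_one_out_probs(0, 1, 2, similarity)
print("P(odd one out = image 0, 1, 2):", probs)
```

Aggregating such predicted choice probabilities over many triplets yields the model-based similarity scores that can then be compared against independently collected human judgments.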
