Quantifying social roles in multi-animal videos using subject-aware deep-learning

This paper is a preprint and has not been certified by peer review.

Goss, K.; Bueno-Junior, L. S.; Stangis, K.; Ardoin, T.; Carmon, H.; Zhou, J.; Satapathy, R.; Baker, I.; Jones-Tinsley, C. E.; Lim, M. M.; Watson, B. O.; Sueur, C.; Ferrario, C. R.; Murphy, G. G.; Ye, B.; Hu, Y.


Analyzing social behaviors is critical for many fields, including neuroscience, psychology, and ecology. While computational tools have been developed to analyze videos of animals engaging in limited social interactions under specific experimental conditions, automated identification of the social roles of freely moving individuals within a multi-animal group remains unresolved. Here we describe a deep-learning-based system, named LabGym2, for identifying and quantifying social roles in multi-animal groups. This system uses a subject-aware approach: it evaluates the behavioral state of every individual in a group of two or more animals while factoring in that individual's social and environmental surroundings. We demonstrate the performance of subject-aware deep learning across species and assays, from partner preference in freely moving insects to primate social interactions in the field. Our subject-aware deep learning approach provides a controllable, interpretable, and efficient framework that enables new experimental paradigms and systematic evaluation of interactive behavior in individuals identified within a group.
