GenQuery: Supporting Expressive Visual Search with Generative Models

Kihoon Son, DaEun Choi, Tae Soo Kim, Young-Ho Kim, Juho Kim


Designers rely on visual search to explore and develop ideas in early design stages. However, designers can struggle to identify suitable text queries to initiate a search or to discover images for similarity-based search that can adequately express their intent. We propose GenQuery, a novel system that integrates generative models into the visual search process. GenQuery can automatically elaborate on users' queries and surface concrete search directions when users only have abstract ideas. To support precise expression of search intents, the system enables users to generatively modify images and use these in similarity-based search. In a comparative user study (N=16), designers felt that they could more accurately express their intents and find more satisfactory outcomes with GenQuery compared to a tool without generative features. Furthermore, the unpredictability of generations allowed participants to uncover more diverse outcomes. By supporting both convergence and divergence, GenQuery led to a more creative experience.
