CRCNS: Neural and computational mechanisms underlying natural viewing behavior
openNEI - National Eye Institute
Primates—including humans and macaques—make rapid, instinctive eye movements to explore the
visual world, prioritize information, and guide future actions. This preferential viewing behavior is essential
for perception, social interaction, and decision-making, yet the underlying neural and computational
mechanisms remain poorly understood. This is because as a natural behavior, preferential viewing
comprise multiple concepts such as saliency, recognition, memory, and motor planning, which are often
studied separately. Here, we propose to study preferential viewing as an integrated behavior using
behavioral assays in macaques, large-scale neurophysiology, closed-loop image optimization with
generative networks, and computational modeling. Our central hypothesis is that visual preferences in
primates are shaped by local image features—rather than full object recognition—and are supported by
activity in ventral cortical areas and the thalamus. In Aim 1, we will determine whether preferential viewing
is guided by local features or global object configurations using synthesized and modified images. In Aim
2, we will identify the neural representations that predict and drive preferential viewing by recording from
visual cortical areas V4, IT, and the pulvinar. In Aim 3, we will develop deep learning models trained to
reproduce both gaze behavior and neuronal response patterns, with a focus on generalization to social
and ecological stimuli. This project will define the neuronal and algorithmic basis of visual prioritization,
advancing our understanding of primate vision and informing the next generation of biologically inspired
AI. Insights from this work may lead to new applications in machine vision, social robotics, and
computational psychiatry.
RELEVANCE:
This project seeks to understand how primates prioritize what they look at in complex visual scenes. By
combining brain recordings, behavioral testing, and deep learning models, we are working to define the
neural and computational rules that guide natural eye movements. The findings could help improve
artificial intelligence systems and inform tools for detecting social processing deficits in conditions like
autism.
Up to $401K