Summary: A new study reveals the precise moment the brain detects gaze direction, improving our understanding of social interactions and disorders like autism and Alzheimer’s disease. The researchers used EEG and machine learning to analyze brain activity as participants looked at avatars with different head and eye directions.
They discovered that the brain processes head orientation before eye direction, with task context influencing the accuracy of gaze perception. This breakthrough could contribute to the early diagnosis and treatment of autism and Alzheimer’s disease.
Highlights:
- Hierarchical processing: The brain first detects head orientation at 20 ms, then eye direction at 140 ms.
- Influence of the task: Accuracy in detecting gaze direction improves when attention is focused on gaze.
- Diagnostic potential: The findings could help in the early diagnosis and treatment of autism and Alzheimer’s disease.
Source: University of Geneva
Eye gaze plays a central role in everyday social interactions. Instant communication relies on the brain’s ability to detect and interpret the direction of others’ gaze. How does our brain detect gaze direction, and what factors influence this process?
In a recent study published in the journal NeuroImage, a team from the University of Geneva (UNIGE) has succeeded in determining with unprecedented precision the exact moment at which the direction of gaze is detected.
These results significantly improve our understanding of autism spectrum disorders and could offer therapeutic prospects for people affected by Alzheimer’s disease.
Human faces are the most common and constant visual stimuli we encounter from the moment we are born. Our brains have developed the expertise to memorize and recognize faces, as well as to interpret the messages they convey. For example, direct gaze signals a desire for social interaction, while avoiding eye contact conveys the opposite message. But how quickly can our brains understand the gaze of others?
This topic has been the subject of much research. However, existing publications mainly focus on the study of the eye region in isolation, neglecting other factors such as head orientation.
Brain analysis of gaze
A team from UNIGE presented study participants with 3D avatars, each with different head and gaze directions. In the first task, volunteers had to indicate the head orientation, while in the second task, they had to identify the eye direction.
By analyzing brain activity using an electroencephalogram, the research team discovered that these two processes can be reliably decoded independently of each other.
“The experiment also demonstrates a certain hierarchy in the processing of these two pieces of information. The brain first perceives the most global visual cues, i.e. the orientation of the head, from 20 milliseconds, before focusing on more local information, i.e. the eyes, from 140 milliseconds.
“This hierarchical organization then makes it possible to integrate information on the eye region and the orientation of the head, to ensure an accurate and efficient judgment of the direction of gaze,” explains Domilė Tautvydaitė, postdoctoral researcher and associate researcher at UNIGE, Faculty of Psychology and Educational Sciences, and first author of the study.
The study also shows that decoding gaze direction was significantly more accurate when participants were specifically asked to pay attention to the gaze of the faces presented. This means that the context of the task influences the perception and understanding of gaze.
“In everyday life, these results show that when people are actively engaged in a ‘social mode’, they recognise other people’s intentions better and more quickly,” explains Nicolas Burra, lecturer at the Faculty of Psychology and Educational Sciences and director of the Experimental Social Cognition Laboratory (ESClab) at UNIGE, who led this research.
A cutting-edge method
The method used provides extremely accurate results for both mechanisms. By combining electroencephalography (EEG) recordings of neural activity with machine learning techniques, the research team was able to decode gaze and head direction from brain signals before participants were even consciously aware of them.
“This method represents a significant technical innovation in the field, allowing a much more precise analysis than was previously possible,” adds Nicolas Burra.
In people with autism spectrum disorders, the decoding of this information may be impaired and eye contact is often avoided. The same applies in Alzheimer’s disease, where, as the disease progresses, memory difficulties impoverish the person’s relationships with others and often lead to social withdrawal. Understanding the neural mechanisms of gaze direction detection is therefore essential.
The results of the study and the method used provide a concrete contribution to the early diagnosis of autism spectrum disorders in children. In Alzheimer’s disease, one of the most striking symptoms as the disease progresses is the inability to recognize faces, even those of family members.
This study thus paves the way for a better understanding of the neural mechanisms linked to the decrease in social interactions and facial memory, a subject currently studied by Dr. Tautvydaitė at McGill University in Canada. Research at UNIGE’s ESClab laboratory will continue in this area by analyzing these processes during real social interactions.
About this news on visual neuroscience research
Author: Antoine Guenot
Source: University of Geneva
Contact: Antoine Guenot – University of Geneva
Picture: Image credited to Neuroscience News
Original research: Free access.
“Timing of gaze direction perception: ERP decoding and task modulation” by Nicolas Burra et al. NeuroImage
Abstract
Timing of gaze direction perception: ERP decoding and task modulation
Distinguishing another person’s gaze direction is extremely important in everyday social interaction, as it provides crucial information about people’s attention and, therefore, their intentions.
The temporal dynamics of gaze processing have been studied using event-related potentials (ERPs) recorded by electroencephalography (EEG).
However, the timing of when our brain distinguishes gaze direction (GD), independently of other facial cues, remains unclear. To address this question, the present study aimed to investigate the time course of gaze direction processing, using an ERP decoding approach, based on the combination of a support vector machine and error-correcting output codes.
We recorded EEG from healthy young subjects, 32 of whom performed GD detection tasks and 34 of whom performed face orientation tasks. Both tasks presented realistic 3D faces with five head and gaze orientations each: 30° or 15° to the left or right, and 0° (direct).
While classical ERP analyses did not show clear GD effects, ERP decoding analyses revealed that GD discrimination, regardless of head orientation, began at 140 ms in the GD task and at 120 ms in the face orientation task. GD decoding accuracy was higher in the GD task than in the face orientation task and was highest for direct gaze in both tasks.
These results suggest that decoding of brain patterns is modified by task relevance, which changes the latency and accuracy of GD decoding.