Language and Visual Processing

The Language and Visual Processing group focuses on the interrelation between the visual and the linguistic domain. What can eye movements tell us about linguistic processing? Does language modulate the way we perceive visual information? If so, how? How do manipulations of a visual stimulus surface in linguistic representations? How is attention shared between linguistic and visual processing?

Top-down influences on event apprehension

Little is known about apprehension, the earliest stage of information processing in elicited language production studies using pictorial stimuli. For example: (1) Is the process influenced by the specific type of information that a speaker needs as a starting point for the preverbal message? (2) Can the process adapt to the specific time constraints of a task? To broaden our understanding of apprehension, we analyze landing positions and onset latencies of first fixations on visual stimuli presented for short durations, on the assumption that the first fixation directly results from information processing during apprehension.


VIPICOL - Visual Information Processing In the Context Of Language

The details of how language shapes visual behavior have mainly been studied using eye tracking. However, this approach faces two severe problems: (1) visual information uptake is possible without directly fixating the region that contains the relevant information (parafoveal processing), and (2) fixations reflect scene comprehension, object identification, and information retrieval from the mental lexicon alike, which makes it challenging to relate fixation patterns to specific phases of the verbalization process. The aim of this project is to develop and evaluate an experimental procedure that makes it possible to tackle these problems.


Tracking gaze movement while construing and talking about events: a cross-linguistic approach

In this project, we investigate the interrelation between gaze-movement patterns while watching dynamic video clips and what is mentioned at which point when talking about events.


Visual attention as a window to cognitive processing – A new method to analyze eye-tracking data elicited from dynamic scenes

The goal of this project is to develop a new method for analyzing eye-tracking data that can be used to investigate how people allocate attention when presented with dynamic stimuli.


Eye movements as an indicator of pre-articulatory self-monitoring

Previous speech production studies suggest that our viewing behavior is guided by the need for specific information during conceptualization and encoding processes. To what extent viewing behavior also reflects information processing during self-monitoring, however, remains an open question.


Projects associated with this group