Eye movements as an indicator of pre-articulatory self-monitoring

Previous speech production studies suggest that our viewing behavior is guided by the need for specific information during conceptualization and encoding (cf. Bock et al. 2003). To what extent viewing behavior also reflects information processing during pre-articulatory self-monitoring, however, remains an open question (cf. Griffin 2004). To approach this question, we asked Mandarin Chinese and German participants (N=15 per group) to add a given number of days to a given calendar date (e.g., “You have an appointment in 5 days. Today is ...”). Participants were not allowed to say the result of their calculation aloud; instead, they had to verify whether the date they had calculated matched the calendar date shown on a subsequent visual display (the target date). In half of all trials the target date did not match the correct result of the calculation. Participants gave “yes” or “no” answers. Note that Chinese and German differ in the conventionalized order of day (d), month (m), and year (y) information (German: d-m-y; Chinese: y-m-d). The date given for the calculation task followed the respective conventionalized order in all trials. In addition to this group-inherent factor “conventionalized date format”, we manipulated the format of the target date across two experimental blocks (30 trials each). In the first block, the target date format matched the conventionalized date format of the respective participant group. In the second block, we exchanged the target date formats between groups, yielding a mismatch between the visual sequence of calendar date information and the language-specific conventionalized date format. We measured the sequence of looks to the day, month, and year regions of the target date.
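
To make the trial structure concrete, the following is a minimal illustrative sketch of how such a verification trial could be assembled: a correct date is computed from the cue, half of the trials receive a deviating target, and the target is rendered in the group-specific order (d-m-y for German, y-m-d for Chinese). All names (e.g., make_trial, format_date) and the choice of mismatch offsets are hypothetical assumptions for illustration, not the authors' stimulus-generation code.

```python
from datetime import date, timedelta
import random

def format_date(d: date, convention: str) -> str:
    """Render a date in the group-specific conventionalized order."""
    if convention == "german":        # d-m-y
        return f"{d.day:02d}.{d.month:02d}.{d.year}"
    if convention == "chinese":       # y-m-d
        return f"{d.year}-{d.month:02d}-{d.day:02d}"
    raise ValueError(f"unknown convention: {convention}")

def make_trial(today: date, offset_days: int, convention: str, match: bool):
    """Build one verification trial: cue sentence plus displayed target date."""
    correct = today + timedelta(days=offset_days)
    # In mismatch trials the displayed target deviates from the correct result
    # (offset values here are an assumption for illustration).
    shown = correct if match else correct + timedelta(days=random.choice([-2, -1, 1, 2]))
    cue = (f"You have an appointment in {offset_days} days. "
           f"Today is {format_date(today, convention)}.")
    return cue, format_date(shown, convention), match

cue, target, is_match = make_trial(date(2024, 3, 28), 5, "german", match=False)
print(cue)     # "You have an appointment in 5 days. Today is 28.03.2024."
print(target)  # target date displayed in d-m-y order (here a mismatch)
```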

Eye-tracking data from the first experimental block show a robust left-to-right order of looks in both groups. In the second experimental block, after a short adaptation phase of about six trials, the order of looks changed to a robust right-to-left order. Taken together, our findings from both participant groups under the match and mismatch conditions strongly suggest that eye movements can be guided by a temporary linguistic representation resulting from planning and encoding processes during speech production. We therefore conclude that, in addition to probing conceptualization and formulation processes during language production, eye tracking may also offer new insights into pre-articulatory self-monitoring. Furthermore, the finding that viewing behavior adapts to changes in the spatial location of relevant information underlines that repeated visual search triggered by task demands may be shortened by establishing a task-dependent search pattern.