TY - JOUR
T1 - It depends on how you look at it: Scanpath comparison in multiple dimensions with MultiMatch, a vector-based approach
AU - Dewhurst, Richard
AU - Nyström, Marcus
AU - Jarodzka, Halszka
AU - Foulsham, Tom
AU - Johansson, Roger
AU - Holmqvist, Kenneth
PY - 2012
Y1 - 2012
N2 - Eye movement sequences, or scanpaths, vary depending on stimulus characteristics and task (Foulsham & Underwood, 2008; Land, Mennie, & Rusted, 1999). Common methods for comparing scanpaths, however, are limited in their ability to capture both the spatial and temporal properties of which a scanpath consists. Here we validate a new method for scanpath comparison based on geometric vectors, which compares scanpaths over multiple dimensions while retaining positional and sequential information (Jarodzka, Holmqvist, & Nyström, 2010). "MultiMatch" was tested in two experiments and pitted against ScanMatch (Cristino, Mathôt, Theeuwes, & Gilchrist, 2010), the most comprehensive adaptation of the popular Levenshtein method. Experiment 1 used synthetic data, demonstrating the greater sensitivity of MultiMatch to variations in spatial position. In Experiment 2, real eye movement recordings were taken from participants viewing sequences of dots, designed to elicit scanpath pairs with commonalities known to be problematic for algorithms (for example, when one scanpath is shifted in locus, or when fixations fall on either side of an AOI boundary). The results illustrate the advantages of a multidimensional approach, revealing how two scanpaths differ. For instance, if one scanpath is the reverse copy of another, the difference is in direction but not in the position of fixations; if a scanpath is scaled down, the difference is in the length of the saccadic vectors but not in overall shape. As well as having enormous potential for any task in which consistency in eye movements is important (e.g., learning), MultiMatch is particularly relevant for "eye movements to nothing" in mental imagery research and the embodiment of cognition, where satisfactory scanpath comparison algorithms are lacking.
AB - Eye movement sequences, or scanpaths, vary depending on stimulus characteristics and task (Foulsham & Underwood, 2008; Land, Mennie, & Rusted, 1999). Common methods for comparing scanpaths, however, are limited in their ability to capture both the spatial and temporal properties of which a scanpath consists. Here we validate a new method for scanpath comparison based on geometric vectors, which compares scanpaths over multiple dimensions while retaining positional and sequential information (Jarodzka, Holmqvist, & Nyström, 2010). "MultiMatch" was tested in two experiments and pitted against ScanMatch (Cristino, Mathôt, Theeuwes, & Gilchrist, 2010), the most comprehensive adaptation of the popular Levenshtein method. Experiment 1 used synthetic data, demonstrating the greater sensitivity of MultiMatch to variations in spatial position. In Experiment 2, real eye movement recordings were taken from participants viewing sequences of dots, designed to elicit scanpath pairs with commonalities known to be problematic for algorithms (for example, when one scanpath is shifted in locus, or when fixations fall on either side of an AOI boundary). The results illustrate the advantages of a multidimensional approach, revealing how two scanpaths differ. For instance, if one scanpath is the reverse copy of another, the difference is in direction but not in the position of fixations; if a scanpath is scaled down, the difference is in the length of the saccadic vectors but not in overall shape. As well as having enormous potential for any task in which consistency in eye movements is important (e.g., learning), MultiMatch is particularly relevant for "eye movements to nothing" in mental imagery research and the embodiment of cognition, where satisfactory scanpath comparison algorithms are lacking.
KW - Vectors
KW - Scanpaths
KW - MultiMatch
KW - ScanMatch
U2 - 10.3758/s13428-012-0212-2
DO - 10.3758/s13428-012-0212-2
M3 - Article
C2 - 22648695
SN - 1554-3528
JO - Behavior Research Methods
JF - Behavior Research Methods
ER -
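
For readers who want a concrete feel for the vector-based, multidimensional comparison described in the abstract, the following is a minimal Python sketch. It is not the published MultiMatch implementation (which also simplifies scanpaths, aligns them via a shortest-path step, includes a fixation-duration dimension, and normalises each difference to a similarity score); the function names and example coordinates are illustrative assumptions. It only shows how representing a scanpath as saccade vectors lets shape, length, direction, and fixation position be compared separately, so that, for example, a scanpath shifted in locus differs on the position dimension alone.

import math

def saccade_vectors(scanpath):
    """Convert a list of (x, y) fixation positions into saccade vectors (dx, dy)."""
    return [(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(scanpath, scanpath[1:])]

def compare_scanpaths(sp1, sp2):
    """Average per-dimension differences for two scanpaths of equal length.

    Illustrative sketch only: no scanpath simplification, alignment, or
    normalisation to [0, 1] similarity scores as in the published method.
    """
    v1, v2 = saccade_vectors(sp1), saccade_vectors(sp2)
    n = len(v1)

    # Shape: Euclidean distance between corresponding saccade vectors.
    shape = sum(math.hypot(ax - bx, ay - by)
                for (ax, ay), (bx, by) in zip(v1, v2)) / n

    # Length: absolute difference in saccade amplitude.
    length = sum(abs(math.hypot(ax, ay) - math.hypot(bx, by))
                 for (ax, ay), (bx, by) in zip(v1, v2)) / n

    # Direction: smallest angular difference between saccade directions.
    def ang_diff(a, b):
        d = abs(math.atan2(a[1], a[0]) - math.atan2(b[1], b[0])) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    direction = sum(ang_diff(a, b) for a, b in zip(v1, v2)) / n

    # Position: Euclidean distance between corresponding fixations.
    position = sum(math.hypot(px - qx, py - qy)
                   for (px, py), (qx, qy) in zip(sp1, sp2)) / len(sp1)

    return {"shape": shape, "length": length,
            "direction": direction, "position": position}

if __name__ == "__main__":
    # A scanpath and a copy shifted in locus: the saccade vectors are
    # unchanged, so the shape, length, and direction differences are zero
    # and only the position dimension reflects the offset.
    original = [(100, 100), (400, 150), (250, 500)]
    shifted = [(x + 50, y + 30) for x, y in original]
    print(compare_scanpaths(original, shifted))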