Kinematics of Visually-Guided Eye Movements

One of the hallmarks of an eye movement that follows Listing's law is the half-angle rule, which says that the angular velocity of the eye tilts by half the angle of eccentricity of the line of sight relative to primary eye position. Since all visually-guided eye movements in the regime of far viewing follow Listing's law (with the head still and upright), the question of its origin is of considerable importance. Here we provide theoretical and experimental evidence that Listing's law results from a unique motor strategy that minimizes ocular torsion while smoothly tracking objects of interest along any path in visual space. The strategy consists in compounding conventional ocular rotations in meridian planes, that is, in horizontal, vertical and oblique directions (which are all torsion-free), with small linear displacements of the eye in the frontal plane. Such compound rotation-displacements of the eye can explain the kinematic paradox that the fixation point may rotate in one plane while the eye rotates in other planes. Their unique signature is the half-angle law in the position domain, which means that the rotation plane of the eye tilts by half the angle of gaze eccentricity. We show that this law does not readily generalize to the velocity domain of visually-guided eye movements because the angular eye velocity is the sum of two terms, one associated with rotations in meridian planes and one associated with displacements of the eye in the frontal plane. While the first term does not depend on eye position, the second term does. We show that compound rotation-displacements perfectly predict the average smooth kinematics of the eye during steady-state pursuit in both the position and the velocity domain.


Introduction
Tracking the motion of a small object across a structured visual world challenges the constancy of spatial orientation because of the visual consequences induced by the eye movements. Since visually-guided eye movements have an intricate eye-position-dependent kinematics, the optic flow induced by tracking eye movements depends in a complex way on the location and geometry of the target trajectory in the visual field. In the simplest case a distant object may move in a plane that happens to include the observer's line of sight to the fixated target. The brain may then compensate for the movement-induced optic flow by simple image translation [1]. However, in many other situations the movement-induced optic flow is likely to be highly nonlinear, making a simple image translation impractical. According to H. v. Helmholtz, perceptual stability is achieved by an estimation process of the visual consequences based on efference copy signals derived from the motor commands to the eye muscles [2]. This suggestion presupposes that the brain can efficiently estimate the three-dimensional kinematic consequences of the motor commands that generate the desired tracking motion of the eye. Although far vision is two-dimensional, keeping visuospatial orientation stable requires being informed not only about the current gaze displacement but also about how much the peripheral retina rotates about the line of sight. Since to date there is not enough information about the geometric relationship between motor commands and three-dimensional ocular kinematics during smooth tracking of an object of interest, our understanding of the interactions between retinal and extraretinal signals remains necessarily limited. A major goal of this study is to bridge this gap starting from basic motor principles. There are two basic low-level mechanisms that constrain the kinematics of all visually-guided eye movements.
One mechanism is Donders' law, which asserts that the eye, while the head is held still, always assumes the same orientation for every fixation direction, independent of the preceding eye movement [3]. The other mechanism is Listing's law, which implies that the eye can only assume certain specific orientations relative to the head [2,4-6]. To reach those orientations the eye must rotate in planes that define, by way of intersection, a particular single direction in visual space, which has been called primary direction. This direction is distinguished by the unique property that any other direction in the visual field of fixations can be reached by a single rotation of the eye in the plane spanned by primary direction and the new desired direction. Despite its theoretical importance, the notion of primary eye position and direction defies a more operational definition. Although there exist recursive procedures based on evaluating eye positions relative to a fixed reference position in far vision, while keeping the head upright and still [2,4,7], its neurophysiological significance in basic oculomotor research remains elusive. Since we rarely move the eyes with the head and body still, the issue of how these basic mechanisms are embedded in the larger context of head-free motor behavior has been intensively studied. While the head contributes to gaze movements, a major factor complicating the analysis of the basic role of Donders' and Listing's law in eye position control is the intricate interaction of visual and vestibular signals [8,9]. Since the general relationship between eye position and the position-dependent angular eye velocity signals related to Donders' and Listing's law is poorly understood, interactions with vestibular and other signals in the nested eye-head motor control system are difficult to discern.
It appears therefore indispensable to analyze this relationship more closely in order to be able to segregate visual from vestibular and other effects in terms of the overall angular eye velocity in neurophysiological studies. In the current oculomotor literature it is often tacitly assumed that the angular velocity of a Listing-motion of the eye tilts by half the angle of gaze eccentricity with respect to straight ahead, although this is guaranteed only for fixed-axis rotations of the eye. The following analysis focuses on smooth tracking eye movements, which stand out as one of the prime examples of a Listing-motion [10-14]. Since visual targets can rarely be tracked by a single-axis rotation, it is still a mystery how such eye movements are generated within the constraints of Listing's law. Here we propose a generic rotation algorithm based on the principle of minimizing ocular torsion. It generates smooth Listing-motions of the eye by operating linearly on the orientation of the line of sight for small rotation angles. Based on this algorithm we analyze the relationship between angular eye position and velocity of a general Listing-motion. Since such a generic algorithm has not been available to date, a Listing-motion of the eye has traditionally been conceived as a series of compounded rotations, also called virtual rotations, to and from primary position between each fixation [2,15] (Fig. 1). Taken as a motor control strategy during smooth tracking movements, such an algorithm not only implies a considerable amount of computation for every single instant of ocular motion but also lacks plausibility as a time-critical strategy for target tracking. Besides combined eye-head gaze shifts, smooth tracking movements with the head still or moving are typically non-fixed-axis rotations because the interesting target can rarely be smoothly tracked otherwise [16-18].
To test the predictions of our mathematical analysis of the characteristics of a general Listing-motion, we therefore used three-dimensional eye movements that had been recorded earlier in non-human primates during linear and curvilinear smooth pursuit [11,18].

A Rotation Operator that Generates Listing-motions of the Eye
We show that it is possible to replace the virtual rotations illustrated in Fig. 1 by two explicitly defined single rotation operators. First we define a compound rotation operator R_CF(ρ, φ) := R_C(ρ) R_F(φ), consisting of a first rotation of the eye through φ in the head's frontal plane followed by a rotation through ρ in the eye's coronal plane, together with the requirement that the rotation angles fulfill the relation ρ = −φ (Fig. 2). In contrast to compound rotations obtained by composing rotations in mutually orthogonal planes according to Euler, the rotation planes used to construct the compound rotation R_CF are not orthogonal to each other. In the following we show that this operator generates a Listing-motion by acting linearly on the line of sight for small rotation angles.
We define the direction of the line of sight by the unit gaze vector ĝ = Σ_{i=1..3} g_i ê_i, with coefficients g_1 = cos ε, g_2 = −sin ε sin ψ and g_3 = sin ε cos ψ, using the spherical polar coordinates ε and ψ (Figs. 1 and 2). To take advantage of the Clifford algebra of rotations, we introduce the basis vectors ĉ_i (i = 1, 2, 3), which are defined by the properties (ĉ_i)² = I (identity) and ĉ_j ĉ_k + ĉ_k ĉ_j = 2δ_jk I, with δ_jk = 1 for j = k and δ_jk = 0 if j ≠ k (for more details, see Text S1). In this basis, the unit gaze vector is represented by the 1-vector ĝ = Σ_{i=1..3} g_i ĉ_i, using the same coefficients g_i as in Euclidean space. Furthermore, the frontal, sagittal and horizontal planes are represented by the three 2-vectors ĉ_23 := ĉ_2 ĉ_3, ĉ_31 := ĉ_3 ĉ_1 and ĉ_12 := ĉ_1 ĉ_2, respectively. In the following we abbreviate linear combinations of these three basic 2-vectors by Σ_i a_i ĉ_jk := a_1 ĉ_23 + a_2 ĉ_31 + a_3 ĉ_12. The coronal plane of the eye is represented by the 2-vector ĉ_εψ := ĉ_ε ĉ_ψ, where ĉ_ε = ∂ĝ/∂ε = Σ_i (∂g_i/∂ε) ĉ_i and ĉ_ψ = ∂ĝ/∂ψ = Σ_i (∂g_i/∂ψ) ĉ_i. We now explicitly define the rotation operator R_DL := R_CF(ρ, φ)|_{ρ = −φ} (Fig. 2): R_DL = (I cos φ/2 + sin φ/2 ĉ_εψ)(I cos φ/2 − sin φ/2 ĉ_23), with ĉ_εψ = Σ_i g_i ĉ_jk = ĉ_23 cos ε − sin ε (ĉ_31 sin ψ − ĉ_12 cos ψ), where ρ is the rotation angle in the eye's coronal plane (given by ĉ_εψ), φ is the rotation angle in the frontal plane (given by ĉ_23) and ε is the eccentricity of the line of sight. In the direction straight ahead, we have ĉ_εψ|_{ε=0} = ĉ_23 and thus R_DL = I. Note that the condition ρ = −φ implies Donders' law because it reduces the dimension of the manifold of ocular rotations from three to two. Since the torsion of visually-controlled eye movements is typically small, we can expand the operator R_DL up to terms linear in φ using the approximations cos φ ≈ 1 and sin φ ≈ φ.
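The defining relations of the basis vectors ĉ_i can be checked with a concrete matrix realization: the Pauli matrices satisfy exactly the properties stated above. This is a sketch for illustration only (the realization chosen here is ours; Text S1 may use a different one):

```python
import numpy as np

# Pauli matrices as a concrete realization of basis vectors c_i with
# (c_i)^2 = I and c_j c_k + c_k c_j = 2 d_jk I.
c1 = np.array([[0, 1], [1, 0]], dtype=complex)
c2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
c3 = np.array([[1, 0], [0, -1]], dtype=complex)
c = [c1, c2, c3]
I2 = np.eye(2)

# Verify the anticommutation relations.
for j in range(3):
    for k in range(3):
        anti = c[j] @ c[k] + c[k] @ c[j]
        assert np.allclose(anti, 2 * I2 if j == k else 0 * I2)

# 2-vectors such as c_23 := c_2 c_3 square to -I, which is what makes
# I cos(phi/2) + sin(phi/2) c_23 a rotation operator, as in R_DL above.
c23 = c2 @ c3
assert np.allclose(c23 @ c23, -I2)
```

Because ĉ_23² = −I, the exponential series of (φ/2) ĉ_23 closes into the cosine/sine form used in the definition of R_DL.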
The resulting simplification renders the following calculations feasible and proves a posteriori to be sufficient for characterizing smooth visually-guided eye movements. We will refer to the infinitesimal form of R_DL specifically as the Donders-Listing operator dR_DL. This linear operator functions as a generator of torsion-free rotations, i.e. motions that preserve Listing's law in the oculomotor range up to second-order corrections in ε and φ (for a proof, see Text S2). In fact dR_DL mediates a rotation of the line of sight in the head's frontal plane ĉ_23 by rotating the eye in the tilted plane ĉ_DL. Indeed, under the action of dR_DL the gaze vector ĝ rotates from its current position through the angle φ to the new position ĝ′ (see motion of the unit gaze vector from ĝ ∼ OA to ĝ′ ∼ OB in Fig. 3A). The approximation on the right side follows from the relation sin ε/2 (ĉ_DL ĝ − ĝ ĉ_DL) ∝ ∂ĝ/∂ψ and from observing that φ² sin² ε/2 ĉ_DL ĝ ĉ_DL represents only a small second-order contribution. In contrast, the rotation plane ĉ_DL of the eye ball tilts by half the line of sight's eccentricity, that is, through the angle ε/2. Thus the eigenvalue of dR_DL approximates the eigenvalue of a proper rotation up to corrections quadratic in φ and ε/2, noting that φ ≪ 1 and ε/2 ≪ 1. Also note that at each gaze position the angle subtended by the rotation planes ĉ_εψ and ĉ_DL is τ_ε = cos⁻¹⟨ĉ_DL, ĉ_εψ⟩ ≈ π/2 − ε/2, independent of the actual orientation of the eye.
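The half-angle tilt of the rotation plane under a single Donders-Listing step can be illustrated numerically. The sketch below uses unit quaternions in place of the Clifford rotation operators (an equivalent representation of 3-D rotations); the helper names and the choice ε = 30°, φ = 0.1° are ours:

```python
import numpy as np

# Quaternion helpers, convention (w, x, y, z); names are ours.
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qaxis(axis, angle):
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle/2.0)], np.sin(angle/2.0) * axis))

def qrot(q, v):
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, np.concatenate(([0.0], v))), qc)[1:]

e1 = np.array([1.0, 0.0, 0.0])                  # straight ahead
eps = np.radians(30.0)                          # gaze eccentricity
phi = np.radians(0.1)                           # small frontal-plane angle
g = np.array([np.cos(eps), 0.0, np.sin(eps)])   # gaze at meridian psi = 0

# Donders-Listing step: roll through phi about straight ahead (frontal
# plane), then counter-roll through -phi about the displaced gaze line.
q_F = qaxis(e1, phi)
q_step = qmul(qaxis(qrot(q_F, g), -phi), q_F)

# Tilt of the net step's rotation plane out of the frontal plane:
n = q_step[1:] / np.linalg.norm(q_step[1:])
tilt = np.degrees(np.arcsin(abs(n[0])))
print(tilt)                                     # ~ eps/2 = 15 deg
```

The gaze line still moves through φ within the frontal plane, while the eye itself rotates in a plane tilted by ε/2, as stated above.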
Fig. 1 (legend): To track a target that happens to move along a direction-circle (T: left panel), eye position signals holding the eye in the required plane of rotation could theoretically be derived by the following procedure (Helmholtz 1867). To obtain a smooth motion from A to B along the associated direction-circle (white circle through pupil, T, and F), four virtual rotations must be compounded: a first rotation in the sagittal plane NAO through an angle γ subtending the arc AO, abbreviated R_OA(γ); a second rotation in the frontal plane LNR through φ subtending the arc NN′, abbreviated R_F(φ); a third rotation in the meridian plane OBN′ through γ′ subtending the arc OB, abbreviated R_BO(γ′); and finally a fourth rotation in the eye's coronal plane through −φ, abbreviated R_C(−φ), to eliminate the acquired torsion (ĝ_A and ĝ_B denote the unit gaze vectors parallel to EA and EB, respectively).

To generate a general Listing-motion, the Donders-Listing operator must be combined with an operator mediating meridian rotations, that is, horizontal, vertical and oblique rotations, whereby the two operators act in mutually orthogonal planes. The rotation plane of the meridian operator is defined by ĉ_M := ĉ_r ĉ_ε, with ĉ_r = ∂ĝ/∂r ∝ ĝ and ĉ_ε = ∂ĝ/∂ε. The meridian rotation operator R_M moves the unit gaze vector from position ĝ_{i−1} to position ĝ_i. If the rotation in the meridian plane is small, R_M can be approximated by its linear version dR_M. So far, equations 1 and 2 suggest that essentially only two displacement signals, namely φ_BA = ψ_B − ψ_A and γ_BA = ε_B − ε_A, are needed to command a Listing-motion of the eye from position A = (ψ_A, ε_A) to B = (ψ_B, ε_B). The two signals can be expressed in a one-to-one fashion in terms of azimuth and elevation of the eye, as shown in the paragraph ''Parameterizing Listing-motions in visual space''.

The Total Angular Velocity of the Eye
Although recursive application of the infinitesimal compound rotation operator dR_M(γ) dR_DL(φ) does generate eye movements in rotation planes that tilt by half the angle of gaze eccentricity, the question remains whether the angular velocity also follows the half-angle law of Helmholtz. To approach this question, we first expressed the total rotation of the eye as R_eye(ρ, φ, γ) = R_CF R_M by compounding R_CF(ρ, φ) (as defined earlier) and the meridian rotation operator R_M(ĉ_M, γ). Left-multiplying the velocity d/dt(R_eye) by the inverse R_eye⁻¹, we obtained the total angular velocity (see e.g. [19]) as the sum of two terms (equation 3): a term deriving from R_CF, which at ρ = −φ yields the Donders-Listing angular velocity Ω_DL analyzed below, and the meridian term Ω_M = 2 R_M⁻¹ dR_M/dt. The compound term Ω_CF can be further broken down to Ω_CF = Ω_F + R_F⁻¹ Ω_C R_F in terms of a roll angular velocity Ω_F = 2 R_F⁻¹ dR_F/dt, with R_F := R(ĉ_23, φ), and a coronal or counter-roll angular velocity Ω_C = 2 R_C⁻¹ dR_C/dt, with R_C := R(ĉ_εψ, ρ). Clearly, the rotation plane of Ω_M does not depend on the eccentricity of the line of sight, in contrast to the term Ω_DL. As a consequence, the total angular eye velocity does not obey the half-angle law of eye position (which always holds) because the rotation plane of Ω_M does not tilt. Next we analyzed the two terms on the right side of equation 3 in more detail.
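This decomposition can be illustrated by composing one small meridian step with one small Donders-Listing step. In the quaternion sketch below (helper names and the parameters ε = 30°, γ = ψ-step = 0.5° are ours), the meridian term's rotation plane does not tilt, the Donders-Listing term's plane tilts by ε/2, and the plane of the compound step lies strictly in between, so the total angular velocity indeed fails the position-domain half-angle law:

```python
import numpy as np

# Quaternion helpers, convention (w, x, y, z); names are ours.
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qaxis(axis, angle):
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle/2.0)], np.sin(angle/2.0) * axis))

def qrot(q, v):
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, np.concatenate(([0.0], v))), qc)[1:]

def plane_tilt(q):
    """Tilt (deg) of a rotation's plane out of the frontal plane."""
    n = q[1:] / np.linalg.norm(q[1:])
    return np.degrees(np.arcsin(abs(n[0])))

e1 = np.array([1.0, 0.0, 0.0])
eps = np.radians(30.0)
g = np.array([np.cos(eps), 0.0, np.sin(eps)])   # gaze at meridian psi = 0
dgam = np.radians(0.5)                          # small meridian step
dpsi = np.radians(0.5)                          # small frontal-plane step

m = np.cross(e1, g); m = m / np.linalg.norm(m)
q_M = qaxis(m, dgam)                            # meridian rotation
q_F = qaxis(e1, dpsi)
q_DL = qmul(qaxis(qrot(q_F, g), -dpsi), q_F)    # Donders-Listing step
q_tot = qmul(q_DL, q_M)                         # compound step

tilt_meridian = plane_tilt(q_M)    # ~0: the meridian plane does not tilt
tilt_dl = plane_tilt(q_DL)         # ~ eps/2 = 15 deg
tilt_total = plane_tilt(q_tot)     # in between: half-angle law is lost
print(tilt_meridian, tilt_dl, tilt_total)
```

The relative weight of the two terms, and hence the tilt of the total angular velocity, shifts with the ratio of meridian to frontal-plane step size.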

The Donders-Listing Angular Velocity
The counter-roll angular velocity is explicitly Ω_C = −(dρ/dt) ĉ_εψ. Similarly, the roll angular velocity is Ω_F = −(dφ/dt) ĉ_23. Substituted into Ω_CF and evaluated at ρ = −φ, these yield an angular velocity proportional to R_F⁻¹(ĉ_εψ − ĉ_23)R_F, with |ĉ_εψ − ĉ_23| = 2 sin ε/2 and, for example, the third component v′_3 = cos ε/2 cos(φ − ψ). Thus, the plane of rotation of the angular velocity Ω′_DL tilts in accord with the half-angle rule [4], at least if there is no meridian rotation involved. If there is such a rotation, one has to go one step further by evaluating R_F⁻¹ ĉ_DL R_F R_M = Σ_i ω_i ĉ_jk, which yields the expression referred to as Donders-Listing angular velocity (for small rotation angles φ). The angular velocity plane of the Donders-Listing angular velocity tilts through an angle determined by both ε/2 and the meridian rotation angle γ, independent of the meridian ψ. Thus we found that the rotation plane of the Donders-Listing angular eye velocity need not coincide with the rotation plane of eye position. The half-angle law of eye position translates into a modified half-angle law of angular eye velocity. However, note that the rotation angle γ will often be much smaller than ε/2, which obscures the small differences between these two rotation planes.
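That, absent meridian rotation, the Donders-Listing angular velocity tilts by ε/2 at every meridian can be checked by stepping the operator around a full frontal circle and reading off the instantaneous rotation plane at each step. This is a finite-difference quaternion sketch (helper names and the parameters ε = 20°, 1° steps are ours):

```python
import numpy as np

# Quaternion helpers, convention (w, x, y, z); names are ours.
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qaxis(axis, angle):
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle/2.0)], np.sin(angle/2.0) * axis))

def qrot(q, v):
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, np.concatenate(([0.0], v))), qc)[1:]

e1 = np.array([1.0, 0.0, 0.0])           # primary direction, straight ahead
eps = np.radians(20.0)                   # eccentricity of the gaze circle
dpsi = np.radians(1.0)                   # Donders-Listing step size
q_tot = qaxis([0.0, -1.0, 0.0], eps)     # Listing orientation, gaze at psi = 0

tilts = []
for _ in range(360):                     # once around the frontal circle
    g = qrot(q_tot, e1)                  # current gaze line
    q_F = qaxis(e1, dpsi)                # roll in the frontal plane
    q_step = qmul(qaxis(qrot(q_F, g), -dpsi), q_F)  # coronal counter-roll
    n = q_step[1:] / np.linalg.norm(q_step[1:])
    tilts.append(np.degrees(np.arcsin(abs(n[0]))))  # tilt of rotation plane
    q_tot = qmul(q_step, q_tot)
    q_tot = q_tot / np.linalg.norm(q_tot)

# Residual torsion: compare with the unique Listing rotation to the same gaze.
g_end = qrot(q_tot, e1)
ax = np.cross(e1, g_end); ax = ax / np.linalg.norm(ax)
q_L = qaxis(ax, np.arccos(np.clip(np.dot(g_end, e1), -1.0, 1.0)))
q_res = qmul(q_L * np.array([1.0, -1.0, -1.0, -1.0]), q_tot)
v = qrot(q_res, np.array([0.0, 1.0, 0.0]))
torsion_deg = np.degrees(np.arctan2(v[2], v[1]))
print(min(tilts), max(tilts), torsion_deg)   # tilts ~ eps/2 = 10 deg
```

In this implementation the residual Listing torsion after a full circle stays negligibly small, while the fixation point completes the circle in the frontal plane.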

Meridian Angular Velocity
The contribution of Ω_M depends on the time rate of change of both the rotation angle γ and the angular orientation ψ of the meridian plane.
With the meridian plane ĉ_M = −(ĉ_31 cos ψ + ĉ_12 sin ψ) we obtain the meridian angular velocity Ω_M. The second term in this relation implies a small change in the plane ĉ_LM, where the two planes ĉ_M and ĉ_LM are mutually orthogonal. Note that dψ/dt and dφ/dt are equivalent. Thus we can write the components as v_1 = 0, v_2 = cos ψ, v_3 = sin ψ and s_1 = sin γ/2, s_2 = −cos γ/2 sin ψ, s_3 = −cos γ/2 cos ψ. The dynamic interaction term on the right side of equation (5), which depends on the rotation angle γ and the rotation velocity dφ/dt, can make a significant contribution to ocular torsion, as shown next.

Ratio of Counter-roll to Target-induced Roll Angular Eye Velocity
To estimate the angular eye velocity induced by a visual target, one has to know the counter-roll angular velocity, which contributes to the total angular velocity of the eye but not to the target angular velocity encoded by the fovea. We define the target-induced angular velocity ΔΩ as the difference between the total and the counter-roll angular velocity. Dividing both sides by the magnitude |ΔΩ| and rearranging the summands, we obtained the equation f = ĉ_D + λ ĉ″_εψ. Here we have introduced the ratio λ = |Ω_C|/|Ω_D|, noting that |Ω″_C| = |Ω_C|, and the abbreviations f = Ω_eye/|ΔΩ|, ĉ_D = ΔΩ/|ΔΩ| and ĉ″_εψ = Ω″_C/|Ω_C|. The λ-ratio is a complicated function describing the magnitude ratio of counter-roll to target-induced angular velocity in dependence of eye position (coordinates ε and ψ) and rotation (rotation angles φ and γ). To minimize accumulation of torsion, this ratio must be such that the counter-roll angular velocity compensates the target-induced angular velocity in the frontal plane (see Fig. 2B). Using this condition we evaluated the equation by computing the scalar product of f and ĉ_23, i.e. ⟨f, ĉ_23⟩ = ⟨ĉ_D + λ ĉ″_εψ, ĉ_23⟩, setting it equal to zero and solving for λ. For small rotation angles γ we obtained (for calculation details, see Text S3) equation (6). It describes the ratio of |Ω_C| to |Ω_D| = |Ω_M + Ω′_F| as a function of gaze eccentricity ε and the magnitude of the gradient ∂γ/∂φ, which we will refer to as the counter-roll to roll angular velocity ratio. In case of a simple target-induced angular velocity in the frontal plane it reduces to λ = 1/g_1 = 1/cos ε, predicting that the counter-roll angular velocity must increasingly outmatch the roll angular velocity in magnitude as gaze eccentricity increases [18]. Conversely, it also predicts that the counter-roll angular velocity will undershoot the roll angular velocity in magnitude if there is a gradient ∂γ/∂φ such that √(1 + (∂γ/∂φ)²) > 1.
In primary direction, one has λ = 1, suggesting that fluctuations in roll angular velocity are matched on average in magnitude by counter-roll angular velocity signals.
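In the form λ = 1/(cos ε √(1 + (∂γ/∂φ)²)) — our reconstruction of equation (6), consistent with the limits λ = 1/cos ε for vanishing gradient and λ = 1 at primary direction — the ratio can be sketched as a small function (the name is ours):

```python
import numpy as np

# Counter-roll to roll ratio, assuming lambda = 1/(cos(eps)*sqrt(1+(dg/dphi)^2)).
def counterroll_to_roll_ratio(eps, dgamma_dphi=0.0):
    return 1.0 / (np.cos(eps) * np.hypot(1.0, dgamma_dphi))

# At primary direction the ratio is 1.
print(counterroll_to_roll_ratio(0.0))                  # 1.0
# Pure frontal-plane tracking: the ratio grows as 1/cos(eps).
print(counterroll_to_roll_ratio(np.radians(20.0)))     # ~ 1.064
# A meridian gradient makes the counter-roll undershoot 1/cos(eps).
assert counterroll_to_roll_ratio(np.radians(20.0), 0.5) < 1.0 / np.cos(np.radians(20.0))
```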

Parameterizing Listing-motions in Visual Space
Target trajectories along straight lines in Euclidean space correspond to either small or great circle arcs in the spherical field of fixations. In contrast to great circles through the fixation point straight ahead, small-circle trajectories cannot be tracked by a torsion-free rotation of the eye in a single rotation plane. To approximate such trajectories, the eye must perform compound rotations in meridian and frontal planes. The underlying transformations involve trigonometric relations between the azimuth and elevation of the desired fixation points and the respective meridian and eccentricity angles ψ and ε. For example, tracking of a target along an eccentric horizontal trajectory, corresponding to a horizontal circular arc in the upper hemisphere of the visual field, involves transformations of the unit gaze vector ĝ = ĝ(ε, ψ) according to these relations (Fig. 4). Since the resulting rotation is torsion-free up to second-order corrections in φ, the overall torsion generated by a series of such compound rotations depends only on the chosen size of the Donders-Listing rotation steps. Simulations showed that the accumulated torsion across a ±40° horizontal excursion was linearly dependent on the number of sampling points (tested range 10 ≤ N ≤ 1000), reaching about 0.025° with an average |φ| = 0.5° ± 0.2° for N = 100 sampling points (Fig. 4). The angular ratio of the tilt of the rotation plane of the eye to the tilt of the gaze line followed the half-angle law [2], whereas the tilt of the rotation plane of the gaze line was virtually zero (Fig. 4A, B). Note also that the eccentricity of the gaze line increases significantly across the illustrated range of tracking compared to tracking onset at azimuth θ = 0 (compare the change of ε with θ in Fig. 4A).
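The simulation described above can be re-sketched in a few lines: track a horizontal small-circle arc (elevation 15°, azimuth ±40°, N = 100 steps; elevation and range from the text, the rest is our choice) by compounding meridian rotations with Donders-Listing steps, then measure the residual Listing torsion. Quaternion helpers and function names are ours, and the exact torsion figure depends on implementation details, so the sketch only checks that it stays far below a degree:

```python
import numpy as np

# Quaternion helpers, convention (w, x, y, z); names are ours.
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qaxis(axis, angle):
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle/2.0)], np.sin(angle/2.0) * axis))

def qrot(q, v):
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, np.concatenate(([0.0], v))), qc)[1:]

e1 = np.array([1.0, 0.0, 0.0])

def gaze(az, el):
    """Unit gaze vector for azimuth/elevation, e1 = straight ahead."""
    return np.array([np.cos(el)*np.cos(az), -np.cos(el)*np.sin(az), np.sin(el)])

def polar(g):
    """Meridian/eccentricity coordinates (psi, eps) of a gaze vector."""
    return np.arctan2(-g[1], g[2]), np.arccos(np.clip(g[0], -1.0, 1.0))

el = np.radians(15.0)                               # horizontal track, 15 deg up
azs = np.radians(np.linspace(-40.0, 40.0, 101))     # N = 100 steps

psi0, eps0 = polar(gaze(azs[0], el))
ax0 = np.cross(e1, gaze(azs[0], el)); ax0 = ax0 / np.linalg.norm(ax0)
q_tot = qaxis(ax0, eps0)                            # Listing orientation at onset
for a0, a1 in zip(azs[:-1], azs[1:]):
    (p0, E0), (p1, E1) = polar(gaze(a0, el)), polar(gaze(a1, el))
    g = qrot(q_tot, e1)
    m = np.cross(e1, g); m = m / np.linalg.norm(m)
    q_tot = qmul(qaxis(m, E1 - E0), q_tot)          # meridian step: changes eps
    g = qrot(q_tot, e1)
    q_F = qaxis(e1, p1 - p0)                        # frontal-plane step: changes psi
    q_tot = qmul(qmul(qaxis(qrot(q_F, g), -(p1 - p0)), q_F), q_tot)
    q_tot = q_tot / np.linalg.norm(q_tot)

g_end = qrot(q_tot, e1)
gaze_err = np.degrees(np.arccos(np.clip(np.dot(g_end, gaze(azs[-1], el)), -1, 1)))

# Residual Listing torsion of the final orientation.
ax = np.cross(e1, g_end); ax = ax / np.linalg.norm(ax)
q_L = qaxis(ax, np.arccos(np.clip(np.dot(g_end, e1), -1.0, 1.0)))
q_res = qmul(q_L * np.array([1.0, -1.0, -1.0, -1.0]), q_tot)
v = qrot(q_res, np.array([0.0, 1.0, 0.0]))
torsion_deg = np.degrees(np.arctan2(v[2], v[1]))
print(gaze_err, torsion_deg)
```

Note how the eccentricity ε grows from 15° at azimuth 0 to over 40° at the ends of the track, while the gaze line itself stays on the horizontal circle.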

Smooth Pursuit Angular Eye Position and Velocity Predicted from Target Motion
To check the predictions of equations 1 and 2 that two-dimensional position signals are sufficient to generate a Listing-motion, which is characterized by equations 3 to 6, we studied position and angular velocity of smooth pursuit eye movements during tracking of linear- and curvilinear-moving targets. For geometric reasons, such eye movements cannot be smooth and at the same time perfectly obey Listing's law, except in the case of target tracking along great circle arcs in visual space. Consider for example smooth tracking of a circularly moving object in a frontal plane, where the eye has to approximate the object's path by a multi-sided polygon curve consisting of a series of small arcs of direction-circles. Indeed, such arcs can only approximate the circular path because of their different curvatures: the curvature of a small circle arc with aperture 2ε (ε, gaze eccentricity) is always larger than the curvature of a direction-circle tangent to that arc (Fig. 1). Similar considerations hold for tracking objects along small circle arcs associated with straight lines in visual space (Fig. 4). Obviously, there must be a trade-off between smoothness of gaze motion and accord with Listing's law during steady-state target tracking [18].
Before addressing the complex paradigms of circular and elliptic tracking, we present the results of linear smooth tracking, applying equations 1-5 to eye movement records obtained during tracking of targets that oscillated along horizontal or vertical small circles at various eccentricities relative to straight ahead. To reconstruct the ocular rotation, we used the initial orientation of the gaze line at tracking onset and an internal model estimating the target's distance and orientation in the subject's frontal plane (see Methods). From these pieces of information the rotation of the eye was reconstructed as a function of gaze orientation given by the polar angles ε and ψ (see Figs. 1 and 4). The reconstructed rotations reflected the smooth motion of the eye (up to first order in torsion, see equation 1), disregarding any step-like modulations of eye position due to minute saccades. Accordingly, no modulation of torsional eye position was generated, in contrast to the experimentally observed torsion due to saccades and subsequent drifts (Fig. 5A: compare sinusoidal fit in black with reconstruction in gray). The reconstruction predicts zero torsion offset relative to straight ahead, in agreement with the fact that the experimentally observed average torsion offset in the example illustrated in Fig. 5A added up to close to zero. Averaging across the five different gaze eccentricities and the two animals, the torsional modulation of eye position had an average amplitude (±SD) of 0.6° (±0.85°) and an offset of 0.07° (±0.8°) during horizontal tracking (N = 44 cycles). Similarly, during vertical tracking the average amplitude (±SD) was 0.8° (±0.7°) with an offset of 0.004° (±0.6°) (N = 47 cycles). Close inspection of vertical and horizontal eye positions revealed that the torsional saccades also had horizontal and vertical components during horizontal and vertical tracking, respectively, which changed direction at the turning points.
The overall magnitude of these saccades was about 1° to 1.5°.
Again, the reconstructed eye position reproduced the average horizontal and vertical smooth modulation (Fig. 5C). Averaging across the five different eccentricities and the two animals, we found mean coefficients of determination (±SD) of the vertical and horizontal reconstructions of 0.59 (±0.30). We also compared the degree of accordance with Listing's law based on the experimental slow-phase angular velocity and the total angular velocity derived from the reconstructed eye position. The conventional method compares the tilt of the angular eye velocity plane at the meridian crossings of the eye (Fig. 6A, compare black and gray traces). At these times during the tracking cycles, the instantaneous ocular rotation was tangential to current eye position. As the eye rotated away from these meridian-crossing points during tracking, the tilt of the rotation plane of the total angular velocity remained constant (black traces in Fig. 6A), whereas that of the Donders-Listing angular velocity increased in absolute terms due to the increasing distance relative to straight ahead (Fig. 6A, gray traces).
To further elaborate on this observation, we plotted the ratio of tilt angles to estimated target distances against estimated target distance. Target distance was defined by D = √(E²_ver + E²_hor) (expressed in degrees). Tilt angles and target distances were determined at increments of one degree by averaging across ±0.25° throughout the tracking cycle. The tilt angles increased virtually linearly with slopes close to 0.50 as a function of target distance (Fig. 6B, example for horizontal pursuit at 15° gaze up). Averaging across the two subjects and paradigms, we found that the ratio of tilt angle to target distance was 0.50 (±1×10⁻⁵, N = 290) in the position range of 5° to 10° around straight ahead and decreased thereafter to 0.4975 (±0.0006, N = 366) for distances between 10° and 15° and further to 0.492 (±0.001, N = 309) for distances beyond 15° (Fig. 6C).

Smooth Tracking of Targets along Elliptic Trajectories
Eye movement records obtained during tracking of elliptic target trajectories with three different eccentricities (semi-major axis 20°; semi-minor axis 15°, 10° or 5°), oriented horizontally or vertically, served as the model for the following analysis. Application of equations 1 and 2, based on the recorded meridian eye positions (angle ψ) and an internal model of elliptic motion (see Methods for details), perfectly reproduced the average smooth eye movement, except for the large saccadic modulation of torsional position (Fig. 7A), with a mean coefficient of determination averaged across the three classes of elliptic paradigms (PLOS ONE | www.plosone.org). The torsional saccades often gave rise to torsional drifts in the amplitude range of 1-2°, as shown earlier [18]. Accordingly, the reconstructed angular eye velocity neither reproduced these experimentally observed torsional oscillations nor perfectly matched the magnitude of the horizontal and vertical slow-phase modulation, as reflected in the correspondingly lower mean coefficients of determination for torsional, vertical and horizontal angular velocity averaged across the three elliptic paradigms (Fig. 7A, B). To further corroborate this finding we compared the rotation planes.
We computed both the modulation of the tilt angle of the reconstructed angular eye velocity and that of the sinusoidal-fitted slow-phase angular eye velocity, and compared both of these to the tilt modulation predicted by the Donders-Listing angular velocity. We found that in absolute terms the modulation of the tilt angle of the reconstructed angular eye velocity undershot the contour predicted by the gaze modulation during tracking in between the vertices of the elliptic track. This difference reflected the contribution of meridian angular velocity to the total angular velocity, which does not depend on eye position. In contrast, the modulation of the tilt angle of the Donders-Listing angular velocity closely reproduced the expected contour: it modulated in synchrony with gaze eccentricity across the whole target cycle, as dictated by the elliptic trajectory (Fig. 8, compare black and dark-gray traces). On the other hand, the tilt angle of the sinusoidal fits of slow-phase angular eye velocity (Fig. 8, light-gray traces), which included torsional saccadic drift velocities, modulated in approximately the same range but was phase-shifted and distorted compared to both the Donders-Listing and the total angular eye velocity. Although this modulation greatly overshot at the extreme vertices of the target trajectories, where the torsional angular velocity was high (≥50°/s), it conformed to the half-angle rule on average fairly well, as documented in Table 1.

Ratio of Counter-roll to Roll Angular Velocity
We estimated the ratio of counter-roll to roll angular velocity using two independent procedures. First, we estimated this ratio from angular eye position and velocity records and found that it deviated from the expected 1/cos ε curve. Specifically, for tracking target trajectories with small and intermediate eccentricities (semi-minor axes b = 15° or 10° versus semi-major axis a = 20°), the ratio stayed approximately constant after the target crossed the vertex ε = b of the elliptic trajectory, at values close to 1/cos b, before turning towards and reaching the predicted value 1/cos a at the vertex ε = a (Fig. 9, light-gray traces in upper panels). For trajectories with large elliptic eccentricity (semi-minor axis 5°, semi-major axis 20°), it markedly undershot the curve 1/cos ε between the vertices at ε = a and ε = b. In each of these cases the ratio accorded with the values of 1/cos ε at the four vertices, where the gradient ∂γ/∂φ vanished, as predicted by equation (6) (Fig. 9, light-gray traces in upper panels). The same experimentally estimated counter-roll to roll ratios are also illustrated as a function of tracking phase (Fig. 9, light-gray traces in middle panels).
Secondly, we reconstructed the ratio of counter-roll to roll angular velocity based on equation 6, using the same set of polar angles ψ and ε as used for reconstructing eye position and angular eye velocity. We found that across all horizontal and vertical tracking paradigms the thus reconstructed ratio predicted the experimental ratio with an average coefficient of determination of 0.89 ± 0.04 and an average root-mean-square error of 0.003 ± 0.002 (N = 328; Fig. 9, black traces superimposed on light-gray traces in upper and middle panels).

Discussion
We have shown that the rotation of the eye during general smooth tracking movements can be modeled by combining conventional rotations in horizontal, vertical or oblique planes with small displacements in the frontal plane. Further, we have found that this novel type of compound rotation-displacement can explain how the eye approximates the rotations required to steer the fixation point along any desired path in visual space without accumulating torsion during smooth tracking. This rotation strategy of the eye has three characteristic features: First, it is the basis of Donders' and Listing's law. Second, it does not depend on where exactly primary position is located. Third, it relies on [...]

A Generic Mechanism Underlying Donders' and Listing's Law
Since the early attempts of Helmholtz to derive Listing's law from a visuo-motor error functional, which he called the principle of easiest orientation [2] (for a re-evaluation of Helmholtz's approach see [15]), a number of other studies have shown that Listing's law can at least be understood as a principle minimizing certain motor parameters [6,15,20-23]. The more recent discovery of fibro-muscular structures that alter the pulling directions of the oculomotor muscles has revived this discussion about the origin of Listing's law by proposing that the so-called half-angle rule of angular eye velocity [4] is implemented by an ingenious neuro-mechanical mechanism located peripherally in the ocular plant [24-27]. This hypothesis hinges on the issue of how the rotational mechanics of the eye effectively works, particularly with regard to the mechanisms that underlie the half-angle law of eye positions in Listing's law (for the half-angle law, see the notion of direction-circle in [2]). Despite great progress, some of the basic visuo-motor mechanisms in oculomotor control are still not well understood. For example, how can we look around a circle although Listing's law forbids rotations of the eye in the frontal plane? Or why should primary position play such a pivotal role in Listing's law when it is often found far from the center of the oculomotor range [4,28]? The rotation-displacement mechanism proposed here solves these problems.
As to primary direction, assume that it lies somewhere below the center of the oculomotor range when looking straight ahead (with the head upright). Nonetheless, the proposed displacement operator dR_DL that controls rotations of the fixation point in frontal planes would not change the torsion of the eye with respect to straight ahead (up to corrections quadratic in the rotation angle). Similarly, any rotation in planes whose mutual intersections coincide with this direction would not change torsion either, not even in combination with the displacement operator dR_DL. Taking primary position into account would thus not change the torsion of the eye. We conclude from this that, for maintaining visuo-spatial orientation constancy, it does not really matter where exactly primary eye position is located. However, it does make a difference in terms of computational load whether every single eye position must be computed relative to primary position (see example in Fig. 1) or whether the proposed alternative rotation-displacement strategy is used.
Our approach also explains the paradox of different rotation planes of the line of sight and the eye ball. For example, under the sole action of the displacement operator dR_DL, the fixation point can almost perfectly approximate a circular trajectory in a frontal plane while the underlying rotation of the eye occurs, at any point in time, in planes tilted by half the angle of gaze eccentricity relative to straight ahead. Indeed, these tilted planes are eigenplanes of torsion-free rotations of the eye up to quadratic corrections in the rotation angle ϕ. Under the action of dR_DL, the fixation point displaces tangentially both to its direction-circle, which is tilted by ε/2, and to a frontal circle with opening angle ε relative to straight ahead. By adding up such displacements, the eye can generate an almost perfect circular trajectory of the fixation point while tracking the target. Similarly, during tracking of a target along a small circle arc, for example a horizontally moving target in the upper visual hemisphere, the rotation plane of the tracking fixation points remains almost perfectly parallel to the horizontal plane (Fig. 4). However, the eye actually rotates in planes that are tilted away from the horizontal plane by half the angle of gaze eccentricity, by combining small displacements in these tilted planes with rotations in meridian planes. Again, the overall rotation is approximately torsion-free because these small displacements occur orthogonal to the meridian and tangential to the direction-circle of the current fixation point at any point in time.

The Half-angle Law of Eye Positions does not Readily Generalize to Angular Eye Velocity
Our reconstructions of 3D eye position matched the experimental 3D eye positions almost perfectly, except for the saccadic modulation of torsional eye position. Similarly, the minute saccadic displacements in both the vertical and horizontal eye-position modulation were averaged out by this approximation. As expected, the reconstructed eye positions were in accord with the half-angle law up to angular corrections of the order of ϕ², where ϕ is the rotation angle in the frontal plane (equations 1 and 2, Figs. 5A, 7A). The experimentally observed modulation of torsional eye position is due to the same saccades observed in horizontal and vertical eye position and the ensuing drifts. During straight-line tracking, the amplitudes of this modulation increased or decreased in a mirror-symmetric fashion relative to straight ahead, reaching about 1° at the most eccentric gaze position (Fig. 5A).

[Figure 8. Tracking of targets in clockwise (cw) and counterclockwise (ccw) direction along elliptic trajectories with major axes aligned with (A) the head-horizontal and (B) the head-vertical plane (ellipse major axis 20°, minor axis 10°). Traces illustrate tilt angles of rotation planes of reconstructed angular velocity (black), Donders-Listing angular velocity (dark gray) and sinusoidal-fitted angular velocity (light gray). The disparity between reconstructed and Donders-Listing rotation planes is due to meridian rotations along the elliptic trajectory. The rotation planes of the sinusoidal-fitted angular velocities show large overshooting of the 5° and 10° levels predicted by the half-angle rule, particularly during tracking along horizontal elliptic trajectories; several single trials superimposed. Abscissa: phase 0°, up gaze; phase +90°, rightward gaze position (rotation sense of targets as seen from the subject). doi:10.1371/journal.pone.0095234.g008]
The direction changed at the turning points during straight-line tracking and depended on the rotation direction during circular or elliptic tracking eye movements [18]. The modulating saccades likely occur for geometric reasons. One of these reasons is to keep the target centered on the fovea. During straight-line tracking, a correction of eye position by 1° tangential to a maintained eccentric position of 15° generates an ocular torsion of about 0.13°. Repeated corrections in alternating directions during the tracking cycle can explain the observed modulation. Other reasons may be to correct for and avoid accumulation of torsion. Since the Donders-Listing operator is a linear approximation, it does not perfectly compensate the torsion associated with the generated eye movement. Without saccadic corrections, curvilinear smooth pursuit would first of all violate Donders' law, which is of primary importance for visuo-spatial orientation constancy [18].
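The 0.13° figure can be checked numerically: transporting a Listing-law eye orientation tangentially along the frontal circle at 15° eccentricity induces a rotation about the line of sight of approximately arc × tan(ε/2) = 1° × tan(7.5°) ≈ 0.13°. The following quaternion sketch is our own illustration, not the authors' code; the coordinate axes and sign conventions are assumptions.

```python
import numpy as np

def qmul(p, q):
    w1, x1, y1, z1 = p; w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 + y1*w2 + z1*x2 - x1*z2,
                     w1*z2 + z1*w2 + x1*y2 - y1*x2])

def qrot(q, v):
    # rotate vector v by unit quaternion q
    w, x, y, z = q
    return qmul(qmul(q, np.array([0.0, *v])), np.array([w, -x, -y, -z]))[1:]

def aa_quat(axis, angle):
    axis = axis / np.linalg.norm(axis)
    return np.array([np.cos(angle/2), *(np.sin(angle/2)*axis)])

z = np.array([0.0, 0.0, 1.0])         # straight ahead (assumed convention)
eps = np.radians(15.0)                # gaze eccentricity
dpsi = np.radians(1.0) / np.sin(eps)  # meridian step giving a 1 deg tangential arc

def listing_quat(psi, eps):
    # Listing-law orientation: rotation by eps about an axis in the frontal plane
    g = np.array([np.sin(eps)*np.cos(psi), np.sin(eps)*np.sin(psi), np.cos(eps)])
    return aa_quat(np.cross(z, g), eps), g

q1, g1 = listing_quat(0.0, eps)
q2, g2 = listing_quat(dpsi, eps)

# parallel-transport the eye-fixed horizontal meridian from g1 to g2
# (minimal rotation taking g1 to g2), then compare with the Listing frame at g2
q12 = aa_quat(np.cross(g1, g2), np.arccos(np.clip(g1 @ g2, -1, 1)))
x_trans = qrot(q12, qrot(q1, np.array([1.0, 0.0, 0.0])))
x_list = qrot(q2, np.array([1.0, 0.0, 0.0]))
torsion = np.degrees(np.arctan2(g2 @ np.cross(x_trans, x_list), x_trans @ x_list))
print(abs(torsion))                   # about 0.13 deg
```

The same number also follows from the holonomy of the thin spherical triangle spanned by straight ahead and the two gaze directions, whose area is Δψ(1 − cos ε).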
To provide a more formal argument, consider smooth tracking of a target that moves in a frontal plane along a circular path with radius 15° at 36°/s. With an assumed updating rate of eye position every 500 ms and without any correction of ocular torsion, the proposed Donders-Listing mechanism would move the fixation point along an icosagon that approximates the target's circular path. As a consequence, ocular torsion would accumulate by about 0.05° per smooth motion segment, increase to a maximum of about 0.5° after 10 segments (at about 180°) and decrease thereafter without returning to the initial zero position after 20 segments (at 360°): there would be a violation of Donders' law amounting to about 0.5° at the end of one response cycle. This value would multiply by the number of continuously tracked cycles. In practice, however, subjects can easily track circular or even elliptic targets through several cycles, although elliptic tracking is more challenging because of the potentially large torsional angular velocities.
For simplicity, we reconstructed 3D eye position at the same updating or sampling rate of 833.33 Hz as that of the experimental data. Checking the effect of a more physiologically plausible updating rate, we found that the reconstructed positions still matched the data in excellent accord with the half-angle law for rates as low as 5 to 10 Hz. Taking the standard deviation of Listing's plane as a benchmark, the expected deviations of the order ϕ² are well within the order of magnitude reported for straight-line and curvilinear pursuit in the position domain [18,10-12,29].

[Figure 9. Ratio of counter-roll to roll angular velocity. Comparison of reconstructed versus experimentally estimated counter-roll to roll ratios during horizontal (A) and vertical (B) elliptic target tracking. Upper panels: counter-roll to roll ratios obtained from single trials for three different elliptic eccentricities plotted against the angular eccentricity ε of the gaze line, superimposing reconstructed (black lines) on experimentally estimated data (gray lines). Dashed vertical lines indicate the extreme vertices of the elliptic trajectories (e = 0.66, 0.87 and 0.97; a = 20°, semi-major axis; b = 15°, 10°, and 5°, semi-minor axes). The single black curve displays 1/cos ε, extending from 1 at ε = 0° to 1.15 at ε = 30°. Middle panels: counter-roll to roll ratios as above plotted against tracking phase. Note the increasing depth of modulation with increasing elliptic eccentricity, particularly during horizontal elliptic tracking. Abscissa: phase 0°, up gaze; phase +90°, rightward gaze position. Bottom panels: torsional target velocity along the three elliptic trajectories with eccentricities 0.97, 0.87, and 0.66 plotted against phase angle. Dashed line indicates the average torsional angular velocity across the three trajectories (36°/s). doi:10.1371/journal.pone.0095234.g009]
We have shown that, in general, the angular orientation of the rotation plane of the total angular eye velocity (equation 3) depends not only on gaze eccentricity, as suggested by the half-angle rule, but also on the rotation angle of simultaneous meridian rotations, i.e. rotations in horizontal, vertical or oblique planes. Such rotations affect the tilt of the angular eye velocity rotation plane in two ways. First, they reduce the overall tilt angle compared to that predicted by the half-angle rule (compare tilt profiles in Figs. 6A and 8). The second, more subtle effect is that they modulate the tilt angle of the rotation plane of the Donders-Listing angular velocity. During smooth tracking, meridian and Donders-Listing rotations of the eye likely alternate in steps just small enough to avoid catch-up saccades (see Fig. 3A). Since meridian rotations do not affect the torsion of the eye, the underlying ocular rotation remains perfectly in accord with the half-angle law of a Listing motion in the position domain. During saccades, on the other hand, the meridian rotation of the eye can be large and can thus also alter the orientation of the overall angular velocity rotation plane. Three-dimensional analyses of strongly curved saccades and even single-axis rotation saccades support this prediction [29-31]. Vestibular angular velocities may also contribute and change the rotation of the eye in any plane. The particular decomposition of the visually dependent total angular eye velocity shown in equation 3 suggests that the Donders-Listing mechanism still remains in control of fine-tuning eye position during visual-vestibular interactions. Angular eye velocity tilts that do not follow the half-angle rule have been reported during interactions of the translational vestibulo-ocular reflex (known to obey Listing's law) with the rotational vestibulo-ocular reflexes [32,33].
Finally, the angular velocity derivation presented here also predicts that the ratio of counter-roll to roll angular velocity should be modulated by the magnitude of the gradient ∂ĝ/∂ϕ (equation 6). We have earlier shown that during circular tracking this ratio increases in proportion to 1/cos ε with target eccentricity ε [18]. During elliptic tracking we found that the angular eye velocity can undershoot this ratio (Fig. 9), indicating that the eye effectively rolled in the direction predicted by equation 6. The successful reconstruction of this effect based on equation 6 corroborates the theoretical assumptions leading to the notion of the Donders-Listing rotation operator. Whereas the experimental finding was derived from standard evaluations of the eye position and velocity data, equation 6 was independently derived from the Donders-Listing rotation operator and the angular eye velocity derived from it.

Neural Implementation
Our analyses and reconstructions are based on the single assumption of an internal model of the desired motion trajectory across the spherical field of fixations. The appropriate motor commands can be envisaged as a series of position and displacement signals guiding the line of sight along the desired trajectory. Based on these requirements, the superior colliculus appears to be an ideal candidate for this kind of neural processing, given the retinal inputs to this structure and its retinotopic topography [34-37] (for a review of the common functional architecture of the pursuit and saccadic systems, see [38,39]). As far as saccades are concerned, neural activity correlated with gradients orthogonal to the classic two-dimensional movement field of collicular cells has not been found [40,41]. However, the features proposed here predict only small displacements, on the order of 5°, distributed along the desired trajectory in a discrete and irregular manner. Theoretically, the coordinates and amplitudes of these signals can be deduced from retinal signals. Structures downstream from the superior colliculus, for example the nucleus reticularis tegmenti pontis and subsequent structures in the ponto-cerebellar pathway, may be candidates for this kind of processing [42,43]. Another interesting aspect concerns the observed cooperation between piecewise smooth eye movements and minute saccades. This cooperative interaction contributed to the modulation of angular eye velocity, which is, however, hard to separate into position and velocity components because of its nonlinear nature.
In conclusion, our analysis supports the notion of Helmholtz that Listing's law serves in a fundamental way to ease visuo-spatial orientation by restricting the visually controlled rotation modes of the eye to two planes at any point in time [2]. Both Donders' and Listing's law follow directly from this restriction. The implementation of these laws does not require much computational power beyond the processing of retinal signals in the brainstem and low-level visual centers, as perhaps best demonstrated by the chameleon [44].

Ethics Statement
The experimental data used in this study were obtained in the context of a larger project requiring three-dimensional eye movement records in subhuman primates. The animals had a chronic acrylic head implant for restraining the head in the experimental sessions. Three-dimensional eye movements were recorded with the magnetic search coil technique using a dual search coil that was implanted on one or both eyes under the conjunctiva as previously described [18,45]. All surgery was performed under aseptic conditions and general anesthesia, and postoperative pain treatment was applied for at least three consecutive days. The animals were housed in groups of three to five individuals in a large room cage (18.5 m²) with access to daylight and under the daily supervision of a clinical veterinarian of the Institute of Laboratory Animals of the University of Zurich. The housing was equipped with climbing devices, shielding and primate toys. Single cages for temporarily separating 1-2 animals from the group were 1 m × 1.5 m × 1.8 m in width, depth and height (internal dimension 1.5 m³). The animals received a rich diet with daily seasonal fruits and fresh vegetables. These behaviorally well-trained animals were used over a number of years for several studies. All experimental procedures were in accordance with the recommendations in the Guide for the Care and Use of Laboratory Animals of the US National Institutes of Health. The housing, husbandry and experimental protocols were reviewed, approved and supervised by the Veterinary Office of the Canton of Zurich.

Experimental Procedures
3D eye movement records were analyzed in a total of six female rhesus monkeys (Macaca mulatta), which had been trained to track a small target light moving along straight (two animals) or curvilinear trajectories (four animals) in visual space (for details see [11,18]). In brief, the animals were seated upright, with the head restrained, in a primate chair mounted within an opaque sphere 1.6 m across. A small laser spot (0.35°) was projected onto the inner wall of the surrounding sphere, describing linear, circular or elliptic paths on a structured background at a rotation frequency of 0.1 Hz. Linear tracking was tested at eccentricities of 0°, ±10°, and ±15° relative to straight ahead using oscillation amplitudes of 15°. For elliptic tracking, see below. The quality of smooth tracking was controlled with behavioral windows of 1-2° across. All experiments were performed in dimmed light, i.e. with a background illumination inside the opaque sphere, which completely surrounded the animal. Three-dimensional eye positions were measured using the magnetic search coil technique with an Eye Position Meter 3000 (Skalar, Delft, The Netherlands). Three-dimensional (3D) eye position was calibrated as described in Hess et al. (1992) [28], digitized at a sampling rate of 833.33 Hz, and stored on a computer for off-line analysis. To express eye positions as rotation vectors [46], the zero or reference positions were defined to be the eye's orientations while the monkey fixated a target 0.8 m straight ahead. In four animals, Listing's plane tilted less than 4° vertically and 1° horizontally from the frontal plane. In the other two animals, Listing's plane tilted vertically -1.2° and -5° and horizontally -3.7° and -3°. We did not correct eye positions for these deviations from primary position (see Discussion).
Vectors in 3D Euclidean space will be denoted by bold characters. Often we refer to equivalent 1- or 2-vectors, which will be denoted by regular characters. Unit vectors will generally be denoted in regular font with a caret. When referring explicitly to components, we write vectors for convenience as row vectors within round parentheses, separating the components by commas.

Encoding 3D Eye Position in Head-fixed Spherical Coordinates
All responses were analyzed cycle by cycle. Saccades, quick phases, and blink artifacts were detected and marked by applying time and amplitude windows to the time derivative of eye acceleration. Cycles with saccades or blink artifacts were eliminated by visual inspection. To facilitate identification of saccadic events in terms of magnitude, duration and peak velocity, eye position traces were rectified by subtracting the sinusoidal modulation determined by least-squares fitting.
Thereby, torsional, vertical and horizontal eye position, denoted E = (E_tor, E_ver, E_hor)^T (T stands for transpose), was expressed as a rotation vector E = tan(ρ/2) ê, where ê is a unit vector parallel to the axis of rotation, ||E|| = tan(ρ/2) the magnitude, and ρ the angle of rotation. We fitted 3D eye position and angular velocity by the method of least squares with a sum of sinusoids up to the second harmonic of the spatial stimulus frequency. We used the scatter-free sinusoidal fits to compare the predictions of equations 3 and 6 with experimental data (see Figs. 8 and 9). Using these best-fitting eye position functions, we computed the motion of the unit gaze vector ĝ(t) = (g_0, g_hor, g_ver)^T parallel to the line of sight. Note that g_0 = √(1 − g_hor² − g_ver²). We call the plane spanned by the line of sight and the straight-ahead direction the meridian plane. The angular orientation of the unit gaze vector was defined by the angular eccentricity ε = ε(t) relative to straight ahead and the signed meridian angle ψ = ψ(t), subtended by the meridian plane and the sagittal plane, which is spanned by the head-vertical and straight-ahead directions (Figs. 1 to 3). According to the right-hand rule, the meridian angle was taken positive in the direction of the curling fingers with the thumb pointing forward, parallel to straight ahead.
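The conversion from a recorded rotation vector to gaze eccentricity and meridian angle can be sketched as follows. This is our own illustration, not the authors' analysis code; the head-fixed axis assignment (x straight ahead, y interaural, z vertical) and sign conventions are assumptions.

```python
import numpy as np

def rotvec_to_gaze(E):
    """Gaze direction from rotation vector E = (E_tor, E_ver, E_hor) = tan(rho/2)*axis,
    with assumed head-fixed axes x (straight ahead), y (interaural), z (vertical)."""
    t = np.linalg.norm(E)
    if t < 1e-12:
        return np.array([1.0, 0.0, 0.0])        # straight ahead
    axis = E / t
    rho = 2.0 * np.arctan(t)                    # rotation angle from tan(rho/2)
    K = np.array([[0, -axis[2], axis[1]],       # Rodrigues cross-product matrix
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    R = np.eye(3) + np.sin(rho)*K + (1 - np.cos(rho))*(K @ K)
    return R @ np.array([1.0, 0.0, 0.0])        # rotate the straight-ahead direction

def gaze_angles(g):
    # eccentricity from the straight-ahead component, meridian angle from the
    # horizontal/vertical components (psi = atan(g_hor/g_ver), as in the text)
    eps = np.degrees(np.arccos(np.clip(g[0], -1.0, 1.0)))
    psi = np.degrees(np.arctan2(g[1], g[2]))
    return eps, psi

# example: a pure horizontal eye position of 15 deg
E = np.array([0.0, 0.0, np.tan(np.radians(7.5))])
eps, psi = gaze_angles(rotvec_to_gaze(E))
```

For this example the eccentricity evaluates to 15° and the meridian angle to ±90° (horizontal gaze), consistent with the phase convention used in the figures.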

Data Analysis
Reconstruction of the Listing-motion of the eye based on 3D eye position records. The Listing-motion of the eye was estimated by applying the compound meridian and Donders-Listing operator (equation 2) to the unit gaze vector in the spherical field of fixations.
For solving the equations of straight-line tracking in visual space, we assumed the existence of an internal model estimating the target distance and orientation relative to straight ahead in the frontal plane by the two equations p = √(g_hor² + g_ver²) and ψ = tan⁻¹(g_hor/g_ver) at any instant of time. Here g_hor and g_ver represent the horizontal and vertical components, respectively, of the unit gaze vector in the spherical field of fixations. The associated target eccentricity followed from the relation ε = sin⁻¹(p), since g_0 = cos ε. The further procedures were the same as described for the elliptic tracking paradigm in the following paragraphs.
In the elliptic paradigms we first determined the best-fit ellipse to the gaze trajectory using the parametric equation p(x) = (d, r cos x, r sin x)^T, with d the distance between the observer's eye and the center of the ellipse projected straight ahead, x the polar angle measured from the major axis, and r = ab/√(b² cos²x + a² sin²x), where a and b are the semi-major and semi-minor axes. From these fits we obtained the coordinates of the observer's gaze line when fixating the target: the eye's angular eccentricity ε then matches the target's eccentricity relative to the center straight ahead, thus tan ε = r/d, and the meridian angle is ψ = x + x_0. We determined ε_i = ε(ψ_i) for each sampling point, starting with ε_0 = ε(ψ_0), the initial fixation position at tracking onset, and ending with ε_N = ε(ψ_N) at the end of the cycle. For each gaze position we estimated the underlying incremental rotation angles (ϕ_i, η_i) = (ψ_i − ψ_{i−1}, ε_i − ε_{i−1}) and the associated gradients ∂ĝ_i/∂ψ_i and ∂ĝ_i/∂ε_i. The initial rotation relative to primary position was obtained by applying R_0 = I cos(η_0/2) − sin(η_0/2) ĉ_M to the unit gaze vector ĝ in primary position, using (ψ, ε) = (ψ_0, ε_0) and (ϕ, η) = (0, η_0). We recursively conjugated the unit gaze vector ĝ_i with R_i = dR_i R_{i−1}, starting with R_1 = R_0 (see equations 1 and 2). All tracking was recorded at a frequency of 0.1 Hz. The total number of samples per tracking cycle was N = 8333, corresponding to the sampling rate of 833.33 Hz of the experimental data. Reducing the sampling rate down to about 5 Hz had little effect on the quality of the reconstructed ocular rotation (see example in Fig. 4). The angular eye velocity was reconstructed from the same series of positions and rotation angles that described the time evolution of the unit gaze vector, using equations 3 to 5.
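The sampling of the elliptic track and the extraction of the incremental angles (ϕ_i, η_i) can be sketched as follows. This is a simplified illustration, not the authors' code: we assume unit viewing distance, treat the quoted angular semi-axes as tangent-scaled linear extents, set x_0 = 0, and use far fewer samples than the experimental 8333.

```python
import numpy as np

d = 1.0                                   # distance to the ellipse center (unit, assumed)
a = d * np.tan(np.radians(20.0))          # semi-major axis (20 deg)
b = d * np.tan(np.radians(10.0))          # semi-minor axis (10 deg)
N = 200                                   # samples per cycle (8333 in the experiments)

x = np.linspace(0.0, 2*np.pi, N + 1)      # polar angle from the major axis
r = a*b / np.sqrt(b**2*np.cos(x)**2 + a**2*np.sin(x)**2)   # elliptic radius

eps = np.arctan(r / d)                    # gaze eccentricity per sample: tan(eps) = r/d
psi = x                                   # meridian angle (x0 = 0 assumed)

phi = np.diff(psi)                        # incremental frontal-plane rotation angles
eta = np.diff(eps)                        # incremental meridian rotation angles
```

At the major-axis vertices the eccentricity reaches 20°, at the minor-axis vertices 10°, and the frontal-plane increments sum to a full cycle of 2π, which is what the recursive conjugation with R_i = dR_i R_{i−1} consumes step by step.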
Thus, the resulting angular eye velocity represented an average angular eye velocity across all slow-phase segments of a given response cycle, similar to the reconstructed eye position.
We also computed generalized R² values based on the residual sums of squares of the reconstructed model and the reduced model consisting of the average eye position or angular velocity. Root-mean-square errors were computed by evaluating the expression
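The generalized R² described above compares the residual sum of squares of the reconstruction with that of the reduced model (the average of the data). A minimal sketch of this goodness-of-fit computation, assuming the usual 1 − SS_res/SS_tot form (the function names are ours):

```python
import numpy as np

def generalized_r2(data, model):
    """R^2 from residual sums of squares: full reconstruction vs. the
    reduced model consisting of the average of the data."""
    ss_res = np.sum((data - model)**2)          # residuals of the reconstruction
    ss_tot = np.sum((data - np.mean(data))**2)  # residuals of the reduced model
    return 1.0 - ss_res / ss_tot

def rmse(data, model):
    # root-mean-square error of the reconstruction
    return np.sqrt(np.mean((data - model)**2))

data = np.array([1.0, 2.0, 3.0, 4.0])           # toy trace for illustration
```

A perfect reconstruction yields R² = 1 and RMSE = 0, while a reconstruction no better than the average of the data yields R² = 0.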

Estimation of the Average Gaze Eccentricity per Cycle of Elliptic Tracking
We used elliptic target trajectories with three different eccentricities e = √(1 − (b/a)²) = 0.66, 0.87 and 0.97 (a = 20°, semi-major axis; b = 15°, 10°, and 5°, semi-minor axes). To estimate the average gaze eccentricity we applied the parametric equation of the elliptic track, centered straight ahead of the

Ratio of Counter-roll to Roll Angular Velocity
We estimated the ratio of counter-roll to roll angular velocity from eye position and angular velocity records as follows. Because the angular velocity in the coronal plane of the eye must be zero, we estimated the angular velocity in eye-fixed coordinates from the recorded angular velocity by setting the torsional component to zero, i.e. Ṽ_e := Ṽ|_(tor=0) = (0, Ṽ_ver, Ṽ_hor)^T. Secondly, we estimated the target-induced angular velocity by ΔṼ_e = (ṽ_tor, 0, 0)^T, with ṽ_tor = (Ẽ_ver dẼ_hor/dt − Ẽ_hor dẼ_ver/dt)/(Ẽ_ver² + Ẽ_hor²) obtained from the recorded eye position Ẽ [47]. With these designations, the equation for λ reads n̂_D + λĝ = f_e, where n̂_D represents the direction of the target-induced angular velocity, f_e = Ṽ_e/|ΔṼ_e| ≈ (1/ṽ_tor)(0, Ṽ_ver, Ṽ_hor)^T, and ĝ is the unit gaze vector.
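The expression for ṽ_tor is the angular rate at which the projected (vertical, horizontal) eye-position vector sweeps around straight ahead, i.e. d/dt atan2(Ẽ_hor, Ẽ_ver). The sketch below checks this on a synthetic circular-tracking trace (our own illustration with assumed parameter values, not the recorded data): the formula should recover the 0.1 Hz target rotation rate.

```python
import numpy as np

w = 2*np.pi*0.1                       # target rotation rate: 0.1 Hz, as in the experiments
t = np.arange(0.0, 10.0, 1/833.33)    # one cycle at the experimental sampling rate
rho = np.tan(np.radians(15.0)/2)      # rotation-vector magnitude for 15 deg eccentricity

# synthetic circular tracking: (ver, hor) eye-position components trace a circle
E_ver = rho*np.cos(w*t)
E_hor = rho*np.sin(w*t)

# numerical time derivatives of the position components
dE_ver = np.gradient(E_ver, t)
dE_hor = np.gradient(E_hor, t)

# target-induced torsional rate, as in the text
v_tor = (E_ver*dE_hor - E_hor*dE_ver) / (E_ver**2 + E_hor**2)
```

Away from the trace boundaries, v_tor is constant at w, independent of the eccentricity-dependent amplitude rho, since the expression normalizes by the squared radius.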

Supporting Information
Text S1 Clifford algebra and rotations in 3D Euclidean space.