Automated Tracking of Animal Posture and Movement during Exploration and Sensory Orientation Behaviors

Background
The nervous functions of an organism are primarily reflected in the behaviors it is capable of. Measuring behavior quantitatively, at high resolution and in an automated fashion, provides valuable information about the underlying neural circuit computation. Accordingly, computer-vision applications for animal tracking are becoming a key complementary toolkit to genetic, molecular and electrophysiological characterization in systems neuroscience.

Methodology/Principal Findings
We present the Sensory Orientation Software (SOS) to measure behavior and infer its sensory experience correlates. SOS is a simple and versatile system to track the body posture and motion of single animals in two-dimensional environments. In the presence of a sensory landscape, tracking the trajectory of the animal's sensors and its postural evolution provides a quantitative framework to study sensorimotor integration. To illustrate the utility of SOS, we examine the orientation behavior of fruit fly larvae in response to odor, temperature and light gradients. We show that SOS is suitable for high-resolution behavioral tracking of a wide range of organisms, including flatworms, fishes and mice.

Conclusions/Significance
Our work contributes to the growing repertoire of behavioral analysis tools for collecting rich and fine-grained data to draw and test hypotheses about the functioning of the nervous system. By providing open access to our code and documenting the software design, we aim to encourage the adaptation of SOS by a wide community of non-specialists to their particular model organisms and questions of interest.

SOS consists of two modules:
1. SOS online, for behavioral tracking and image preprocessing.
2. SOS offline, for high-resolution sensory-motor processing and analysis.
The first module, SOS online, converts frames streamed live from the camera into a sequence of small files capturing the animal's posture as it moves in space and time. It is run by the master function track.m. The second module, SOS offline, consists of four main scripts:

loci.m extracts loci from the animal's posture and skeleton (from SOS online data).

locib.m reads a sequence of frames from any existing video data (which, therefore, need not have been generated with SOS online) and extracts the postural descriptors of the animal. Head and tail loci are found from the animal's body curvature, as an alternative or complementary method to the skeletonization procedure implemented in loci.m.

merge.m copies the data files above into a common folder, numbering each trial as a single-animal experiment. From then on, every sensorimotor variable is processed simultaneously for all time points and for all animals.

motion.m runs over all files in the folder generated by merge.m. It outputs a unique file, motorData.mat, containing kinematic variables (positions, speeds and angles) and behavioral modes (run, turn, cast) for every time point and every animal behaviorally tested. The exhaustive list of the information saved in this file is available in the screenshots of Figure 15.
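To make the kinematic quantities stored in motorData.mat concrete, the sketch below computes speed, heading and a crude run/turn label from a centroid trajectory. It is an illustrative Python sketch, not the authors' MATLAB implementation; the function name, the 0.5 speed threshold and the 1.0 rad/s angular-speed threshold are hypothetical.

```python
import numpy as np

def kinematics(xy, dt):
    """Speed, heading and behavioral mode from a centroid trajectory.

    xy : (N, 2) array of positions; dt : frame interval in seconds.
    Thresholds are illustrative placeholders, not values used by motion.m.
    """
    v = np.diff(xy, axis=0) / dt                       # frame-to-frame velocity
    speed = np.linalg.norm(v, axis=1)                  # instantaneous speed
    heading = np.unwrap(np.arctan2(v[:, 1], v[:, 0]))  # heading angle (rad)
    ang_vel = np.abs(np.diff(heading)) / dt            # angular speed (rad/s)
    # crude labeling: a turn is slow translation combined with fast rotation
    mode = np.where((speed[1:] < 0.5) & (ang_vel > 1.0), "turn", "run")
    return speed, heading, mode

# straight trajectory sampled at 10 Hz: constant speed, constant heading
xy = np.column_stack([np.arange(10.0), np.zeros(10)])
speed, heading, mode = kinematics(xy, dt=0.1)
```

For a straight constant-velocity path the angular speed vanishes and every frame is labeled a run; real classifiers typically also smooth the trajectory before differentiating.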
sensation.m reads the output data from motion.m and uses the experimental reconstruction of the sensory landscape to generate a second file, sensoryData.mat, containing sensory variables (bearing angle, stimulus intensity at the animal's head, etc.). The exhaustive list of the information saved in this file is available in the screenshots of Figure 16.
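Among the sensory variables, the bearing angle is the angle between the animal's heading and the direction to the stimulus source. A minimal numpy sketch of this computation is given below; the function name and signature are illustrative, not those of sensation.m.

```python
import numpy as np

def bearing_angle(head, heading_vec, source):
    """Angle between the heading direction and the direction from the head
    to the stimulus source, in degrees in [-180, 180); 0 means the animal
    is heading straight toward the source. Illustrative sketch only.
    """
    to_source = np.asarray(source, float) - np.asarray(head, float)
    a = (np.arctan2(to_source[1], to_source[0])
         - np.arctan2(heading_vec[1], heading_vec[0]))
    return np.degrees((a + np.pi) % (2 * np.pi) - np.pi)  # wrap to [-pi, pi)

# animal at the origin heading along +x, odor source on the +y axis
b = bearing_angle(head=[0, 0], heading_vec=[1, 0], source=[0, 1])  # 90.0
```

Tracking the sign of this angle over time reveals whether turns tend to reorient the animal toward or away from the source.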

Step-by-step instructions
Before running the software, we recommend first reading the original manuscript and this tutorial. All code is commented so that one can follow what operation takes place at every line of a script. Below we follow, step by step, and visually illustrate the user-software interactions during the course of an experiment.

(A) Getting started
Once the camera is connected to the computer, it is useful to type imaqhwinfo in the command window to check whether it is properly recognized. One should then verify that the dcam driver for FireWire cameras is in place so that a video object can be assigned to the camera.

(B) Running track.m
Next, set the correct file path and run the routine track.m, specifying the time lapse between frames and the number of frames to be acquired. A live image of the behavioral arena should automatically be displayed on screen. Make sure the lens cap is removed and the camera diaphragm is not blocking too much or too little light. As illustrated by the next screenshot, follow the prompted instructions and introduce the animal into the arena. As depicted in Figure 1 of the main text, illumination conditions and threshold values crucially influence animal detection and background reconstruction. track.m calls internal subroutines that allow the user to iteratively adjust these values before tracking starts. As explained in the main text and Figure 2, it is important to choose the field of view appropriately with respect to the animal resolution and the temporal frequency of the tracking.
Before tracking starts, SOS online displays the image processing for animal detection with the default settings and allows the user to optimize the threshold. When the user is satisfied with the current settings, typing "N" fixes and saves the threshold values, and SOS online moves on to the next operation.
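The detection step the user is tuning here amounts to background subtraction followed by thresholding. The following Python sketch shows the idea on a synthetic frame; it is not the SOS code, and the parameter names are illustrative.

```python
import numpy as np

def detect_animal(frame, background, thresh):
    """Threshold the absolute difference between the current frame and the
    background. Returns a boolean mask of candidate animal pixels and their
    centroid (row, col), or None if nothing exceeds the threshold.
    Illustrative sketch of the detection principle, not the SOS routine."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    mask = diff > thresh                  # pixels that differ from background
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return mask, None                 # threshold too high: no detection
    return mask, (rows.mean(), cols.mean())

# synthetic example: dark arena with one bright 3x3 "larva"
bg = np.zeros((20, 20), dtype=np.uint8)
frame = bg.copy()
frame[5:8, 10:13] = 200
mask, centroid = detect_animal(frame, bg, thresh=50)  # centroid (6.0, 11.0)
```

A threshold that is too low lets background noise into the mask, while one that is too high fragments or loses the animal, which is why SOS lets the user preview the result before committing.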

Figure 5.
Once this is done, SOS automatically reconstructs the background, and the software is ready to start tracking. All these steps, as well as the status of the tracker, can easily be followed through the instructions displayed in the command window.
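One common way to reconstruct an empty-arena background from frames that contain a moving animal is a pixel-wise median across frames: since the animal occupies any given pixel only transiently, the median recovers the arena. This is an illustrative sketch of that principle; SOS may use a different reconstruction rule.

```python
import numpy as np

def reconstruct_background(frames):
    """Pixel-wise median across a stack of frames. Because the moving animal
    covers each pixel in only a minority of frames, the median estimate is
    the empty arena. (Illustrative; not necessarily the SOS procedure.)"""
    return np.median(np.stack(frames, axis=0), axis=0)

# arena of uniform intensity 100 with a bright spot that moves every frame
frames = []
for i in range(5):
    f = np.full((10, 10), 100.0)
    f[i, i] = 255.0          # "animal" at a different pixel in each frame
    frames.append(f)
bg = reconstruct_background(frames)  # uniform 100 everywhere
```

With the animal removed, subtracting this background from live frames isolates the animal as in the detection step above.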

(C) Running loci.m
To run loci.m for offline analysis, please make sure the folder directory and path are correct (Figure 6).
First, an image is displayed asking the user to click on specific landmark points. After the landmark and head-tail clicking, the code runs automatically through all frames, displaying the fraction of total frames processed (Figure 10).
When all frames have been processed, an animation of the whole sequence is displayed (Figure 11).
If the animation reveals swaps in the head and tail trajectory, one can run the code again with the head-tail correction flag activated; it reviews potentially problematic frames, such as those with more than two skeleton endpoints (spurs) or a small animal aspect ratio (blobs). At these frames, manual head-tail annotation is required.
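A simple automatic criterion for catching head-tail swaps is temporal continuity: the head should not jump to the previous tail position between consecutive frames. The sketch below implements that idea in Python; it is illustrative only, and the loci.m flag may implement the correction differently.

```python
import numpy as np

def fix_head_tail_swaps(heads, tails):
    """At each frame, keep the (head, tail) assignment whose head lies
    closest to the head in the previous frame; otherwise swap the labels.
    Illustrative continuity heuristic, not the SOS implementation."""
    heads = np.array(heads, float)
    tails = np.array(tails, float)
    for t in range(1, len(heads)):
        keep = np.linalg.norm(heads[t] - heads[t - 1])
        swap = np.linalg.norm(tails[t] - heads[t - 1])
        if swap < keep:  # labels flipped at this frame: swap them back
            heads[t], tails[t] = tails[t].copy(), heads[t].copy()
    return heads, tails

# head moves along +x while the tail stays near the origin;
# frame 2 has the labels swapped
heads = [[5, 0], [6, 0], [0, 0], [8, 0]]
tails = [[0, 0], [0, 0], [7, 0], [0, 0]]
h, tl = fix_head_tail_swaps(heads, tails)  # frame 2 corrected
```

Frames where both distances are similar (e.g., a curled animal with a small aspect ratio) are exactly the ambiguous cases for which manual annotation remains necessary.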

Test dataset
As supplementary material, we provide a test data set from real experiments so that the user can practice and become familiar with every offline subroutine. Let's assume you have performed experiments on larval chemotaxis and that the tracking data for every single-animal experiment is in its corresponding folder. SOS can then automatically generate high-resolution sensorimotor trajectories. Following the instructions below, one can recreate the data analysis process.