Fig 1.
A flow chart depicting the steps taken when processing each frame of the video.
The left column describes each step, the center column lists the corresponding OpenCvSharp function, and the right column shows the mathematical formula applied during that step.
Fig 2.
The Motion Tracker software interface in action.
The video being processed is displayed in the top left panel, and the corresponding motion silhouette is shown at the top right. The lower left panel displays a moving graph of the motion index over time. Controls in the lower right panel allow the user to select which video to process, toggle the visualizations, and choose where to save the output files.
Fig 3.
Sample output of the motion tracking algorithm.
On the left are single frames extracted from a video sequence, while the panels on the right display the corresponding motion silhouettes. Pixels that have been displaced (i.e., places in the video frame where motion has occurred) are shown in white; pixels that have not been displaced are shown in black.
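The silhouettes described above come from frame differencing: a pixel is marked white when its intensity changes between consecutive frames by more than a threshold. A minimal pure-Python sketch of the idea (grayscale frames as 2-D lists and the threshold value of 15 are illustrative assumptions; the paper's implementation uses OpenCvSharp, which performs the equivalent whole-frame operations):

```python
def motion_silhouette(prev_frame, curr_frame, threshold=15):
    """Return a binary silhouette: 255 (white) where a pixel changed
    by more than `threshold` between frames, 0 (black) elsewhere.
    Frames are 2-D lists of grayscale intensities (0-255)."""
    return [
        [255 if abs(c - p) > threshold else 0
         for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 200, 10], [10, 10, 10]]
sil = motion_silhouette(prev, curr)  # only the changed pixel is white
```

In an OpenCV-style library the same result is obtained with a per-pixel absolute difference followed by a binary threshold applied to the whole frame at once.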
Fig 4.
The overall experimental setup for the first validation study.
The participant sat on a chair with the BPMS seat and pack pads, facing a computer monitor. The camera recorded the participant’s upper torso and face.
Fig 5.
These graphs display a 500-timestep segment of a sample motion time series.
The top graph shows motion in individual frames as the proportion of changed pixels per frame, while the bottom graph shows the absolute difference of the proportion of changed pixels per frame across consecutive frames, i.e., the change in motion across adjacent frames. Periods of stable motion in the top graph are reflected by small spikes in the absolute difference graph, i.e., small changes in motion across adjacent frames. Sharp increases or decreases in motion are reflected by larger spikes, indicative of larger changes in motion across adjacent frames.
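Under the assumption that each silhouette is a binary 2-D list (255 = changed, 0 = unchanged), the two series just described can be sketched as the proportion of changed pixels per frame and its absolute difference across adjacent frames:

```python
def motion_index(silhouette):
    """Proportion of changed (white, 255) pixels in one binary silhouette."""
    flat = [px for row in silhouette for px in row]
    return sum(1 for px in flat if px == 255) / len(flat)

def abs_difference(series):
    """Absolute change across adjacent frames: |x[t+1] - x[t]|."""
    return [abs(b - a) for a, b in zip(series, series[1:])]

# Toy 2x2 silhouettes with 0, 1, then 2 white pixels.
silhouettes = [[[0, 0], [0, 0]],
               [[255, 0], [0, 0]],
               [[255, 255], [0, 0]]]
motion = [motion_index(s) for s in silhouettes]   # [0.0, 0.25, 0.5]
diffs = abs_difference(motion)                    # [0.25, 0.25]
```

Stable motion yields small values in `diffs`; sharp increases or decreases in motion yield large spikes, matching the two panels of Fig 5.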
Fig 6.
Sample output from the BPMS pressure pads.
On the left is a pressure map from the seat pressure pad. Each square in the map corresponds to a single sensing element. On the right are graphs showing a 200-timestep segment of a sample mean pressure time series for the back pressure pad (top) and for the seat pressure pad (bottom). Changes in pressure against the seat and back pads are reflected in the spikes and dips in the mean pressure graphs.
Fig 7.
Scatter plots showing the mean of the absolute difference of each time series (as z-scores) plotted against each of the other time series.
The top panel includes all data, while the bottom panel excludes participants with negligible back movement.
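The z-scoring referenced here standardizes each series to mean 0 and standard deviation 1, so that series measured in different units (changed pixels vs. pressure) can be compared on a common scale. A minimal sketch (the use of the population standard deviation is an assumption):

```python
def zscore(series):
    """Standardize a series to mean 0, SD 1 (population SD assumed)."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    return [(x - mean) / sd for x in series]

z = zscore([3.0, 5.0, 7.0, 9.0])  # mean 0, SD 1 after standardization
```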
Table 1.
Means and standard deviations (in parentheses) of cross-correlations among windowed time series.
Fig 8.
Line graphs of the mean cross-correlation, across all participants, of the mean absolute difference of each time series (z-scores), with each time series divided into 10 windows.
The top graph shows the motion and seat over time, while the bottom graph shows the motion and back over time.
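One plausible reading of the windowed analysis above is that each series is split into 10 equal windows, each window is collapsed to its mean, and the resulting 10-point series are then cross-correlated. A sketch of that interpretation (the window count of 10 comes from the captions; Pearson correlation at lag 0 is an added assumption):

```python
def window_means(series, n_windows=10):
    """Collapse a series into n_windows equal windows, one mean per window."""
    size = len(series) // n_windows
    return [sum(series[i * size:(i + 1) * size]) / size
            for i in range(n_windows)]

def pearson(x, y):
    """Pearson correlation coefficient at lag 0."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical example: two series that rise together correlate near 1.
motion = [i * 0.01 for i in range(100)]
seat = [i * 0.02 + 0.5 for i in range(100)]
r = pearson(window_means(motion), window_means(seat))
```

Averaging `r` over participants would then give one mean cross-correlation per pairing, as reported in Tables 1 and 2.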
Table 2.
Means and standard deviations (in parentheses) of cross-correlations among windowed time series.
Fig 9.
Scatter plots showing the mean of the absolute difference of the estimated movement vs. the accelerometer time series (as z-scores), for each of the three actions performed by subjects: (1) right arm swipe to the left, (2) right arm swipe to the right, and (3) right hand wave.
Fig 10.
Line graph of the cross-correlation of the absolute difference of the estimated movement and the Kinect head position time series (z-scores), with each time series divided into 10 windows.