Fig 1.
Browser-based exploration and sharing of trajectory visualisations with linus.
(a) Control workflow of linus. (Input data) linus can import tracking data from a variety of formats. (Preprocessing) The Python converter enriches the imported data with further features (providing e.g. an edge-bundled version of the data, visual context, or a coordinate system) and prepares the visualisation packet. (Tour setup) The user can open the visualisation in a web browser and create an interactive presentation of the data. (Sharing) These visualisations can be shared via a URL or a QR code and (Exploring) readily presented and explored across various devices. (b) Overview of the graphical user interface (GUI). The data can be visualised and explored in the browser. Different aspects of the data can be interactively highlighted (the zoomed example on the right shows the effect of changing the degree of trajectory bundling).
Fig 2.
Configurable filters allow deep data exploration.
The user can choose from a range of visualisation methods directly in the browser interface to highlight aspects of interest in the data (zebrafish tracking results from [2] as an example). (a) The line data is visualised using a range of options for shading and colour mapping. (b-c) From the full dataset (top), the user can filter the data by specific attributes, such as (b) time intervals (bottom) or (c) a specific range of signals (marker expression in cells in this case). (d) The user can further create subselections of the tracks in space using cutting planes or refinable spatial selections. The visual attributes can be defined separately for the selected focus region and the non-selected context region. (e-g) The web interface can blend seamlessly between different states of the data. This feature can be used to map between (e) original tracks and their edge-bundled version, or to visualise planar projections of the 3D data (f) locally on a definable (oblique) plane or (g) globally using a Mercator projection (with definable parameters).
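The Mercator projection used in (g) follows the standard cartographic formulas; the sketch below is illustrative only and does not reflect linus internals (the function name and `radius` parameter are assumptions).

```python
import math

def mercator(lat_deg, lon_deg, radius=1.0):
    """Project latitude/longitude (degrees) onto a 2D Mercator plane.

    x is proportional to longitude; y stretches towards the poles
    via log(tan(pi/4 + phi/2)), which diverges at the poles.
    """
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    x = radius * lam
    y = radius * math.log(math.tan(math.pi / 4.0 + phi / 2.0))
    return x, y
```

Points on the equator map to y = 0, and y grows without bound towards the poles, which is why such projections typically expose a clipping latitude as a definable parameter.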
Fig 3.
Sharable interactive visualisation packets for applications across a variety of sciences.
The user can combine the visualisation methods, annotations, and camera motion paths in a scheduled tour that can be shared via a custom URL or QR code generated directly in the browser interface. Panels (a)-(d) demonstrate use cases for real-world datasets with different characteristics and dimensionality. (a) Ant trails (2D+t) from [18]. Bundling and colour coding (spatial orientation by mapping (x,y,z) to (R,G,B) values) indicate the major trails running in opposing directions. (b) GPS animal tracking data for two species (blue whales [19]—blue and arctic tern [20]—red) shown on a Mercator projection of the earth’s surface. For better orientation, the outlines of the continents are included in the visualisation as axes that dynamically adapt to the projection and viewpoint changes (2D surface data + t). (c) Brain tractography data showing major white matter connectivity from diffusion MRI (3D). The spatial selection highlights the left hemisphere, while anatomical context is provided by the outline of the entire brain (from mesh data) and the defocused tracts of the right hemisphere. (d) Cell movements during the elongation process of zebrafish blastoderm explants (3D+t) [21]. Bundling, colour coding, and spatial selection highlight collective cell movements as the explant starts elongating, focusing on a subpopulation of cells driving this process. The colour code shows time from early (yellow) to late (red) for selected tracks.
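The orientation colour coding in (a) maps a track segment's normalised direction vector (x,y,z) to (R,G,B). A minimal sketch of that idea, assuming a simple rescaling of each component from [-1, 1] to [0, 1] (the function name is hypothetical, not part of linus):

```python
import math

def direction_to_rgb(p0, p1):
    """Map the direction of a track segment from p0 to p1 to an (R, G, B) triple.

    The segment direction is normalised to unit length, then each
    component in [-1, 1] is rescaled to [0, 1]. Opposing directions
    thus receive complementary colours.
    """
    d = [b - a for a, b in zip(p0, p1)]
    norm = math.sqrt(sum(c * c for c in d)) or 1.0  # guard zero-length segments
    return tuple((c / norm + 1.0) / 2.0 for c in d)
```

Under this scheme, a trail heading in +x appears as (1.0, 0.5, 0.5) and the opposing -x trail as (0.0, 0.5, 0.5), which makes the two directions in panel (a) visually separable.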