Bringing Ocean Data to Life with DeepSee's 3D Interpolation

by @oceanography


Too Long; Didn't Read

DeepSee’s Interpolation View uses 3D interpolation to map ocean data in latitude, longitude, and depth, visualizing gradients, uncertainties, and core relationships in real time.

Abstract and 1 Introduction

2 Related Work

3 Methodology

4 Studying Deep Ocean Ecosystem and 4.1 Deep Ocean Research Goals

4.2 Workflow and Data

4.3 Design Challenges and User Tasks

5 The DeepSee System

  • 5.1 Map View
  • 5.2 Core View

5.3 Interpolation View and 5.4 Implementation

6 Usage Scenarios and 6.1 Scenario: Pre-Cruise Planning

  • 6.2 Scenario: On-the-Fly Decision-Making

7 Evaluation and 7.1 Cruise Deployment

7.2 Expert Interviews

7.3 Limitations

7.4 Lessons Learned

8 Conclusions and Future Work, Acknowledgments, and References

5.3 Interpolation View

The Interpolation View (Fig. 5) visualizes sample-level (parameter) data by interpolating values between cores in three dimensions (latitude, longitude, and depth) and rendering the result as a three-dimensional object colored by the parameter value at each point in space. Our visualization technique extends to any 3D interpolation method that provides four attributes (𝑥, 𝑦, 𝑧, and a value) for each point observation. Interpolations run in real time, allowing scientists to rapidly investigate different predictions about the environment below the seafloor, where future sampling might yield the greatest return on investment.
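As a minimal sketch of the idea (not DeepSee's actual code), any set of scattered (𝑥, 𝑦, 𝑧, value) observations can be interpolated with SciPy's linear barycentric interpolator, the same family of method the system uses; the sample coordinates and values below are hypothetical:

```python
# Interpolating scattered (lat, lon, depth, value) observations.
# All data here are illustrative, not real core measurements.
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Four hypothetical cores: latitude, longitude, depth (m).
points = np.array([
    [0.0, 0.0, 0.00],
    [1.0, 0.0, 0.05],
    [0.0, 1.0, 0.10],
    [1.0, 1.0, 0.02],
])
values = np.array([3.2, 4.8, 1.1, 2.5])  # e.g., a geochemical parameter

interp = LinearNDInterpolator(points, values)

# Query any point inside the convex hull of the observations.
print(interp(0.5, 0.5, 0.04))
# Linear interpolation cannot extrapolate beyond sampled regions:
print(interp(5.0, 5.0, 0.5))  # outside the hull -> nan
```

Evaluating the interpolator over a whole grid of query points at once is what makes real-time exploration feasible.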


DeepSee renders 3D interpolation data as space-filling volumetric pixels (voxels) in a 3D space. For example, at an 𝑥 cm grid size, we represent an interpolated value at a given latitude/longitude/depth as a rectangular prism with a width/length of 𝑥 cm and a height of 1 cm, matching the depth horizon measurements of real cores. We currently support fast 3D linear barycentric interpolation and a 3D discretized approximation of Sibson’s natural-neighbors interpolation [39] (T11). While the research scientists expressed concerns that linear and natural-neighbors gradients are unlikely to reflect spatially heterogeneous in-situ environmental processes, they were optimistic about visualizing these methods as a proof of concept. Linear interpolation offers a simple explanatory model with little complexity and no ability to interpolate outside of sampled regions. Natural-neighbors is a smoother approximation of nearest-neighbors interpolation with a space-filling property; i.e., outside the space between observations, values decay at the rate of the gradient at the boundary. Additionally, the grid sizes offered were specifically chosen to mirror the real-life size of cores (7 cm diameter) (T7). Because interpolations are computationally expensive, especially at 7 cm resolution across tens to hundreds of meters, we provide larger grid sizes in 7 cm increments.


Figure 5: The Interpolation View visualizes parameter values interpolated in three dimensions (latitude/longitude/depth) between cores. We provide several reconfiguration interactions (A) to help users make sense of an unfamiliar data representation. Users can update the view on the fly in several ways: swapping between standard color palettes and Value-Suppressing Uncertainty Palettes (VSUPs) [11] (B); changing the interpolation method and/or grid size (C); and clipping through the interpolation (D). Finally, users can select an interpolated core to export as JSON (E).
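The voxel discretization can be sketched as follows, assuming hypothetical survey extents; the grid spacing is a multiple of the 7 cm core diameter and the vertical resolution matches the 1 cm core horizons:

```python
# Sketch of the voxel grid described above (names and extents are
# illustrative, not DeepSee's actual implementation).
import numpy as np

GRID_CM = 7 * 4    # horizontal grid size: a multiple of the 7 cm core diameter
HORIZON_CM = 1     # vertical resolution matches 1 cm core depth horizons

def voxel_centers(x_extent_cm, y_extent_cm, depth_cm):
    """Return the (x, y, z) center of every voxel in a survey region."""
    xs = np.arange(GRID_CM / 2, x_extent_cm, GRID_CM)
    ys = np.arange(GRID_CM / 2, y_extent_cm, GRID_CM)
    zs = np.arange(HORIZON_CM / 2, depth_cm, HORIZON_CM)
    gx, gy, gz = np.meshgrid(xs, ys, zs, indexing="ij")
    return np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])

# One row per voxel, ready to pass to an interpolator as query points.
centers = voxel_centers(x_extent_cm=280, y_extent_cm=280, depth_cm=10)
print(centers.shape)
```

Coarser grids (larger `GRID_CM`) shrink the number of query points quadratically in the horizontal plane, which is the lever the system uses to keep interpolation interactive.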


The Interpolation View is built with Three.js (Sect. 5.4), and we use its built-in camera system with OrbitControls. Using a mouse, users can left-click and drag to freely rotate the voxels with a stationary camera angle, right-click and drag to pan the camera in the latitude/longitude plane, and scroll the mouse wheel to zoom in and out. We built additional features that let users reconfigure the view in several ways (T7) using buttons (Fig. 5A): (1) extending the height of each voxel from 1 cm to 10 cm, making patterns in depth much easier to see; (2) showing/hiding all interpolated values to show only actual sample data; and (3) changing voxels from rectangular prisms to cylinders at a fixed 7 cm diameter, mirroring the shape of real-life cores. These visual aids helped quickly orient new users to this projected visualization method.


The research scientists also expressed an interest in seeing where interpolations were less “certain”. In response, we implemented variations on the six color palettes from the Core View based on Value-Suppressing Uncertainty Palettes (VSUPs) [11] (Fig. 5B). Whereas standard color palettes apply a color to a given voxel based on a single parameter value, VSUPs take in an additional uncertainty parameter and suppress the color as a function of the amount of uncertainty in the value (T11). In DeepSee, we normalized the linear distance from each voxel to the nearest non-interpolated voxel as uncertainty; thus, interpolated values farther from a known sample value appear suppressed, signaling a lack of confidence in the interpolation at that location. Like the interpolation methods, our choice of uncertainty measure (distance to the nearest sample) served as a proof of concept for integrating more scientifically accurate measures in the future. The color palette can be changed on the fly while preserving the camera angle and visual embellishments.
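The distance-based uncertainty measure can be sketched as below. The suppression function is a crude stand-in for a true VSUP (which also quantizes the joint value/uncertainty space into a wedge-shaped palette); all names and data are hypothetical:

```python
# Normalized distance-to-nearest-sample as an uncertainty measure,
# plus a simple color-suppression function. Illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def distance_uncertainty(voxel_centers, sample_locations):
    """Normalized [0, 1] distance from each voxel to the nearest sample."""
    tree = cKDTree(sample_locations)
    dists, _ = tree.query(voxel_centers)
    return dists / dists.max() if dists.max() > 0 else dists

def suppress(rgb, uncertainty):
    """Fade a color toward gray as uncertainty grows."""
    gray = np.array([0.5, 0.5, 0.5])
    return (1 - uncertainty) * np.asarray(rgb) + uncertainty * gray

samples = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
voxels = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [20.0, 0.0, 0.0]])
u = distance_uncertainty(voxels, samples)

# A voxel at a sample location has zero uncertainty: full-strength color.
print(suppress([1.0, 0.0, 0.0], u[0]))
```

Swapping `distance_uncertainty` for a more scientifically grounded measure (e.g., an interpolator's own error estimate) would leave the rendering side unchanged, which is the extensibility the proof of concept aims for.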


We provide several on-the-fly capabilities to help users explore interpolations while maintaining spatial context. Users can change the interpolation method and grid size (Fig. 5C) while the camera angle and visual embellishments persist, helping users maintain context and orientation while immediately seeing changes in gradients (T7). For example, a scientist could quickly switch between interpolations of taxonomic and geochemical attributes at various grid sizes to identify relationships between them and find locations with interesting gradients to sample in future dives (T1, T2, T8). Users can also clip through the voxels (Fig. 5D) to see patterns in the data in the interior of the interpolated region. We provide two clipping methods: (1) clipping individual voxels by their interpolation value outside of a range chosen by the user using a double-ended range slider; and (2) clipping by latitude/longitude/depth using a single-ended range slider. In (2), cuts are taken only from the intersection of all three clipping planes, and we also ensure users can flip the clipping plane to view cuts from any direction. Finally, if a user is interested in the exact parameter values at a specific latitude/longitude, they can hover over the object and select a vertical core to save (Fig. 5E) (T8). The results are output in JSON as a list of parameter values at each 1 cm horizon, matching the exact format of a real core sample. These results can then be further analyzed, e.g., in future dives by collecting new cores, deriving sample values, and comparing them with the interpolated values at the same latitude/longitude given by DeepSee (T9).
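The two clipping modes and the JSON core export can be sketched as boolean masks over a voxel table; the data layout and field names here are hypothetical, not DeepSee's actual schema:

```python
# Sketch of value clipping, positional clipping, and virtual-core
# export as JSON. All data and field names are illustrative.
import json
import numpy as np

# One row per voxel: lat, lon, depth (cm), interpolated value.
voxels = np.array([
    [0.0, 0.0, 0.5, 3.1],
    [0.0, 0.0, 1.5, 2.7],
    [0.0, 0.0, 2.5, 1.9],
    [7.0, 0.0, 0.5, 4.2],
])
lat, lon, depth, val = voxels.T

# (1) Clip by value: keep only voxels inside a user-chosen range.
value_mask = (val >= 2.0) & (val <= 4.0)

# (2) Clip by position: keep the intersection of the half-spaces chosen
# on each axis (flipping a comparison cuts from the opposite direction).
space_mask = (lat <= 3.5) & (lon <= 3.5) & (depth <= 3.0)

visible = voxels[value_mask & space_mask]

# Export a vertical "virtual core" at a chosen lat/lon as JSON: a list
# of parameter values at each 1 cm horizon, like a real core sample.
core = voxels[(lat == 0.0) & (lon == 0.0)]
core_json = json.dumps(
    [{"depth_cm": float(d), "value": float(v)} for d, v in core[:, 2:]]
)
print(core_json)
```

Because both clipping modes are just masks over the same voxel table, they compose freely with each other and with the palette and grid-size controls.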


Figure 6: Two usage scenarios showing how DeepSee helps users make the most of limited sample data by supporting real-time annotation and interpolation in the tool, with data from Speth et al. [45]. In (A), our user maximized information between map and tabular data to determine where high-value cores containing both ANME-2c and JS1 are most likely to occur on future dives. In (B), during an on-the-fly decision-making scenario where the seafloor changed over time, the Interpolation View created a data-driven opportunity to correct course by interpolating unseen sulfide values and helping users find new targets.

5.4 Implementation

DeepSee is an open-source[4] Vue.js Single-Page Application, which provides the UI framework and styling. We plot cores on top of PNG map backgrounds in the Map View and draw bar charts in the Core View using D3.js [6]. To compute interpolations, we used SciPy’s N-D piecewise linear barycentric interpolator [46] and a Python implementation of a 3D discretized approximation of Sibson’s natural-neighbors interpolation [39]. We then used Three.js to create a traditional 3D render scene with global lighting in the Interpolation View. We also wanted DeepSee to be accessible while doing fieldwork in remote locations, so we used Electron.js to package a portable web browser environment as a standalone desktop executable for both Windows and Mac operating systems that can run with or without access to the internet.


This paper is available on arxiv under CC BY 4.0 DEED license.


[4] DeepSee code: https://github.com/orphanlab/DeepSee


Authors:

(1) Adam Coscia, Georgia Institute of Technology, Atlanta, Georgia, USA (acoscia6@gatech.edu);

(2) Haley M. Sapers, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA (hsapers@caltech.edu);

(3) Noah Deutsch, Harvard University, Cambridge, Massachusetts, USA (ndeutsch@mde.harvard.edu);

(4) Malika Khurana, The New York Times Company, New York, New York, USA (malika.khurana@nytimes.com);

(5) John S. Magyar, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA (jmagyar@caltech.edu);

(6) Sergio A. Parra, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA (sparra@caltech.edu);

(7) Daniel R. Utter, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA (dutter@caltech.edu);

(8) Rebecca Wipfler, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA (rwipfler@caltech.edu);

(9) David W. Caress, Monterey Bay Aquarium Research Institute, Moss Landing, California, USA (caress@mbari.org);

(10) Eric J. Martin, Monterey Bay Aquarium Research Institute, Moss Landing, California, USA (emartin@mbari.org);

(11) Jennifer B. Paduan, Monterey Bay Aquarium Research Institute, Moss Landing, California, USA (paje@mbari.org);

(12) Maggie Hendrie, ArtCenter College of Design, Pasadena, California, USA (maggie.hendrie@artcenter.edu);

(13) Santiago Lombeyda, California Institute of Technology, Pasadena, California, USA (santiago@caltech.edu);

(14) Hillary Mushkin, California Institute of Technology, Pasadena, California, USA (hmushkin@caltech.edu);

(15) Alex Endert, Georgia Institute of Technology, Atlanta, Georgia, USA (endert@gatech.edu);

(16) Scott Davidoff, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California, USA (scott.davidoff@jpl.nasa.gov);

(17) Victoria J. Orphan, Division of Geological and Planetary Sciences, California Institute of Technology, Pasadena, California, USA (vorphan@caltech.edu).