From Images to Hydrologic Networks - Understanding the Arctic Landscape with Graphs

Tabea Rettelbach, Moritz Langer, Ingmar Nitze, Benjamin M. Jones, Veit Helm, Johann Christoph Freytag, Guido Grosse

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Remote sensing-based Earth observation plays an important role in assessing environmental changes across our planet. As an image-heavy domain, it relies strongly on statistical and pixel-based spatial analysis methods for data evaluation. However, considering the complexity of our Earth system, some environmental structures and dependencies cannot be accurately described with these traditional image analysis approaches. One example of such a limitation is the representation of (spatial) networks and their characteristics. In this study, we therefore propose a computer vision approach that enables representing semantic information gained from images as graphs. As an example, we investigate digital terrain models of Arctic permafrost landscapes with their characteristic polygonal patterned ground. These regular patterns, clearly visible in high-resolution image and elevation data, are formed by subsurface ice bodies that are highly vulnerable to rising temperatures in a warming Arctic. Observing these networks' topologies and metrics in space and time with graph analysis thus allows insights into the landscape's complex geomorphology, hydrology, and ecology, and helps to quantify how these systems interact with climate change. We show that results extracted with this analytical and highly automated approach are in line with those gathered in other manual studies or through manual validation. With this approach, we thus introduce a method that, for the first time, enables upscaling such terrain and network analysis to potentially pan-Arctic scales, where collecting in-situ field data is severely limited.
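To illustrate the kind of image-to-graph pipeline the abstract describes, the sketch below extracts a trough network from a digital terrain model (DTM) and represents it as a graph. This is a minimal illustration, not the authors' pipeline: the detrending window, the percentile threshold, and the dtm_to_graph helper are hypothetical choices made for the example, and the calls come from numpy, scipy, scikit-image, and networkx.

    # Minimal sketch (assumed workflow, not the published method): turn a
    # DTM of polygonal patterned ground into a trough-network graph.
    import numpy as np
    import networkx as nx
    from scipy import ndimage
    from skimage.morphology import skeletonize

    def dtm_to_graph(dtm: np.ndarray, percentile: float = 20.0) -> nx.Graph:
        # Detrend with a broad mean filter so only local depressions remain.
        trend = ndimage.uniform_filter(dtm, size=51)
        residual = dtm - trend
        # Hypothetical rule: the lowest `percentile` percent of residuals
        # are treated as troughs between ice-wedge polygons.
        troughs = residual < np.percentile(residual, percentile)
        # Thin the trough mask to a one-pixel-wide network skeleton.
        skeleton = skeletonize(troughs)
        # Build a graph: skeleton pixels are nodes, 8-neighbours share an edge.
        graph = nx.Graph()
        rows, cols = np.nonzero(skeleton)
        pixels = set(zip(rows.tolist(), cols.tolist()))
        for r, c in pixels:
            graph.add_node((r, c))
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    if (dr or dc) and (r + dr, c + dc) in pixels:
                        graph.add_edge((r, c), (r + dr, c + dc))
        return graph

    if __name__ == "__main__":
        # Synthetic stand-in for a real DTM tile.
        rng = np.random.default_rng(0)
        dtm = ndimage.gaussian_filter(rng.normal(size=(200, 200)), sigma=5)
        g = dtm_to_graph(dtm)
        # Network metrics of the kind the abstract tracks in space and time.
        print("nodes:", g.number_of_nodes(), "edges:", g.number_of_edges())
        print("connected components:", nx.number_connected_components(g))

Treating 8-connected skeleton pixels as edges keeps the sketch short; an operational pipeline would typically contract degree-2 pixel chains so that nodes correspond to trough junctions and edges to whole trough segments before computing topology metrics.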

Original language: English
Title of host publication: Scientific and Statistical Database Management - 34th International Conference, SSDBM 2022 - Proceedings
Editors: Elaheh Pourabbas, Yongluan Zhou, Yuchen Li, Bin Yang
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450396677
DOIs
Publication status: Published - 6 Jul 2022
Externally published: Yes
Event: 34th International Conference on Scientific and Statistical Database Management, SSDBM 2022 - Copenhagen, Denmark
Duration: 6 Jul 2022 - 8 Jul 2022

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 34th International Conference on Scientific and Statistical Database Management, SSDBM 2022
Country/Territory: Denmark
City: Copenhagen
Period: 6/07/22 - 8/07/22

Bibliographical note

Funding Information:
We would like to begin by acknowledging and thanking the Iñupiat, on whose traditional territory we were able to collect data and conduct research for this manuscript. Financially, this work was supported by Geo.X, the Research Network for Geosciences in Berlin and Potsdam. T.R. acknowledges the support by the Helmholtz Einstein International Berlin Research School in Data Science (HEIBRiDS). B.M.J. was supported by a grant from the US National Science Foundation (NSF OIA-1929170). We acknowledge support by the Open Access Publication Funds of the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research. We further thank Martin Gehrmann, Maximilian Stöhr, Matthias Gessner, Torsten Sachs, and the Kenn Borek Air pilot crew, who supported data acquisition during the airborne flight campaign with AWI’s Polar-5 research airplane. Many thanks also go to Brian Groenke for his helpful comments and to Blake Robert Mills for introducing the MetBrewer color palette.

Publisher Copyright:
© 2022 Owner/Author.

Keywords

  • computer vision
  • digital terrain models
  • graph analysis
  • spatial data
