[Paper] Physics-Conditioned Synthesis of Internal Ice-Layer Thickness for Incomplete Layer Traces

Published: April 22, 2026 at 01:10 PM EDT
5 min read
Source: arXiv

Overview

Radar surveys of polar ice sheets produce internal layer traces that are crucial for reconstructing past snowfall and ice flow. However, these traces are often riddled with gaps—segments disappear, whole layers vanish—because of sensor noise, limited resolution, and signal loss. The paper by Liu & Rahnemoonfar tackles this “layer‑completion” problem head‑on: they train a neural network that fills in missing ice‑layer thicknesses by leveraging both the geometry of the observed radar data and auxiliary physical climate information.

Key Contributions

  • Physics‑conditioned synthesis: Introduces a model that jointly consumes radar‑derived geometry and climate‑model features (e.g., temperature, accumulation rates) to generate plausible thickness values for missing layers.
  • Hybrid geometric‑temporal architecture: Combines a graph‑based geometric encoder (for intra‑layer spatial context) with a transformer‑style temporal module (for inter‑layer consistency across the stratigraphic stack).
  • Mask‑aware robust regression loss: Designs a loss that evaluates errors only on observed thickness entries, automatically normalizing for varying sparsity and avoiding any ad‑hoc imputation.
  • Preservation‑first inference: Guarantees that any measured thickness is left untouched; the network only predicts where data are absent, yielding seamless reconstructions of fragmented or entirely missing layers.
  • Downstream pre‑training benefit: Shows that the synthesized complete thickness stacks can be used to pre‑train a deep‑layer predictor, leading to measurable accuracy gains when fine‑tuned on fully traced data.
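The preservation-first guarantee above amounts to a masked merge: wherever a measurement exists, copy it; otherwise take the model's prediction. A minimal sketch (the function name and NaN-as-missing convention are illustrative, not from the paper):

```python
import numpy as np

def preservation_first_merge(observed, predicted, mask):
    """Keep every measured thickness; use model output only at gaps.

    observed  : thickness array with NaN at missing entries
    predicted : model output of the same shape
    mask      : boolean array, True where a measurement exists
    """
    return np.where(mask, observed, predicted)

observed  = np.array([1.2, np.nan, 0.8, np.nan])
predicted = np.array([1.1, 0.9, 0.7, 0.6])
mask      = ~np.isnan(observed)
print(preservation_first_merge(observed, predicted, mask))
# → [1.2 0.9 0.8 0.6]  (measured entries untouched, gaps filled)
```

Because observed values pass through unchanged, any error metric computed on measured points reflects only where the model was actually asked to predict.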

Methodology

  1. Data preparation

    • Radar‑derived layer traces are represented as a set of partially observed thickness vectors (one per layer) aligned along the flight line.
    • Physical climate model outputs (e.g., surface temperature, precipitation, modeled accumulation) are sampled at the same spatial locations and concatenated as auxiliary features.
  2. Model architecture

    • Geometric encoder: A graph neural network (GNN) treats each radar trace point as a node, connecting neighboring points within the same layer. This captures local spatial patterns such as curvature or roughness.
    • Temporal transformer: The GNN‑produced node embeddings are fed into a transformer that attends across the layer dimension, allowing information from well‑observed shallow layers to influence deeper, missing ones. Positional encodings carry the depth index, encouraging monotonic thickness evolution.
  3. Training objective

    • A mask‑aware robust regression loss computes the L1/L2 error only on entries where ground‑truth thickness is present. The loss is divided by the count of valid entries per sample, preventing bias toward densely observed regions.
    • An optional physics regularizer penalizes deviations from known physical relationships (e.g., thickness should increase with cumulative snowfall).
  4. Inference

    • The trained network receives the incomplete thickness map plus climate features and outputs a full thickness stack. Observed values are copied directly; only missing entries are replaced by the model’s predictions.
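The training objective in step 3 can be sketched in NumPy. This is a simplified illustration, not the authors' implementation: the per-sample normalization and the monotonicity penalty are plausible instantiations of the mask-aware loss and the optional physics regularizer described above.

```python
import numpy as np

def mask_aware_l1(pred, target, mask):
    """L1 error over observed entries only, normalized per sample.

    pred, target : (batch, layers, points) thickness arrays
    mask         : boolean, True where ground-truth thickness exists
    Dividing by each sample's valid count keeps sparsely observed
    profiles from being drowned out by densely observed ones.
    """
    err = np.abs(pred - target) * mask
    per_sample = err.sum(axis=(1, 2)) / np.maximum(mask.sum(axis=(1, 2)), 1)
    return per_sample.mean()

def monotonic_depth_penalty(pred):
    """One possible physics regularizer (illustrative): penalize thickness
    that decreases from one layer to the next along the depth axis."""
    return np.clip(-np.diff(pred, axis=1), 0, None).mean()

pred   = np.array([[[1.0, 2.0]]])
target = np.array([[[1.5, 0.0]]])   # second entry is a gap
mask   = np.array([[[True, False]]])
print(mask_aware_l1(pred, target, mask))  # → 0.5 (error on the one observed entry)
```

The masked entries contribute nothing to the gradient, so no ad-hoc imputation of the target is ever needed, which is exactly what makes the loss robust to varying sparsity.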

Results & Findings

| Metric | Incomplete baseline (nearest‑neighbor) | Proposed model | Gap‑filled downstream predictor |
|---|---|---|---|
| Mean Absolute Error (MAE) on held‑out observed points | 0.42 m | 0.21 m | 0.18 m (after pre‑training) |
| Structural similarity of reconstructed stratigraphy (SSIM) | 0.71 | 0.89 | – |
| Fully recovered missing layers | 27 % | 63 % | – |
  • The model dramatically reduces error on the observed points, confirming that the mask‑aware loss does not corrupt known data.
  • Reconstructed thickness profiles exhibit smooth, physically realistic evolution across depth, as visualized in several case studies from Greenland and Antarctica.
  • Using the synthesized complete stacks to pre‑train a deep‑layer thickness predictor yields a ~10 % relative improvement over training from scratch on the same fully traced dataset.

Practical Implications

  • Accelerated ice‑sheet modeling: Researchers can now generate dense thickness histories from sparse radar surveys, feeding richer inputs into ice‑flow simulators and improving sea‑level rise projections.
  • Cost‑effective field campaigns: Survey teams can accept lower‑resolution or noisier radar passes, knowing that a post‑processing step can reliably fill gaps, reducing flight time and operational expenses.
  • Data‑fusion pipelines: The physics‑conditioned approach demonstrates a blueprint for blending remote‑sensing observations with climate model outputs—a pattern that can be replicated for other geoscience variables (e.g., subsurface water content, permafrost depth).
  • Machine‑learning pre‑training: The synthetic thickness stacks serve as a large, self‑supervised dataset, enabling developers to pre‑train models for related tasks (e.g., layer classification, anomaly detection) without the need for exhaustive manual annotation.

Limitations & Future Work

  • Dependence on climate model fidelity: The quality of the synthesized thicknesses is tied to the accuracy of the auxiliary physical features; systematic biases in the climate model could propagate into the reconstructions.
  • Large contiguous gaps: While the method recovers many missing layers, performance degrades when entire large depth intervals lack any radar return (e.g., due to deep attenuation).
  • Scalability to global datasets: Training on continent‑scale radar mosaics will require distributed training strategies and memory‑efficient graph representations.
  • Future directions: The authors suggest integrating uncertainty quantification (e.g., Bayesian layers) to flag low‑confidence predictions, exploring multimodal inputs (e.g., laser altimetry), and extending the framework to jointly predict other stratigraphic attributes such as impurity layers or melt‑water channels.

Authors

  • Zesheng Liu
  • Maryam Rahnemoonfar

Paper Information

  • arXiv ID: 2604.20783v1
  • Categories: cs.LG
  • Published: April 22, 2026
  • PDF: Download PDF