PedestalPredictor ONNX bundles

Five ONNX exports of the same shared PedestalModel architecture (MSE + FPE encoders) trained on DIII-D shot data. Each subdirectory is a fully self-contained bundle: ONNX graphs plus normalization, target, and provenance sidecars.

Quick start (recommended: PedestalEnsemble wrapper)

from inference.ensemble import PedestalEnsemble

ens = PedestalEnsemble.from_huggingface(
    "SCS-Lab/pedestal-predictor-onnx"
)
out = ens.predict_one(
    history_stats=...,     # (50, 458) float32; v1 bundles slice to 446
    history_masks=...,     # (50,)
    aux_features=...,      # (3,)
    sequences_raw=...,     # (T, 32) — raw physical units
    signal_masks=...,      # (32,)
)
print(out.te_ped, out.ti_ped, out.t_rot_ped,
      out.edens_ped, out.is_h_mode, out.h_mode_prob)

The wrapper ships in the PedestalPredictor GitHub repo. It loads all five bundles via the manifest.json at this repo's root, applies each bundle's FPE normalization to the raw physical-unit inputs, slices the MSE history to each bundle's expected width (446 or 458), and returns a typed dataclass with all five predictions.
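The width handling can be reproduced by hand. A minimal sketch, assuming the 446-wide bundles use a leading subset of the 458 history features — the real column selection lives in the wrapper's source, so check there if the ordering matters:

```python
import numpy as np

def slice_history(history_stats: np.ndarray, width: int) -> np.ndarray:
    """Trim the MSE history to a bundle's expected feature width.

    Assumption: the 446-wide bundles take the first 446 of the 458
    features; the actual column selection is the wrapper's concern.
    """
    if history_stats.shape[-1] < width:
        raise ValueError(
            f"history has {history_stats.shape[-1]} features, need {width}"
        )
    return history_stats[..., :width].astype(np.float32)

hist = np.zeros((50, 458), dtype=np.float32)
print(slice_history(hist, 446).shape)  # (50, 446)
```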

Quick start (advanced: raw per-bundle ONNX)

from huggingface_hub import snapshot_download
import onnxruntime as ort, json

local = snapshot_download(repo_id="SCS-Lab/pedestal-predictor-onnx",
                          allow_patterns=["te_ped_89/*"])
mse = ort.InferenceSession(f"{local}/te_ped_89/mse_encoder.onnx")
fpe = ort.InferenceSession(f"{local}/te_ped_89/fpe_encoder.onnx")
cfg = json.load(open(f"{local}/te_ped_89/model_config.json"))
# ... feed MSE history + FPE sequences; see te_ped_89/README.md
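When bypassing the wrapper, the FPE normalization must be applied by hand before feeding sequences to the FPE graph. A sketch of per-channel z-scoring, assuming the normalization sidecar provides two length-32 `mean`/`std` arrays — the actual sidecar filename and schema are bundle-specific, so consult the per-bundle README:

```python
import numpy as np

def normalize_sequences(seq_raw: np.ndarray, mean, std) -> np.ndarray:
    """Z-score raw physical-unit sequences of shape (T, 32), channel-wise.

    `mean`/`std` come from the bundle's FPE normalization sidecar; the
    schema assumed here (two length-32 arrays) is an illustration only.
    """
    mean = np.asarray(mean, dtype=np.float32)
    std = np.asarray(std, dtype=np.float32)
    # Guard against zero-variance channels.
    return ((seq_raw - mean) / np.where(std > 0, std, 1.0)).astype(np.float32)

seq = np.ones((100, 32), dtype=np.float32)
normed = normalize_sequences(seq, mean=np.ones(32), std=np.full(32, 2.0))
print(normed[0, 0])  # 0.0
```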

Bundles

Bundle        Task            Target              MSE history  FPE dim  Notes
hmode_89      classification  hmode               446          32       threshold=0.5
te_ped_89     regression      te_ped (keV)        458          32       μ=0.516, σ=0.410
ti_ped_89     regression      ti_ped (keV)        458          32       μ=0.902, σ=0.654
t_rot_ped_89  regression      t_rot_ped (krad/s)  458          32       μ=17.190, σ=14.376
edensfit89    regression      edens_ped           446          32       μ=2.580, σ=1.606
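For the regression bundles, the μ/σ pairs are the target-normalization statistics; if a graph emits a normalized prediction z, the physical value is recovered as y = μ + σ·z. A hedged sketch — whether each ONNX head outputs normalized or physical units should be confirmed against the bundle's target sidecar before using this:

```python
# (mu, sigma) pairs from the bundle table.
TARGET_NORM = {
    "te_ped": (0.516, 0.410),       # keV
    "ti_ped": (0.902, 0.654),       # keV
    "t_rot_ped": (17.190, 14.376),  # krad/s
    "edens_ped": (2.580, 1.606),
}

def denormalize(target: str, z: float) -> float:
    """Map a normalized prediction z back to physical units."""
    mu, sigma = TARGET_NORM[target]
    return mu + sigma * z

print(denormalize("te_ped", 0.0))  # 0.516
print(denormalize("te_ped", 1.0))  # ≈ 0.926
```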

manifest.json

The root-level manifest.json lists every bundle's dataset_version, task, target, default threshold, and sidecar file list. The PedestalEnsemble wrapper reads this manifest as its bootstrap contract; direct consumers of the ONNX graphs can use it to auto-discover new bundles.
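A sketch of bundle auto-discovery from the manifest. The schema shown here (a `bundles` list with `name`, `task`, `target`, and `threshold` fields) is hypothetical for illustration; match field names against the actual manifest.json before relying on them:

```python
import json

# Hypothetical manifest shape, for illustration only.
manifest_text = """
{
  "bundles": [
    {"name": "hmode_89", "task": "classification", "target": "hmode",
     "threshold": 0.5},
    {"name": "te_ped_89", "task": "regression", "target": "te_ped"}
  ]
}
"""

manifest = json.loads(manifest_text)
# Select only the regression bundles by task type.
regression = [b["name"] for b in manifest["bundles"] if b["task"] == "regression"]
print(regression)  # ['te_ped_89']
```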

Provenance

Each bundle's provenance.json records:

  • bundle_name, task_type, target_name, dataset_version
  • source_trial_dir and checkpoint path on the training cluster
  • torch_version, onnx_version, opset_version
  • git_sha of the export-time commit in the GitHub repo
  • fpe_normalization_source (+ sha256), target_norm_source (+ sha256)
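The recorded sidecar digests allow an integrity check after download. A minimal sketch, assuming each recorded digest is a hex-encoded SHA-256 of the sidecar file (the exact key names inside provenance.json may differ from the labels above):

```python
import hashlib

def sha256_of(path: str, chunk: int = 1 << 20) -> str:
    """Hex SHA-256 of a file, read in chunks to bound memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while data := f.read(chunk):
            h.update(data)
    return h.hexdigest()

def verify_sidecar(path: str, expected_sha256: str) -> bool:
    """Compare a local sidecar's digest against the provenance record."""
    return sha256_of(path) == expected_sha256.lower()
```

Both the sidecar path and the provenance key you pass in are bundle-specific; nothing here assumes a particular filename.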

Breaking path change

Pre-monorepo releases put mse_encoder.onnx and fpe_encoder.onnx at the repo root; they now live under edensfit89/. Update any direct hf_hub_download calls accordingly; see the bottom of edensfit89/README.md for the migration snippet.
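If old call sites cannot be updated all at once, a small path-rewriting shim keeps them working. The old→new mapping below reflects only the move described above; apply it to filenames before passing them to hf_hub_download:

```python
# Root-level files that moved into the edensfit89/ bundle directory.
_MOVED = {
    "mse_encoder.onnx": "edensfit89/mse_encoder.onnx",
    "fpe_encoder.onnx": "edensfit89/fpe_encoder.onnx",
}

def migrate_path(filename: str) -> str:
    """Map a pre-monorepo root path to its new bundle-relative path.

    Paths that never lived at the root pass through unchanged.
    """
    return _MOVED.get(filename, filename)

print(migrate_path("mse_encoder.onnx"))  # edensfit89/mse_encoder.onnx
print(migrate_path("te_ped_89/mse_encoder.onnx"))  # unchanged
```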

License

All bundles: Apache 2.0.
