Selected Work

A few representative projects showing how I design, implement, and validate ML systems—especially for inference, uncertainty, and real-world noise.

Score-Based Likelihood Characterization (SLIC)

NeurIPS 2023 · Robust inference with non-Gaussian noise

  • Built a score-based framework to model complex likelihoods directly from data (diffusion / score networks).
  • Validated on real telescope imaging affected by structured artifacts; the method enables calibrated posterior inference when Gaussian noise assumptions fail.
  • Diffusion / Score Models
  • Bayesian Inference
  • Posterior Calibration
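The core mechanic can be sketched in a few lines: once a score network s(x) ≈ ∇ₓ log p(x) is learned, a Langevin-style sampler needs nothing beyond that gradient. This is a toy sketch only, not the paper's code; a hand-written Gaussian score stands in for the learned network, and all names are illustrative.

```python
import numpy as np

def gaussian_score(x, mu=0.0, sigma=1.0):
    """Stand-in for a learned score network: grad_x log N(x; mu, sigma^2)."""
    return -(x - mu) / sigma ** 2

def langevin_sample(score_fn, x0, n_steps=2000, step=0.01, seed=0):
    """Unadjusted Langevin dynamics driven only by the score function."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x = x + 0.5 * step * score_fn(x) + np.sqrt(step) * rng.standard_normal(x.shape)
    return x

# 5000 parallel chains; their empirical moments should approach N(0, 1).
samples = langevin_sample(gaussian_score, np.zeros(5000))
```

Swapping `gaussian_score` for a network trained on real noise realizations is what turns this into a non-Gaussian likelihood model.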

Artifact removal in real JWST imaging

TEMPLATES Collaboration · Deep learning for structured noise

  • Developed ML models to remove striped noise artifacts from real JWST images of strongly lensed galaxies.
  • Focused on high-fidelity recovery for downstream scientific measurement (not just visual denoising).
  • Image Restoration
  • Robust Evaluation
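To make "striped noise" concrete, here is a classical destriping baseline, per-row median subtraction, on a synthetic image; the data and numbers are illustrative, not JWST imaging.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy image: a compact source + per-row stripe offsets + pixel noise.
ny, nx = 64, 64
yy, xx = np.mgrid[0:ny, 0:nx]
signal = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 200.0)
stripes = rng.normal(0.0, 0.5, size=(ny, 1))  # constant offset per row
noisy = signal + stripes + rng.normal(0.0, 0.05, size=(ny, nx))

# Baseline: subtracting the per-row median absorbs the stripe offset.
destriped = noisy - np.median(noisy, axis=1, keepdims=True)

rmse_before = np.sqrt(np.mean((noisy - signal) ** 2))
rmse_after = np.sqrt(np.mean((destriped - signal) ** 2))
```

The baseline also subtracts part of any extended source crossing a row, which biases photometry; that failure mode is one reason learned restoration is attractive when the goal is downstream measurement rather than visual denoising.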

Selection-bias correction for population inference

ICML 2022 · Hierarchical inference with neural selection modeling

  • Addressed survey selection bias when ML models are used as discovery pipelines.
  • Developed an unbiased population-level inference method via hierarchical modeling + learned selection correction.
  • Hierarchical Bayes
  • Bias Correction
  • Density Estimation
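The essence of a selection correction can be shown with inverse-probability weighting on a toy population; the logistic selection function below is an assumption for illustration only, standing in for the learned selection model in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Population with known truth: detection probability grows with the
# observable, so a naive average over detected objects is biased.
population = rng.normal(0.0, 1.0, 200_000)
p_detect = 1.0 / (1.0 + np.exp(-2.0 * population))  # logistic selection
detected = population[rng.random(population.size) < p_detect]

naive_mean = detected.mean()  # biased upward

# Correction: weight each detected object by 1 / p(detect | value).
w = 1.0 + np.exp(-2.0 * detected)  # = 1 / p_detect at the detected values
corrected_mean = np.average(detected, weights=w)
```

In the real setting the selection function is not known analytically; a neural model of it plays the role of `p_detect` inside a hierarchical model.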

Simulation-based inference pipelines (SBI)

NeurIPS 2021 · Density estimation for fast posteriors

  • Built SBI workflows to produce full posterior distributions over complex physical parameters, without restrictive analytic likelihood assumptions.
  • Designed evaluation & diagnostics to quantify uncertainty and failure modes.
  • Amortized Inference
  • Uncertainty Quantification
  • Neural Density Estimation
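To make "simulation-based inference" concrete, here is a minimal likelihood-free posterior via rejection sampling on a toy Gaussian simulator. The actual pipelines use neural density estimators rather than rejection; every name and number below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulator(theta, rng, n=20):
    """Toy forward model: n noisy measurements per parameter value."""
    theta = np.atleast_1d(theta)
    return theta[:, None] + rng.standard_normal((theta.size, n))

# Observed data generated at a known ground truth.
theta_true = 1.5
x_obs = simulator(theta_true, rng)[0]

# Likelihood-free posterior: sample the prior, simulate, and keep draws
# whose summary statistic (the sample mean) matches the observed one.
prior_draws = rng.uniform(-5.0, 5.0, size=50_000)
summaries = simulator(prior_draws, rng).mean(axis=1)
accepted = prior_draws[np.abs(summaries - x_obs.mean()) < 0.1]

posterior_mean = accepted.mean()
```

Amortized neural SBI replaces the accept/reject step with a density estimator trained on (theta, x) pairs, so new observations cost one forward pass instead of fresh simulations.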

ML surrogate modeling for turbulence

Neural networks for subgrid-scale physics

  • Trained neural models to approximate unresolved turbulent physics (the subgrid stress tensor), reaching ~99.5% accuracy on benchmark evaluations.
  • Goal: enable faster and more reliable simulation at reduced resolution for downstream forecasting/optimization.
  • Surrogate Modeling
  • Scientific ML
  • Regression
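The surrogate-modeling pattern in miniature: a cheap fit stands in for the neural network, and an analytic function stands in for the high-resolution simulation (both are illustrative assumptions, not the actual physics).

```python
import numpy as np

def expensive_model(x):
    """Stand-in for an unresolved-physics quantity that would normally
    come from an expensive high-resolution simulation."""
    return np.sin(2.0 * x) * np.exp(-0.5 * x ** 2)

# Fit a cheap polynomial surrogate on a modest training set.
x_train = np.linspace(-1.5, 1.5, 200)
coeffs = np.polyfit(x_train, expensive_model(x_train), deg=9)
surrogate = np.poly1d(coeffs)

# Check fidelity on held-out points, as one would for a neural surrogate.
x_test = np.linspace(-1.4, 1.4, 97)
max_err = np.max(np.abs(surrogate(x_test) - expensive_model(x_test)))
```

The workflow is the same at scale: fit a fast approximator on expensive simulation output, then validate it on held-out inputs before trusting it inside a coarser simulation.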

Engineering: pipeline acceleration

HERA collaboration · 6× speedup in official pipeline components

  • Implemented optimized code paths that accelerated an anomaly-detection workflow; the changes were evaluated and later integrated into the collaboration's official pipeline.
  • Experience with profiling, algorithmic optimization, and making research code production-ready for teams.
  • Performance
  • Reproducibility
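A representative micro-optimization of the kind this work involved: replacing a Python loop with a vectorized NumPy expression while asserting identical outputs. The flagging function is illustrative, not HERA code.

```python
import numpy as np

def robust_flags_loop(data, threshold=4.0):
    """Naive anomaly flags: robust z-score computed in a Python loop."""
    med = np.median(data)
    mad = np.median(np.abs(data - med))
    return np.array([abs(x - med) / (1.4826 * mad) > threshold for x in data])

def robust_flags_vec(data, threshold=4.0):
    """The same flags as a single vectorized NumPy expression."""
    med = np.median(data)
    mad = np.median(np.abs(data - med))
    return np.abs(data - med) / (1.4826 * mad) > threshold

rng = np.random.default_rng(0)
data = rng.standard_normal(100_000)
data[::1000] += 50.0  # inject synthetic outliers

flags = robust_flags_vec(data)
```

The equivalence check matters as much as the speedup: in a shared pipeline, an optimized path only ships once it is shown to reproduce the reference output.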

About

I’m a PhD candidate at Université de Montréal working at the intersection of machine learning, probabilistic modeling, and scientific inference. I enjoy problems where the hard part is getting uncertainty right: distribution shift, non-Gaussian noise, selection effects, and high-dimensional posteriors.

What I’m looking for

  • Roles: ML Engineer, Applied Scientist, Data Scientist (applied ML).
  • Teams that care about evaluation, calibration, and real-world robustness.
  • Start: Summer/Fall 2026 (flexible).

Core strengths

  • Diffusion / score models for inference & generative modeling.
  • Simulation-based inference (density estimation, amortization, diagnostics).
  • Uncertainty quantification: calibration, posterior checks, failure-mode analysis.
  • Data + engineering: reproducible research, profiling/optimization, collaboration codebases.

Toolbox

  • ML: PyTorch, neural density estimation, diffusion/score models
  • Data: NumPy, SciPy, pandas, visualization & reporting
  • Software: Git, Linux, experiment tracking (as needed)

Highlights

  • 7-month international research internship at the Flatiron Institute (CCA), Cosmology × ML group.
  • NSERC CGS-D doctoral scholarship recipient (2023).
  • Reviewer for ICML/NeurIPS; member of the MILA PhD admissions committee.

Publications

A short, curated list; the full list is available on request or in my CV.

  • NeurIPS 2023: Score-Based Likelihood Characterization for Inverse Problems in the Presence of Non-Gaussian Noise.
  • ICML 2022: Population-Level Inference of Strong Gravitational Lenses with Neural Network-Based Selection Correction.
  • NeurIPS 2021: Simulation-Based Inference of Strong Gravitational Lensing Parameters.
  • MNRAS Letters 2023: Posterior Sampling of the Initial Conditions of the Universe using Score-Based Generative Models.
  • ApJ Letters 2023: Beyond Gaussian Noise: A Generalized Approach to Likelihood Analysis with non-Gaussian Noise.
  • JOSS 2024: Caustics: A Python Package for Accelerated Strong Gravitational Lensing Simulations.

Contact

Best way to reach me: Email or LinkedIn

Based in Montréal, Canada. Open to remote or relocation depending on role.