
Physics-Informed Neural Networks with Passivity Constraints

Enhancing inverse parameter identification in nonlinear dynamical systems using Physics-Informed Neural Networks (PINNs), thermodynamic passivity constraints, and bootstrap ensemble uncertainty quantification.

This repository presents a complete experimental pipeline for investigating stability, identifiability, and calibration behavior in PINN-based inverse modeling of the nonlinear pendulum.

Highlights

  • Inverse parameter recovery under sparse velocity observations (100 measurements)
  • Passivity constraints enforcing physically consistent energy dissipation
  • 25-model bootstrap ensembles for uncertainty quantification
  • 20,000 training epochs per model
  • Robustness grids and calibration analysis
  • Fully reproducible experimental pipeline

Research Questions Investigated

  • Does passivity regularization stabilize inverse parameter recovery?
  • Can bootstrap ensembles provide calibrated uncertainty estimates in PINNs?
  • How severe is damping parameter non-identifiability under sparse observations?

📊 Key Results

Parameter Estimation (Noisy Case, σ=0.01)

Method           g error   L error   c error   Trajectory RMSE   Energy Drift
Standard PINN    0.04%     22.9%     1032%     0.327             0.00199
Passivity PINN   2.12%     9.06%     696%      0.327             0.00058
Ensemble (25)    1.44%     13.3%     768%      0.329             0.00061

Key Findings

  • Passivity improves the conservative parameters (g, L) by stabilizing their estimates
  • Damping estimation is catastrophically ill-posed (700–2100% errors), a fundamental identifiability challenge under sparse observations
  • Ensemble UQ is severely miscalibrated (8.7% coverage vs. the 95% target)
  • Bias dominates over variance: systematic errors are the primary failure mode
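The passivity constraint penalizes any increase in the pendulum's total mechanical energy along the predicted trajectory. A minimal sketch of such a penalty (the actual implementation lives in src/models/losses.py and may differ; the function name and the squared-hinge form here are illustrative):

```python
import torch

def passivity_penalty(t, theta, theta_dot, g=9.81, L=1.0, m=1.0):
    """Hinge penalty on energy growth: a dissipative pendulum must satisfy dE/dt <= 0."""
    # Total mechanical energy: kinetic + potential
    E = 0.5 * m * L**2 * theta_dot**2 + m * g * L * (1.0 - torch.cos(theta))
    # dE/dt via autograd through the time input
    dE_dt = torch.autograd.grad(E.sum(), t, create_graph=True)[0]
    # Penalize only violations (energy increases); squaring keeps the loss smooth
    return torch.relu(dE_dt).pow(2).mean()

# Stand-ins for network predictions theta(t) and theta_dot(t)
t = torch.linspace(0.0, 10.0, 100, requires_grad=True).reshape(-1, 1)
theta = torch.sin(t)
theta_dot = torch.autograd.grad(theta.sum(), t, create_graph=True)[0]
loss = passivity_penalty(t, theta, theta_dot)
```

Because the penalty is one-sided, trajectories that dissipate energy are left untouched; only unphysical energy injection is punished.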

πŸ—‚οΈ Repository Structure

pinn_passivity_paper/
├── src/                          # Main source code
│   ├── data/
│   │   ├── generator.py          # Data generation (analytical + nonlinear solvers)
│   │   └── utils.py              # Time grids, noise, batching
│   ├── baseline/
│   │   ├── linear_small_angle.py # Analytical solutions
│   │   ├── nonlinear_rk.py       # RK4 and solve_ivp
│   │   └── plots_baseline.py     # Baseline plotting
│   ├── models/
│   │   ├── pinn_inverse.py       # PINN architecture with Fourier features
│   │   ├── losses.py             # Physics, IC, passivity losses
│   │   ├── train_inverse.py      # Training loop with TensorBoard
│   │   ├── dissipation_net.py    # NN for nonparametric damping
│   │   └── ensemble.py           # Bootstrap ensemble implementation
│   ├── analysis/
│   │   ├── metrics.py            # RMSE, energy drift, coverage, ECE
│   │   └── tables_figs.py        # Figure/table generators
│   ├── experiments/
│   │   ├── exp_baseline.py       # Baseline experiments
│   │   ├── exp_inverse_single.py # Single PINN runs
│   │   ├── exp_inverse_ens.py    # Ensemble experiments
│   │   └── grids.py              # Robustness study grids
│   ├── configs/
│   │   ├── default.yaml          # All hyperparameters
│   │   └── config_loader.py      # Config management
│   └── viz/
│       └── style.py              # Publication-quality plotting
├── scripts/
│   ├── run_all.sh                # Full pipeline orchestration
│   └── generate_final_study.py   # Automated analysis report
├── tests/
│   ├── test_data.py
│   ├── test_losses.py
│   ├── test_models.py
│   └── test_metrics.py
├── outputs/                      # Generated results
│   ├── baseline/
│   ├── inverse_single/
│   ├── ensemble/
│   ├── summaries/
│   ├── FINAL_STUDY.md
│   └── *.csv, *.png, *.json
├── report/
│   ├── COMPLETE_REPORT.tex       # Full LaTeX report (~40 pages)
│   ├── figures/
│   └── *.md
├── requirements.txt
├── pyproject.toml
├── Makefile
└── README.md

🚀 Quick Start

1. Install Dependencies

python3 -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
pip install -r requirements.txt

Required packages: torch, numpy, scipy, matplotlib, pandas, tqdm, tensorboard, pyyaml, pytest, ruff, black

2. Run the Full Pipeline

bash scripts/run_all.sh --full

# Or step-by-step:
make setup
make test
make run-baseline
make run-inverse
make run-ensemble

3. View Results

tensorboard --logdir outputs/
cat outputs/FINAL_STUDY.md
open outputs/analysis/*.png

🧪 Experiments

Baseline

python -m src.experiments.exp_baseline
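The baseline integrates the damped nonlinear pendulum numerically. A self-contained sketch using scipy's solve_ivp with the parameter values from default.yaml (the repository's nonlinear_rk.py may organize this differently; the function name here is illustrative):

```python
import numpy as np
from scipy.integrate import solve_ivp

def pendulum_rhs(t, y, g=9.81, L=1.0, c=0.05):
    """Damped nonlinear pendulum: theta'' + c*theta' + (g/L)*sin(theta) = 0."""
    theta, omega = y
    return [omega, -c * omega - (g / L) * np.sin(theta)]

# Integrate over the 10 s window with 1000 dense evaluation points
t_eval = np.linspace(0.0, 10.0, 1000)
sol = solve_ivp(pendulum_rhs, (0.0, 10.0), [np.pi / 4, 0.0],
                t_eval=t_eval, rtol=1e-9, atol=1e-9)
theta, omega = sol.y
```

Tight tolerances matter here because the reference trajectory is later used to score PINN reconstructions.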

Single Inverse PINN

python -m src.experiments.exp_inverse_single \
    --n-epochs 20000 \
    --n-sparse 100 \
    --noise 0.01 \
    --use-velocity-obs true
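The inverse problem treats g, L, and c as trainable parameters alongside the network weights. A minimal sketch of the physics residual (the repository's pinn_inverse.py adds Fourier features and more; the log-parameterization, which keeps the parameters positive, is an illustrative choice, not necessarily the repository's):

```python
import torch

# Small MLP predicting theta(t); the trainable physical parameters are stored
# in log-space with rough initial guesses (hypothetical values)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
log_params = torch.nn.Parameter(torch.log(torch.tensor([9.0, 1.2, 0.1])))  # g, L, c

def physics_residual(t):
    theta = net(t)
    dtheta = torch.autograd.grad(theta.sum(), t, create_graph=True)[0]
    ddtheta = torch.autograd.grad(dtheta.sum(), t, create_graph=True)[0]
    g, L, c = log_params.exp()
    # Damped nonlinear pendulum: theta'' + c*theta' + (g/L)*sin(theta) = 0
    return ddtheta + c * dtheta + (g / L) * torch.sin(theta)

t = torch.linspace(0.0, 10.0, 100, requires_grad=True).reshape(-1, 1)
physics_loss = physics_residual(t).pow(2).mean()
```

Minimizing this residual jointly with the data, IC, and passivity terms drives both the network and the parameter estimates.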

Ensemble UQ

python -m src.experiments.exp_inverse_ens \
    --n-models 25 \
    --n-epochs 20000 \
    --use-passivity true
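The coverage failure reported in Key Findings can be reproduced in miniature: when ensemble spread is small relative to a systematic bias, percentile intervals miss the truth. A toy illustration (the numbers below are synthetic, not the repository's results):

```python
import numpy as np

rng = np.random.default_rng(1337)
g_true = 9.81
# Hypothetical per-member estimates of g from a 25-model bootstrap ensemble:
# tight spread but a systematic bias, the regime the study reports
g_hat = rng.normal(loc=9.95, scale=0.02, size=25)
# Nominal 95% percentile interval from the ensemble
lo, hi = np.percentile(g_hat, [2.5, 97.5])
covered = bool(lo <= g_true <= hi)  # interval misses the true value
```

Because the bias (~0.14) dwarfs the ensemble standard deviation (~0.02), no amount of resampling widens the interval enough to cover the truth, which is why the study attributes miscalibration to bias rather than variance.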

Robustness Grid

python -m src.experiments.grids --full

🔬 Configuration

Edit src/configs/default.yaml:

physics:
  g: 9.81
  L: 1.0
  c: 0.05

time:
  t_start: 0.0
  t_end: 10.0
  n_points_dense: 1000
  n_points_sparse: 100

model:
  hidden_dims: [32, 32, 32]
  activation: tanh
  n_fourier_features: 6

training:
  n_epochs: 20000
  lr: 0.001
  optimizer: adam

loss_weights:
  data: 1.0
  velocity: 1.0
  physics: 10.0
  ic: 1.0
  passivity: 1.0

ensemble:
  n_models: 25
  bootstrap: true
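Loading this YAML with pyyaml is straightforward; the repository's config_loader.py presumably wraps something like the following (shown here on an inline fragment of the config above, since the wrapper's exact interface is not documented in this README):

```python
import yaml

# Inline fragment mirroring src/configs/default.yaml
raw = """
physics:
  g: 9.81
  L: 1.0
  c: 0.05
training:
  n_epochs: 20000
  lr: 0.001
"""
cfg = yaml.safe_load(raw)
g = cfg["physics"]["g"]
n_epochs = cfg["training"]["n_epochs"]
```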

🎯 Reproducibility

Seed: 1337 (fixed throughout). All experiments use fixed random seeds, deterministic algorithms, logged hyperparameters, and saved checkpoints.

export PYTHONHASHSEED=1337
bash scripts/run_all.sh --full
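Beyond PYTHONHASHSEED, reproducible runs require seeding every RNG in play. A sketch of the kind of helper the pipeline relies on (the repository's actual helper is not shown in this README, so the name and scope here are illustrative):

```python
import random

import numpy as np
import torch

def set_seed(seed: int = 1337) -> None:
    """Seed all RNGs used by the pipeline so repeated runs are bit-identical."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)

# Two runs from the same seed produce identical draws
set_seed()
a = torch.rand(3)
set_seed()
b = torch.rand(3)
```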

🧪 Testing

pytest                         # All tests
pytest tests/test_losses.py -v # Specific module
pytest --cov=src tests/        # With coverage

πŸ“ Report

cd report/
pdflatex COMPLETE_REPORT.tex
pdflatex COMPLETE_REPORT.tex

The report contains 40+ pages of analysis, 20+ figures, 15+ result tables, and full mathematical derivations. Compatible with Overleaf.


📚 References

  1. Raissi et al. (2019). Physics-Informed Neural Networks. Journal of Computational Physics.
  2. Karniadakis et al. (2021). Physics-Informed Machine Learning. Nature Reviews Physics.
  3. Wang et al. (2021). Gradient Pathologies in PINNs. SIAM Journal on Scientific Computing.
  4. Yang et al. (2021). Bayesian PINNs. Journal of Computational Physics.

📄 Academic Context

This work was completed as part of MA-515 at IIT Ropar.

Authors: Pranav Singh · Prashant Singh · Nishit Soni · Jaskaran Singh · Ishwar Sanjay · Harshdeep

@techreport{singh2024pinn,
  title={Enhancing PINN-Based Inverse Modeling of the Nonlinear Pendulum
         Using Passivity Constraints and Ensemble UQ},
  author={Singh, Pranav and Singh, Prashant and Soni, Nishit and
          Singh, Jaskaran and Sanjay, Ishwar and Harshdeep},
  institution={Indian Institute of Technology Ropar},
  year={2024},
  type={Course Project Report},
  number={MA-515}
}

Last Updated: November 2025
