Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PBNS: Physically Based Neural Simulator for Unsupervised Garment Pose Space Deformation

Hugo Bertiche, Meysam Madadi, Sergio Escalera

2020-12-21 · Physical Simulations
Paper · PDF · Code (official)

Abstract

We present a methodology to automatically obtain a Pose Space Deformation (PSD) basis for rigged garments through deep learning. Classical approaches rely on Physically Based Simulation (PBS) to animate clothes. These are general solutions that, given a sufficiently fine-grained discretization of space and time, can achieve highly realistic results. However, they are computationally expensive, and any scene modification requires re-simulation. Linear Blend Skinning (LBS) with PSD offers a lightweight alternative to PBS, though it needs huge volumes of data to learn a proper PSD. We propose using deep learning, formulated as an implicit PBS, to learn realistic cloth Pose Space Deformations without supervision in a constrained scenario: dressed humans. Furthermore, we show it is possible to train these models in an amount of time comparable to a PBS of a few sequences. To the best of our knowledge, we are the first to propose a neural simulator for cloth. While deep-learning-based approaches in this domain are becoming a trend, they are data-hungry models. Moreover, authors often propose complex formulations to better learn wrinkles from PBS data. Supervised learning leads to physically inconsistent predictions that require collision solving before use. Also, the dependency on PBS data limits the scalability of these solutions, while their formulations hinder their applicability and compatibility. By proposing an unsupervised methodology to learn PSD for LBS models (the 3D animation standard), we overcome both of these drawbacks. The results show cloth consistency in the animated garments and meaningful pose-dependent folds and wrinkles. Our solution is extremely efficient, handles multiple layers of cloth, allows unsupervised outfit resizing, and can be easily applied to any custom 3D avatar.
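The pipeline the abstract describes — LBS augmented with a pose-dependent corrective displacement — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, array shapes, and the assumption that the PSD corrective is added in the rest pose before skinning are ours.

```python
import numpy as np

def lbs_with_psd(template_verts, weights, bone_transforms, psd_basis, pose_coeffs):
    """Linear Blend Skinning with a Pose Space Deformation corrective (sketch).

    template_verts:  (V, 3) rest-pose garment vertices
    weights:         (V, J) skinning weights (each row sums to 1)
    bone_transforms: (J, 4, 4) per-joint rigid transforms for the current pose
    psd_basis:       (K, V, 3) learned PSD displacement basis
    pose_coeffs:     (K,) pose-dependent blend coefficients
    """
    # Apply the PSD corrective in the rest pose, so the learned folds and
    # wrinkles are then deformed rigidly along with the body by the skinning.
    corrected = template_verts + np.tensordot(pose_coeffs, psd_basis, axes=1)

    # Homogeneous coordinates: (V, 4)
    homo = np.concatenate([corrected, np.ones((corrected.shape[0], 1))], axis=1)

    # Blend the per-joint transforms by the skinning weights: (V, 4, 4)
    blended = np.einsum("vj,jab->vab", weights, bone_transforms)

    # Transform each vertex by its own blended matrix
    posed = np.einsum("vab,vb->va", blended, homo)
    return posed[:, :3]
```

In PBNS-style training, the pose coefficients and PSD basis would come from a network, and the loss would be the implicit physics terms (stretching, bending, collision) rather than supervised PBS targets.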

Results

Task                  Dataset   Metric             Value  Model
Physical Simulations  4D-DRESS  Chamfer (cm)       1.885  PBNS_Lower
Physical Simulations  4D-DRESS  Stretching Energy  0.107  PBNS_Lower
Physical Simulations  4D-DRESS  Chamfer (cm)       2.687  PBNS_Upper
Physical Simulations  4D-DRESS  Stretching Energy  0.04   PBNS_Upper
Physical Simulations  4D-DRESS  Chamfer (cm)       4.859  PBNS_Outer
Physical Simulations  4D-DRESS  Stretching Energy  0.107  PBNS_Outer
Physical Simulations  4D-DRESS  Chamfer (cm)       4.869  PBNS_Dress
Physical Simulations  4D-DRESS  Stretching Energy  0.643  PBNS_Dress
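The Chamfer metric in the table measures how closely predicted garment vertices match the reference geometry. A common symmetric formulation is sketched below; the 4D-DRESS benchmark's exact convention (one-sided vs. symmetric, mean vs. sum, scaling to centimeters) is an assumption here and may differ.

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (N, 3) and b (M, 3):
    mean nearest-neighbor distance from a to b plus from b to a."""
    # Pairwise Euclidean distances: (N, M). Fine for small clouds;
    # large point sets would use a KD-tree instead.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean() + d.min(axis=0).mean()
```

Lower is better: identical point sets score 0, and any unmatched geometry in either direction increases the value.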

Related Papers

Dynamic Diffusion Schrödinger Bridge in Astrophysical Observational Inversions (2025-06-09)
AMR-Transformer: Enabling Efficient Long-range Interaction for Complex Neural Fluid Simulation (2025-03-13)
DecoupledGaussian: Object-Scene Decoupling for Physics-Based Interaction (2025-03-07)
Erwin: A Tree-based Hierarchical Transformer for Large-scale Physical Systems (2025-02-24)
Grounding Creativity in Physics: A Brief Survey of Physical Priors in AIGC (2025-02-10)
Transfer learning in Scalable Graph Neural Network for Improved Physical Simulation (2025-02-07)
Multi-Physics Simulations via Coupled Fourier Neural Operator (2025-01-28)
PhysicsGen: Can Generative Models Learn from Images to Predict Complex Physical Relations? (2025-01-01)