
Denoising Diffusion Probabilistic Models for Styled Walking Synthesis

Findlay, Edmund; Zhang, Haozheng; Chang, Ziyi; Shum, Hubert P.H.


Authors

Edmund Findlay

Haozheng Zhang haozheng.zhang@durham.ac.uk
PGR Student Doctor of Philosophy

Ziyi Chang ziyi.chang@durham.ac.uk
PGR Student Doctor of Philosophy



Abstract

Generating realistic motions for digital humans is time-consuming for many graphics applications. Data-driven motion synthesis approaches have made solid progress in recent years through deep generative models. These methods produce high-quality motions but typically lack motion style diversity. For the first time, we propose a framework using the denoising diffusion probabilistic model (DDPM) to synthesize styled human motions, integrating two tasks into one pipeline with increased style diversity compared with traditional motion synthesis methods. Experimental results show that our system can generate high-quality and diverse walking motions.
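For readers unfamiliar with DDPMs, the core mechanism is a fixed forward process that gradually adds Gaussian noise to data (here, a motion pose) and a learned reverse process that denoises it. The sketch below illustrates only the standard forward (noising) step with a linear variance schedule; it is not the authors' implementation, and the flat pose vector of joint coordinates is an assumed representation for illustration.

```python
import numpy as np

def make_schedule(T=1000, beta_start=1e-4, beta_end=0.02):
    """Standard linear beta schedule; alpha_bar_t is the cumulative
    product of (1 - beta) used in the closed-form forward process."""
    betas = np.linspace(beta_start, beta_end, T)
    alpha_bars = np.cumprod(1.0 - betas)
    return betas, alpha_bars

def q_sample(x0, t, alpha_bars, rng):
    """Sample x_t ~ q(x_t | x_0) = N(sqrt(a_bar_t) * x0, (1 - a_bar_t) * I)."""
    noise = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * noise
    return x_t, noise

rng = np.random.default_rng(0)
betas, alpha_bars = make_schedule()
x0 = rng.standard_normal(63)  # hypothetical pose: 21 joints x 3 coordinates
x_T, _ = q_sample(x0, 999, alpha_bars, rng)  # near-pure noise at the final step
```

A trained DDPM would invert this process, starting from Gaussian noise and iteratively predicting and removing the added noise; conditioning that reverse process on a style label is what enables styled synthesis.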

Citation

Findlay, E., Zhang, H., Chang, Z., & Shum, H. P. (2022, November). Denoising Diffusion Probabilistic Models for Styled Walking Synthesis. Presented at MIG 2022: The 15th Annual ACM SIGGRAPH Conference on Motion, Interaction and Games, Guanajuato, Mexico

Presentation Conference Type Conference Paper (published)
Conference Name MIG 2022: The 15th Annual ACM SIGGRAPH Conference on Motion, Interaction and Games
Start Date Nov 3, 2022
End Date Nov 5, 2022
Acceptance Date Sep 16, 2022
Publication Date 2022
Deposit Date Oct 4, 2022
Publicly Available Date Oct 5, 2022
Publisher Association for Computing Machinery (ACM)
ISBN 9781450398886
DOI https://doi.org/10.1145/3561975
Public URL https://durham-repository.worktribe.com/output/1135582
Publisher URL https://dl.acm.org/conference/mig
