Speech2AffectiveGestures: Synthesizing Co-Speech Gestures with Generative Adversarial Affective Expression Learning

31 July 2021
Uttaran Bhattacharya
Elizabeth Childs
Nicholas Rewkowski
Tianyi Zhou
SLR
GAN

Papers citing "Speech2AffectiveGestures: Synthesizing Co-Speech Gestures with Generative Adversarial Affective Expression Learning"

1 / 51 papers shown
BEAT: A Large-Scale Semantic and Emotional Multi-Modal Dataset for Conversational Gestures Synthesis
Haiyang Liu
Zihao Zhu
Naoya Iwamoto
Yichen Peng
Zhengqing Li
You Zhou
E. Bozkurt
Bo Zheng
SLR
CVBM
31
138
0
10 Mar 2022