Speech2Properties2Gestures: Gesture-Property Prediction as a Tool for Generating Representational Gestures from Speech

28 June 2021
Taras Kucherenko, Rajmund Nagy, Patrik Jonell, Michael Neff, Hedvig Kjellström, G. Henter
ArXiv (abs) · PDF · HTML

Papers citing "Speech2Properties2Gestures: Gesture-Property Prediction as a Tool for Generating Representational Gestures from Speech"

10 / 10 papers shown
Multimodal analysis of the predictability of hand-gesture properties
Taras Kucherenko, Rajmund Nagy, Michael Neff, Hedvig Kjellström, G. Henter
12 Aug 2021 · 45 · 22 · 0
A large, crowdsourced evaluation of gesture generation systems on common data: The GENEA Challenge 2020
Taras Kucherenko, Patrik Jonell, Youngwoo Yoon, Pieter Wolfert, G. Henter
23 Feb 2021 · 61 · 75 · 0
Understanding the Predictability of Gesture Parameters from Speech and their Perceptual Importance
Ylva Ferstl, Michael Neff, R. Mcdonnell
02 Oct 2020 · SLR · 31 · 16 · 0
Speech Gesture Generation from the Trimodal Context of Text, Audio, and Speaker Identity
Youngwoo Yoon, Bok Cha, Joo-Haeng Lee, Minsu Jang, Jaeyeon Lee, Jaehong Kim, Geehyuk Lee
04 Sep 2020 · 51 · 283 · 0
Sequence-to-Sequence Predictive Model: From Prosody To Communicative Gestures
Fajrian Yunus, Chloé Clavel, Catherine Pelachaud
17 Aug 2020 · SLR · 25 · 16 · 0
Moving fast and slow: Analysis of representations and post-processing in speech-driven automatic gesture generation
Taras Kucherenko, Dai Hasegawa, Naoshi Kaneko, G. Henter, Hedvig Kjellström
16 Jul 2020 · 70 · 41 · 0
Gesticulator: A framework for semantically-aware speech-driven gesture generation
Taras Kucherenko, Patrik Jonell, S. V. Waveren, G. Henter, Simon Alexanderson, Iolanda Leite, Hedvig Kjellström
25 Jan 2020 · SLR · 54 · 180 · 0
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf
02 Oct 2019 · 255 · 7,547 · 0
MoGlow: Probabilistic and controllable motion synthesis using normalising flows
G. Henter, Simon Alexanderson, Jonas Beskow
16 May 2019 · 68 · 98 · 0
Robots Learn Social Skills: End-to-End Learning of Co-Speech Gesture Generation for Humanoid Robots
Youngwoo Yoon, Woo-Ri Ko, Minsu Jang, Jaeyeon Lee, Jaehong Kim, Geehyuk Lee
30 Oct 2018 · SLR · 53 · 231 · 0