Dynamically Context-Sensitive Time-Decay Attention for Dialogue Modeling
Shang-Yu Su, Pei-Chieh Yuan, Yun-Nung Chen
arXiv: 1809.01557
5 September 2018

Papers citing "Dynamically Context-Sensitive Time-Decay Attention for Dialogue Modeling"

Dual Supervised Learning for Natural Language Understanding and Generation
Shang-Yu Su, Chao-Wei Huang, Yun-Nung Chen
15 May 2019

Decay-Function-Free Time-Aware Attention to Context and Speaker Indicator for Spoken Language Understanding
Jonggu Kim, Jong-Hyeok Lee
20 Mar 2019