ResearchTrend.AI
© 2025 ResearchTrend.AI. All rights reserved.

Gated Word-Character Recurrent Language Model
Yasumasa Onoe, Kyunghyun Cho
6 June 2016 · arXiv:1606.01700
Topics: RALM, KELM

Papers citing "Gated Word-Character Recurrent Language Model"

15 papers shown:
  1. An Overview on Language Models: Recent Developments and Outlook (10 Mar 2023). Chengwei Wei, Yun Cheng Wang, Bin Wang, C.-C. Jay Kuo.
  2. LiteLSTM Architecture Based on Weights Sharing for Recurrent Neural Networks (12 Jan 2023). Nelly Elsayed, Zag ElSayed, Anthony Maida.
  3. LiteLSTM Architecture for Deep Recurrent Neural Networks (27 Jan 2022). Nelly Elsayed, Zag ElSayed, Anthony Maida.
  4. Recurrent Neural Network from Adder's Perspective: Carry-lookahead RNN (22 Jun 2021). Haowei Jiang, Fei-wei Qin, Jin Cao, Yong Peng, Yanli Shao. Topics: LRM, ODL.
  5. COCO_TS Dataset: Pixel-level Annotations Based on Weak Supervision for Scene Text Segmentation (01 Apr 2019). Andrea Zugarini, S. Melacci, Monica Bianchini, Marco Maggini.
  6. Trellis Networks for Sequence Modeling (15 Oct 2018). Shaojie Bai, J. Zico Kolter, V. Koltun.
  7. Character-Aware Decoder for Translation into Morphologically Rich Languages (06 Sep 2018). Adithya Renduchintala, Pamela Shapiro, Kevin Duh, Philipp Koehn. Topics: AI4CE.
  8. Effective Character-augmented Word Embedding for Machine Reading Comprehension (07 Aug 2018). ZhuoSheng Zhang, Yafang Huang, Peng Fei Zhu, Hai Zhao. Topics: RALM.
  9. Subword-augmented Embedding for Cloze Reading Comprehension (24 Jun 2018). ZhuoSheng Zhang, Yafang Huang, Zhao Hai. Topics: RALM.
  10. Numeracy for Language Models: Evaluating and Improving their Ability to Predict Numbers (21 May 2018). Georgios P. Spithourakis, Sebastian Riedel.
  11. Dynamic Meta-Embeddings for Improved Sentence Representations (21 Apr 2018). Douwe Kiela, Changhan Wang, Kyunghyun Cho. Topics: AI4TS.
  12. An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling (04 Mar 2018). Shaojie Bai, J. Zico Kolter, V. Koltun. Topics: DRL.
  13. Attending to Characters in Neural Sequence Labeling Models (14 Nov 2016). Marek Rei, Gamal K. O. Crichton, S. Pyysalo.
  14. Words or Characters? Fine-grained Gating for Reading Comprehension (06 Nov 2016). Zhilin Yang, Bhuwan Dhingra, Ye Yuan, Junjie Hu, William W. Cohen, Ruslan Salakhutdinov. Topics: AI4CE.
  15. Using the Output Embedding to Improve Language Models (20 Aug 2016). Ofir Press, Lior Wolf.