Power Law Graph Transformer for Machine Translation and Representation Learning

27 June 2021 · B. Gokden · arXiv:2107.02039

Papers citing "Power Law Graph Transformer for Machine Translation and Representation Learning" (5 of 5 papers shown):

  • PLDR-LLMs Learn A Generalizable Tensor Operator That Can Replace Its Own Deep Neural Net At Inference
    Burc Gokden · 19 Feb 2025 · 0 citations
  • PLDR-LLM: Large Language Model from Power Law Decoder Representations
    Burc Gokden · 22 Oct 2024 · 1 citation
  • Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation
    Yonghui Wu, M. Schuster, Zhehuai Chen, Quoc V. Le, Mohammad Norouzi, ..., Alex Rudnick, Oriol Vinyals, G. Corrado, Macduff Hughes, J. Dean · 26 Sep 2016 · 6,750 citations
  • Effective Approaches to Attention-based Neural Machine Translation
    Thang Luong, Hieu H. Pham, Christopher D. Manning · 17 Aug 2015 · 7,930 citations
  • Efficient Estimation of Word Representations in Vector Space
    Tomas Mikolov, Kai Chen, G. Corrado, J. Dean · 16 Jan 2013 · 31,297 citations