Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding

Papers citing "Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding"

Layer Normalization (21 Jul 2016)