
Natural Language Processing with Small Feed-Forward Networks

1 August 2017
Jan A. Botha, Emily Pitler, Ji Ma, A. Bakalov, Alexandru Salcianu, David J. Weiss, Ryan T. McDonald, Slav Petrov
Abstract

We show that small and shallow feed-forward neural networks can achieve near state-of-the-art results on a range of unstructured and structured language processing tasks while being considerably cheaper in memory and computational requirements than deep recurrent models. Motivated by resource-constrained environments like mobile phones, we showcase simple techniques for obtaining such small neural network models, and investigate different tradeoffs when deciding how to allocate a small memory budget.
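To make the idea concrete, here is a minimal sketch (not the paper's actual implementation) of the kind of small, shallow feed-forward model the abstract describes: embeddings for a short token window are concatenated, passed through a single ReLU hidden layer, and classified with a softmax. All sizes and variable names below are illustrative assumptions chosen to keep the memory footprint tiny.

```python
import math
import random

# Illustrative sketch of a small feed-forward NLP classifier.
# Sizes are assumptions, deliberately small for a tight memory budget.
random.seed(0)

VOCAB = 50      # tiny vocabulary (assumption)
EMB_DIM = 8     # small embedding dimension
HIDDEN = 16     # one shallow hidden layer
CLASSES = 3     # e.g. a small tag set (assumption)
WINDOW = 3      # tokens of context fed to the network

def rand_matrix(rows, cols):
    """Random small-valued weight matrix (stand-in for trained weights)."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)]
            for _ in range(rows)]

embeddings = rand_matrix(VOCAB, EMB_DIM)
W1 = rand_matrix(EMB_DIM * WINDOW, HIDDEN)
b1 = [0.0] * HIDDEN
W2 = rand_matrix(HIDDEN, CLASSES)
b2 = [0.0] * CLASSES

def forward(window_ids):
    """Forward pass: concatenate window embeddings -> ReLU -> softmax."""
    x = [v for tid in window_ids for v in embeddings[tid]]
    h = [max(0.0, sum(x[i] * W1[i][j] for i in range(len(x))) + b1[j])
         for j in range(HIDDEN)]
    z = [sum(h[i] * W2[i][k] for i in range(HIDDEN)) + b2[k]
         for k in range(CLASSES)]
    m = max(z)  # subtract max for numerical stability
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

probs = forward([3, 17, 42])  # predict a class for one 3-token window
```

Even this toy version shows why the approach is cheap: the parameter count is dominated by the embedding table (`VOCAB * EMB_DIM` floats), and inference is a handful of small matrix-vector products with no recurrence over the sequence.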
