
Optimal Approximation -- Smoothness Tradeoffs for Soft-Max Functions

Alessandro Epasto, Mohammad Mahdian, Vahab Mirrokni, Manolis Zampetakis
22 October 2020 · arXiv:2010.11450

Papers citing "Optimal Approximation -- Smoothness Tradeoffs for Soft-Max Functions" (6 papers shown)
On Controllable Sparse Alternatives to Softmax
Anirban Laha, Saneem A. Chemmengath, Priyanka Agrawal, Mitesh M. Khapra, Karthik Sankaranarayanan, H. G. Ramaswamy
29 Oct 2018

Learning Multi-item Auctions with (or without) Samples
Yang Cai, C. Daskalakis
01 Sep 2017

From Softmax to Sparsemax: A Sparse Model of Attention and Multi-Label Classification
André F. T. Martins, Ramón Fernández Astudillo
05 Feb 2016

An Exploration of Softmax Alternatives Belonging to the Spherical Loss Family
A. D. Brébisson, Pascal Vincent
16 Nov 2015

Efficient Exact Gradient Update for training Deep Networks with Very Large Sparse Targets
Pascal Vincent, A. D. Brébisson, Xavier Bouthillier
22 Dec 2014

Distributed Representations of Words and Phrases and their Compositionality
Tomas Mikolov, Ilya Sutskever, Kai Chen, G. Corrado, J. Dean
16 Oct 2013