Efficient and Robust Jet Tagging at the LHC with Knowledge Distillation
Ryan Liu, A. Gandrakota, J. Ngadiuba, M. Spiropulu, J. Vlimant
arXiv:2311.14160 · 23 November 2023

Papers citing "Efficient and Robust Jet Tagging at the LHC with Knowledge Distillation" (8 papers)

Lorentz Group Equivariant Neural Network for Particle Physics
A. Bogatskiy, Brandon M. Anderson, Jan T. Offermann, M. Roussi, David W. Miller, Risi Kondor
Tags: AI4CE · Citations: 141 · 08 Jun 2020

Transferring Inductive Biases through Knowledge Distillation
Samira Abnar, Mostafa Dehghani, Willem H. Zuidema
Citations: 59 · 31 May 2020

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
Victor Sanh, Lysandre Debut, Julien Chaumond, Thomas Wolf
Citations: 7,547 · 02 Oct 2019

TinyBERT: Distilling BERT for Natural Language Understanding
Xiaoqi Jiao, Yichun Yin, Lifeng Shang, Xin Jiang, Xiao Chen, Linlin Li, F. Wang, Qun Liu
Tags: VLM · Citations: 1,869 · 23 Sep 2019

Energy Flow Networks: Deep Sets for Particle Jets
Patrick T. Komiske, E. Metodiev, Jesse Thaler
Tags: PINN, 3DPC · Citations: 258 · 11 Oct 2018

Graph Attention Networks
Petar Velickovic, Guillem Cucurull, Arantxa Casanova, Adriana Romero, Pietro Liò, Yoshua Bengio
Tags: GNN · Citations: 20,225 · 30 Oct 2017

Empirical Evaluation of Rectified Activations in Convolutional Network
Bing Xu, Naiyan Wang, Tianqi Chen, Mu Li
Citations: 2,913 · 05 May 2015

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Sergey Ioffe, Christian Szegedy
Tags: OOD · Citations: 43,341 · 11 Feb 2015