ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

More From Less: Self-Supervised Knowledge Distillation for Routine Histopathology Data (arXiv: 2303.10656)

19 March 2023
Lucas Farndale, R. Insall, Ke Yuan

Papers citing "More From Less: Self-Supervised Knowledge Distillation for Routine Histopathology Data"

5 papers:
Variance Covariance Regularization Enforces Pairwise Independence in Self-Supervised Representations
Grégoire Mialon, Randall Balestriero, Yann LeCun
29 Sep 2022
Resolution-Based Distillation for Efficient Histology Image Classification
Joseph DiPalma, A. Suriawinata, L. Tafe, Lorenzo Torresani, Saeed Hassanpour
11 Jan 2021
CrossTransformers: spatially-aware few-shot transfer [ViT]
Carl Doersch, Ankush Gupta, Andrew Zisserman
22 Jul 2020
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications [3DH]
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017
Xception: Deep Learning with Depthwise Separable Convolutions [MDE, BDL, PINN]
François Chollet
07 Oct 2016