Arch-Net: Model Distillation for Architecture Agnostic Model Deployment

1 November 2021
Weixin Xu, Zipeng Feng, Shuangkang Fang, Song Yuan, Yi Yang, Shuchang Zhou
MQ

Papers citing "Arch-Net: Model Distillation for Architecture Agnostic Model Deployment"

4 / 4 papers shown
One is All: Bridging the Gap Between Neural Radiance Fields Architectures with Progressive Volume Distillation
Shuangkang Fang, Weixin Xu, Heng Wang, Yi Yang, Yu-feng Wang, Shuchang Zhou
34 · 15 · 0 · 29 Nov 2022

Diversifying Sample Generation for Accurate Data-Free Quantization
Xiangguo Zhang, Haotong Qin, Yifu Ding, Ruihao Gong, Qing Yan, Renshuai Tao, Yuhang Li, F. Yu, Xianglong Liu
MQ
56 · 94 · 0 · 01 Mar 2021

RepVGG: Making VGG-style ConvNets Great Again
Xiaohan Ding, Xinming Zhang, Ningning Ma, Jungong Han, Guiguang Ding, Jian Sun
136 · 1,549 · 0 · 11 Jan 2021

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
3DH
950 · 20,572 · 0 · 17 Apr 2017