Barlow Twins: Self-Supervised Learning via Redundancy Reduction

4 March 2021
Jure Zbontar
Li Jing
Ishan Misra
Yann LeCun
Stéphane Deny
    SSL
ArXiv · PDF · HTML
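
The paper's title names its core idea: reduce redundancy by driving the cross-correlation matrix between the embeddings of two augmented views toward the identity. As a quick reference while browsing the citing papers below, here is a minimal PyTorch sketch of that objective; the function name, the exact normalization, and the default lambda are illustrative assumptions, not the authors' released code.

import torch


def barlow_twins_loss(z_a: torch.Tensor, z_b: torch.Tensor, lambd: float = 5e-3) -> torch.Tensor:
    """z_a, z_b: (batch, dim) embeddings of two augmented views of the same batch."""
    n = z_a.size(0)

    # Standardize each embedding dimension over the batch (zero mean, unit std).
    z_a = (z_a - z_a.mean(dim=0)) / z_a.std(dim=0)
    z_b = (z_b - z_b.mean(dim=0)) / z_b.std(dim=0)

    # Empirical cross-correlation matrix between the two views, shape (dim, dim).
    c = (z_a.T @ z_b) / n

    # Invariance term: push diagonal entries toward 1.
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    # Redundancy-reduction term: push off-diagonal entries toward 0.
    off_diag = c.pow(2).sum() - torch.diagonal(c).pow(2).sum()

    return on_diag + lambd * off_diag

In practice the two inputs come from a shared encoder and projector applied to two random augmentations of the same images, and the loss is minimized with standard backpropagation; no negative pairs or momentum encoder are required.
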

Papers citing "Barlow Twins: Self-Supervised Learning via Redundancy Reduction"

50 / 1,364 papers shown
SelfCF: A Simple Framework for Self-supervised Collaborative Filtering
Xin Zhou
Aixin Sun
Yong-jin Liu
Jie M. Zhang
C. Miao
SSL
25
76
0
07 Jul 2021
Do Different Tracking Tasks Require Different Appearance Models?
Zhongdao Wang
Hengshuang Zhao
Yali Li
Shengjin Wang
Philip H. S. Torr
Luca Bertinetto
34
81
0
05 Jul 2021
A data-centric approach for improving ambiguous labels with combined semi-supervised classification and clustering
Lars Schmarje
M. Santarossa
Simon-Martin Schroder
Claudius Zelenka
R. Kiko
J. Stracke
N. Volkmann
Reinhard Koch
30
10
0
30 Jun 2021
Leveraging Hidden Structure in Self-Supervised Learning
Emanuele Sansone
SSL
20
0
0
30 Jun 2021
Exploring Localization for Self-supervised Fine-grained Contrastive Learning
Di Wu
Siyuan Li
Z. Zang
Stan Z. Li
SSL
24
8
0
30 Jun 2021
SCARF: Self-Supervised Contrastive Learning using Random Feature Corruption
Dara Bahri
Heinrich Jiang
Yi Tay
Donald Metzler
SSL
17
163
0
29 Jun 2021
Intrinsically Motivated Self-supervised Learning in Reinforcement Learning
Yue Zhao
Chenzhuang Du
Hang Zhao
Tiejun Li
SSL
11
4
0
26 Jun 2021
From Canonical Correlation Analysis to Self-supervised Graph Neural Networks
Hengrui Zhang
Qitian Wu
Junchi Yan
David Wipf
Philip S. Yu
SSL
22
210
0
23 Jun 2021
Can contrastive learning avoid shortcut solutions?
Joshua Robinson
Li Sun
Ke Yu
Kayhan Batmanghelich
Stefanie Jegelka
S. Sra
SSL
19
141
0
21 Jun 2021
Lossy Compression for Lossless Prediction
Yann Dubois
Benjamin Bloem-Reddy
Karen Ullrich
Chris J. Maddison
18
59
0
21 Jun 2021
Self-Supervised Learning with Kernel Dependence Maximization
Yazhe Li
Roman Pogodin
Danica J. Sutherland
A. Gretton
SSL
19
77
0
15 Jun 2021
Understanding Latent Correlation-Based Multiview Learning and Self-Supervision: An Identifiability Perspective
Qinjie Lyu
Xiao Fu
Weiran Wang
Songtao Lu
SSL
15
29
0
14 Jun 2021
I Don't Need u: Identifiable Non-Linear ICA Without Side Information
M. Willetts
Brooks Paige
CML
OOD
8
22
0
09 Jun 2021
Self-Supervised Learning with Data Augmentations Provably Isolates Content from Style
Julius von Kügelgen
Yash Sharma
Luigi Gresele
Wieland Brendel
Bernhard Schölkopf
M. Besserve
Francesco Locatello
17
302
0
08 Jun 2021
Interpretable agent communication from scratch (with a generic visual processor emerging on the side)
Roberto Dessì
Eugene Kharitonov
Marco Baroni
27
27
0
08 Jun 2021
Provable Guarantees for Self-Supervised Deep Learning with Spectral Contrastive Loss
Jeff Z. HaoChen
Colin Wei
Adrien Gaidon
Tengyu Ma
SSL
11
301
0
08 Jun 2021
Efficient Training of Visual Transformers with Small Datasets
Yahui Liu
E. Sangineto
Wei Bi
N. Sebe
Bruno Lepri
Marco De Nadai
ViT
28
164
0
07 Jun 2021
Large-scale Unsupervised Semantic Segmentation
Shangqi Gao
Zhong-Yu Li
Ming-Hsuan Yang
Ming-Ming Cheng
Junwei Han
Philip H. S. Torr
UQCV
38
84
0
06 Jun 2021
Aligning Pretraining for Detection via Object-Level Contrastive Learning
Fangyun Wei
Yue Gao
Zhirong Wu
Han Hu
Stephen Lin
ObjD
14
144
0
04 Jun 2021
Graph Barlow Twins: A self-supervised representation learning framework for graphs
Piotr Bielak
Tomasz Kajdanowicz
Nitesh V. Chawla
SSL
11
133
0
04 Jun 2021
Learning to Draw: Emergent Communication through Sketching
Daniela Mihai
Jonathon S. Hare
29
25
0
03 Jun 2021
Connecting Language and Vision for Natural Language-Based Vehicle Retrieval
Shuai Bai
Zhedong Zheng
Xiaohan Wang
Junyang Lin
Zhu Zhang
Chang Zhou
Yi Yang
Hongxia Yang
16
27
0
31 May 2021
Self-supervised Detransformation Autoencoder for Representation Learning in Open Set Recognition
Jingyun Jia
P. Chan
ViT
6
5
0
28 May 2021
GraphVICRegHSIC: Towards improved self-supervised representation learning for graphs with a hybrid loss function
Sayan Nag
SSL
22
0
0
25 May 2021
Unsupervised Visual Representation Learning by Online Constrained K-Means
Qi Qian
Yuanhong Xu
Juhua Hu
Hao Li
R. L. Jin
CML
SSL
10
33
0
24 May 2021
Do We Really Need to Learn Representations from In-domain Data for Outlier Detection?
Zhisheng Xiao
Qing Yan
Y. Amit
OOD
UQCV
12
18
0
19 May 2021
Self-Supervised Learning for Fine-Grained Visual Categorization
Muhammad Maaz
H. Rasheed
D. Gaddam
17
2
0
18 May 2021
Divide and Contrast: Self-supervised Learning from Uncurated Data
Yonglong Tian
Olivier J. Hénaff
Aaron van den Oord
SSL
51
96
0
17 May 2021
Self-supervised Learning on Graphs: Contrastive, Generative, or Predictive
Lirong Wu
Haitao Lin
Zhangyang Gao
Cheng Tan
Stan Z. Li
SSL
26
241
0
16 May 2021
Sparsity-Probe: Analysis tool for Deep Learning Models
Ido Ben-Shaul
S. Dekel
16
4
0
14 May 2021
Using Self-Supervised Auxiliary Tasks to Improve Fine-Grained Facial Representation
Mahdi Pourmirzaei
G. Montazer
Farzaneh Esmaili
CVBM
16
26
0
13 May 2021
VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning
Adrien Bardes
Jean Ponce
Yann LeCun
SSL
DML
53
900
0
11 May 2021
DEEMD: Drug Efficacy Estimation against SARS-CoV-2 based on cell Morphology with Deep multiple instance learning
M. Saberian
Kathleen P. Moriarty
A. Olmstead
Christian Hallgrimson
François Jean
I. Nabi
Maxwell W. Libbrecht
Ghassan Hamarneh
23
12
0
10 May 2021
Contrastive Attraction and Contrastive Repulsion for Representation Learning
Huangjie Zheng
Xu Chen
Jiangchao Yao
Hongxia Yang
Chunyuan Li
Ya-Qin Zhang
Hao Zhang
Ivor Tsang
Jingren Zhou
Mingyuan Zhou
SSL
36
12
0
08 May 2021
On Feature Decorrelation in Self-Supervised Learning
Tianyu Hua
Wenxiao Wang
Zihui Xue
Sucheng Ren
Yue Wang
Hang Zhao
SSL
OOD
119
187
0
02 May 2021
Hyperspherically Regularized Networks for Self-Supervision
A. Durrant
Georgios Leontidis
SSL
70
7
0
29 Apr 2021
Emerging Properties in Self-Supervised Vision Transformers
Mathilde Caron
Hugo Touvron
Ishan Misra
Hervé Jégou
Julien Mairal
Piotr Bojanowski
Armand Joulin
314
5,775
0
29 Apr 2021
A Note on Connecting Barlow Twins with Negative-Sample-Free Contrastive Learning
Yao-Hung Hubert Tsai
Shaojie Bai
Louis-Philippe Morency
Ruslan Salakhutdinov
SSL
10
38
0
28 Apr 2021
Semi-Supervised Semantic Segmentation with Pixel-Level Contrastive Learning from a Class-wise Memory Bank
Inigo Alonso
Alberto Sabater
David Ferstl
Luis Montesano
Ana C. Murillo
SSL
CLL
121
203
0
27 Apr 2021
How Well Does Self-Supervised Pre-Training Perform with Streaming Data?
Dapeng Hu
Shipeng Yan
Qizhengqiu Lu
Lanqing Hong
Hailin Hu
Yifan Zhang
Zhenguo Li
Xinchao Wang
Jiashi Feng
45
28
0
25 Apr 2021
DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
Yuting Gao
Jia-Xin Zhuang
Xiaowei Guo
Hao Cheng
Xing Sun
Ke Li
Feiyue Huang
31
40
0
19 Apr 2021
Exploring Visual Engagement Signals for Representation Learning
Menglin Jia
Zuxuan Wu
A. Reiter
Claire Cardie
Serge J. Belongie
Ser-Nam Lim
19
13
0
15 Apr 2021
Contrastive Learning of Global-Local Video Representations
Shuang Ma
Zhaoyang Zeng
Daniel J. McDuff
Yale Song
SSL
22
7
0
07 Apr 2021
Orthogonal Projection Loss
Kanchana Ranasinghe
Muzammal Naseer
Munawar Hayat
Salman Khan
F. Khan
VLM
16
67
0
25 Mar 2021
Self-Supervised Training Enhances Online Continual Learning
Jhair Gallardo
Tyler L. Hayes
Christopher Kanan
CLL
25
68
0
25 Mar 2021
Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels
Evgenii Zheltonozhskii
Chaim Baskin
A. Mendelson
A. Bronstein
Or Litany
SSL
22
92
0
25 Mar 2021
Characterizing and Improving the Robustness of Self-Supervised Learning through Background Augmentations
Chaitanya K. Ryali
D. Schwab
Ari S. Morcos
SSL
26
9
0
23 Mar 2021
Self-Supervised Classification Network
Elad Amrani
Leonid Karlinsky
A. Bronstein
SSL
18
28
0
19 Mar 2021
Graph Self-Supervised Learning: A Survey
Yixin Liu
Ming Jin
Shirui Pan
Chuan Zhou
Yu Zheng
Feng Xia
Philip S. Yu
SSL
19
542
0
27 Feb 2021
A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives
Nils Rethmeier
Isabelle Augenstein
SSL
VLM
87
90
0
25 Feb 2021