Graph Neural Networks for Multimodal Single-Cell Data Integration

3 March 2022
Hongzhi Wen
Jiayuan Ding
Wei Jin
Yiqi Wang
Yuying Xie
Jiliang Tang
Abstract

Recent advances in multimodal single-cell technologies have enabled the simultaneous acquisition of multiple omics data from the same cell, providing deeper insights into cellular states and dynamics. However, it is challenging to learn joint representations from the multimodal data, to model the relationships between modalities, and, more importantly, to incorporate the vast amount of single-modality datasets into downstream analyses. To address these challenges and thereby facilitate multimodal single-cell data analysis, three key tasks have been introduced: modality prediction, modality matching, and joint embedding. In this work, we present a general Graph Neural Network framework, scMoGNN, to tackle these three tasks, and show that scMoGNN achieves superior results on all of them compared with state-of-the-art and conventional approaches. Our method is an official winner in the overall ranking of the modality prediction task at the NeurIPS 2021 Competition, and all implementations of our methods have been integrated into the DANCE package (https://github.com/OmicsML/dance).
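
The abstract does not spell out the architecture, but the core idea of applying a GNN to single-cell data can be illustrated with a minimal sketch: treat cells and features as two node sets of a bipartite graph, weight edges by the expression matrix, and propagate messages to obtain cell embeddings. The code below is an illustrative assumption only; the function names, graph construction, and (untrained) weights are not taken from scMoGNN or the DANCE package.

# Minimal sketch (assumption, not the authors' scMoGNN implementation):
# cells and features form a bipartite graph; a simple graph convolution
# with random weights produces one embedding per cell.
import numpy as np

def normalize_adjacency(a):
    """Symmetrically normalize an adjacency matrix: D^{-1/2} A D^{-1/2}."""
    deg = a.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def cell_feature_graph(x):
    """Build a cell-feature bipartite graph whose edge weights are the
    nonnegative expression values in x (cells x features)."""
    n_cells, n_feats = x.shape
    n = n_cells + n_feats
    a = np.zeros((n, n))
    a[:n_cells, n_cells:] = x      # cell -> feature edges
    a[n_cells:, :n_cells] = x.T    # feature -> cell edges
    a += np.eye(n)                 # self-loops
    return normalize_adjacency(a)

def gcn_embed(x, hidden_dim=16, n_layers=2, seed=0):
    """Run a few rounds of propagation with random (untrained) weights
    and return the embeddings of the cell nodes."""
    rng = np.random.default_rng(seed)
    n_cells, n_feats = x.shape
    a_hat = cell_feature_graph(x)
    h = np.eye(n_cells + n_feats)  # one-hot initial node features
    for _ in range(n_layers):
        w = rng.normal(scale=0.1, size=(h.shape[1], hidden_dim))
        h = np.maximum(a_hat @ h @ w, 0.0)  # propagate + ReLU
    return h[:n_cells]

if __name__ == "__main__":
    counts = np.random.default_rng(1).poisson(1.0, size=(50, 30)).astype(float)
    emb = gcn_embed(counts)
    print(emb.shape)  # (50, 16)

In an actual pipeline, the propagation weights would be trained against the task objective (e.g., predicting the held-out modality), rather than drawn at random as in this toy example.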
