MIntRec: A New Dataset for Multimodal Intent Recognition
arXiv: 2209.04355, 9 September 2022
Hanlei Zhang, Hua Xu, Xin Eric Wang, Qianrui Zhou, Shaojie Zhao, Jiayan Teng

Papers citing "MIntRec: A New Dataset for Multimodal Intent Recognition" (9 papers)

Multimodal Transformers are Hierarchical Modal-wise Heterogeneous Graphs (02 May 2025)
Yijie Jin, Junjie Peng, Xuanchao Lin, Haochen Yuan, Lan Wang, Cangzhi Zheng

Can Large Language Models Help Multimodal Language Analysis? MMLA: A Comprehensive Benchmark (23 Apr 2025)
Hanlei Zhang, Zhuohang Li, Yeshuang Zhu, Hua Xu, Peiwu Wang, Haige Zhu, Jie Zhou, Jinchao Zhang

Identifying User Goals from UI Trajectories (20 Jun 2024)
Omri Berkovitch, Sapir Caduri, Noam Kahlon, Anatoly Efros, Avi Caciularu, Ido Dagan

SDIF-DA: A Shallow-to-Deep Interaction Framework with Data Augmentation for Multi-modal Intent Detection (31 Dec 2023)
Shijue Huang, Libo Qin, Bingbing Wang, Geng Tu, Ruifeng Xu

UniMSE: Towards Unified Multimodal Sentiment Analysis and Emotion Recognition (21 Nov 2022)
Guimin Hu, Ting-En Lin, Yi Zhao, Guangming Lu, Yuchuan Wu, Yongbin Li

MAAS: Multi-modal Assignation for Active Speaker Detection (11 Jan 2021)
Juan Carlos León Alcázar, Fabian Caba Heilbron, Ali K. Thabet, Guohao Li

Efficient Intent Detection with Dual Sentence Encoders (10 Mar 2020)
I. Casanueva, Tadas Temčinas, D. Gerz, Matthew Henderson, Ivan Vulić

A Stack-Propagation Framework with Token-Level Intent Detection for Spoken Language Understanding (05 Sep 2019)
Libo Qin, Wanxiang Che, Yangming Li, Haoyang Wen, Ting Liu

Multimodal Compact Bilinear Pooling for Visual Question Answering and Visual Grounding (06 Jun 2016)
Akira Fukui, Dong Huk Park, Daylen Yang, Anna Rohrbach, Trevor Darrell, Marcus Rohrbach