
Bidirectional Attention Flow for Machine Comprehension

5 November 2016
Minjoon Seo
Aniruddha Kembhavi
Ali Farhadi
Hannaneh Hajishirzi
arXiv:1611.01603
Abstract

Machine Comprehension (MC), answering questions about a given context, requires modeling complex interactions between the context and the query. Recently, attention mechanisms have been successfully extended to MC. Typically these mechanisms use attention to summarize the query and context into a single vector, couple attentions temporally, and often form a uni-directional attention. In this paper we introduce the Bidirectional Attention Flow (BIDAF) model, a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization. Our experimental evaluations show that our model achieves state-of-the-art results on the Stanford Question Answering Dataset (SQuAD) and the CNN/DailyMail cloze test.
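The sketch below illustrates the bidirectional attention layer the abstract describes: a similarity matrix between context and query encodings drives both context-to-query and query-to-context attention, and the attended vectors are merged into a query-aware context representation without summarizing either side into a single vector. The similarity function and output concatenation follow the paper's attention layer, but the function names, dimension sizes, and random inputs are illustrative assumptions, not the authors' implementation.

```python
# Minimal NumPy sketch of bidirectional attention flow (context-to-query and
# query-to-context attention). Assumes 2d-dimensional context/query encodings;
# all names and sizes here are illustrative.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U, w_sim):
    """H: context encodings (T, 2d); U: query encodings (J, 2d);
    w_sim: similarity weights (6d,). Returns query-aware context G of shape (T, 8d)."""
    T = H.shape[0]
    J = U.shape[0]
    # Similarity matrix S[t, j] = w_sim . [h_t; u_j; h_t * u_j]
    S = np.empty((T, J))
    for t in range(T):
        for j in range(J):
            feat = np.concatenate([H[t], U[j], H[t] * U[j]])
            S[t, j] = w_sim @ feat
    # Context-to-query attention: an attended query vector for each context word.
    a = softmax(S, axis=1)            # (T, J)
    U_tilde = a @ U                   # (T, 2d)
    # Query-to-context attention: weight context words by their best query match.
    b = softmax(S.max(axis=1))        # (T,)
    h_hat = b @ H                     # (2d,)
    H_hat = np.tile(h_hat, (T, 1))    # (T, 2d)
    # Merge into the query-aware context representation G = [H; U~; H*U~; H*H^].
    G = np.concatenate([H, U_tilde, H * U_tilde, H * H_hat], axis=1)
    return G

# Toy usage with assumed dimensions (d = 4, so encodings are 2d = 8 wide).
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))   # 5 context words
U = rng.standard_normal((3, 8))   # 3 query words
w = rng.standard_normal(24)       # 6d similarity weights
print(bidaf_attention(H, U, w).shape)  # (5, 32)
```

Because every context position keeps its own attended query vector (rather than collapsing the query into one summary vector), the later modeling layers can still reason over the full sequence, which is the "no early summarization" property the abstract emphasizes.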
