ResearchTrend.AI
arXiv:2109.13094
Inferring Facing Direction from Voice Signals

27 September 2021
Yu-Lin Wei
Rui Li
Abhinav Mehrotra
Romit Roy Choudhury
Nicholas D. Lane
    AAML
Abstract

Consider a home or office where multiple devices are running voice assistants (e.g., TVs, lights, ovens, refrigerators, etc.). A human user turns to a particular device and gives a voice command, such as "Alexa, can you ...". This paper focuses on the problem of detecting which device the user was facing, and therefore enabling only that device to respond to the command. Our core intuition emerges from the fact that human voice exhibits a directional radiation pattern, and the orientation of this pattern should influence the signal received at each device. Unfortunately, indoor multipath, unknown user location, and unknown voice signals pose critical hurdles. Through a new algorithm that estimates the line-of-sight (LoS) power from a given signal, combined with beamforming and triangulation, we design a functional solution called CoDIR. Results from 500+ configurations, across 5 rooms and 9 different users, are encouraging. While improvements are necessary, we believe this is an important step forward in a challenging but urgent problem space.
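The abstract mentions beamforming as one building block of CoDIR. As a rough illustration of that ingredient only (not the paper's algorithm, which additionally estimates LoS power and triangulates across devices), below is a minimal delay-and-sum beamforming sketch: a device with a small microphone array steers toward candidate angles and picks the angle whose time-aligned sum has the highest power. All parameters (4-mic linear array, 5 cm spacing, 16 kHz sampling, a 2 kHz test tone) are hypothetical choices for the demo, not values from the paper.

```python
import math

# Illustrative parameters (not from the paper).
C = 343.0        # speed of sound, m/s
FS = 16000       # sample rate, Hz
SPACING = 0.05   # mic spacing, m
N_MICS = 4
FREQ = 2000.0    # synthetic test tone, Hz

def mic_delay(mic_idx, angle_rad):
    """Extra arrival time (s) at mic `mic_idx` for a plane wave from `angle_rad`."""
    return mic_idx * SPACING * math.sin(angle_rad) / C

def simulate(angle_deg, n_samples=1024):
    """Synthesize a tone reaching each mic with its geometric delay."""
    a = math.radians(angle_deg)
    return [
        [math.sin(2 * math.pi * FREQ * (n / FS - mic_delay(m, a)))
         for n in range(n_samples)]
        for m in range(N_MICS)
    ]

def steered_power(signals, angle_deg):
    """Delay-and-sum: time-align channels toward `angle_deg`, sum, return mean power."""
    a = math.radians(angle_deg)
    shifts = [round(mic_delay(m, a) * FS) for m in range(N_MICS)]
    lo = max(0, -min(shifts))                    # skip samples a negative shift can't reach
    hi = len(signals[0]) - max(0, max(shifts))   # and samples a positive shift runs past
    total = 0.0
    for i in range(lo, hi):
        s = sum(signals[m][i + shifts[m]] for m in range(N_MICS))
        total += s * s
    return total / (hi - lo)

def estimate_doa(signals, step=5):
    """Scan candidate angles; the steered-power peak is the direction-of-arrival estimate."""
    return max(range(-90, 91, step), key=lambda a: steered_power(signals, a))
```

With a 15 cm aperture the angular resolution is coarse, which hints at why the paper needs more than beamforming alone: combining estimates from several devices (triangulation) and separating LoS energy from multipath are what make facing-direction inference feasible indoors.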
