Let's Get Vysical: Perceptual Accuracy In Visual and Tactile Encodings

8 August 2023
Zhongzhen Xu
Kristin Williams
Emily Wall
arXiv: 2308.04392 (abs) · PDF · HTML
Abstract

In this paper, we explore the effectiveness of tactile data encodings rendered on swell paper in comparison to visual encodings displayed as SVGs for data perception tasks. By replicating and adapting Cleveland and McGill's graphical perception study for the tactile modality, we establish a novel tactile encoding hierarchy. In a study with 12 university students, we found that participants compared values and judged their ratios more accurately with visual encodings than with tactile encodings, with lower cognitive load and better self-evaluated performance. However, tactile encodings differed from their visual counterparts in how accurately values could be decoded from them. This suggests that data physicalizations will require different design guidance than that developed for visual encodings. By providing empirical evidence for the perceptual accuracy of tactile encodings, our work contributes to foundational research on forms of data representation that prioritize tactile perception, such as tactile graphics.
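For reference, replications of Cleveland and McGill's graphical perception study conventionally score each judgment with their log absolute error measure, log2(|judged − true| + 1/8). The abstract does not state which accuracy measure this paper adopts, so the sketch below is only illustrative of that conventional metric; the function name and example values are invented for the illustration.

    import math

    def cm_log_error(judged_percent: float, true_percent: float) -> float:
        """Cleveland & McGill's log absolute error: log2(|judged - true| + 1/8).

        Both arguments are percentage judgments ("what percent of the larger
        value is the smaller one?"). Higher scores mean less accurate judgments;
        the 1/8 term keeps a perfect judgment from producing log2(0).
        """
        return math.log2(abs(judged_percent - true_percent) + 1 / 8)

    # Example: a participant estimates 40% when the true ratio is 50%.
    print(cm_log_error(40, 50))  # about 3.34

Averaging this score per encoding type is what yields the kind of accuracy hierarchy the abstract refers to, with lower mean error indicating a more precisely decodable encoding.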
