Exploring Mobile Touch Interaction with Large Language Models

11 February 2025
Tim Zindulka
Jannek Sekowski
Florian Lehmann
Daniel Buschek
Abstract

Interacting with Large Language Models (LLMs) for text editing on mobile devices currently requires users to break out of their writing environment and switch to a conversational AI interface. In this paper, we propose to control the LLM via touch gestures performed directly on the text. We first chart a design space that covers fundamental touch input and text transformations. In this space, we then concretely explore two control mappings: spread-to-generate and pinch-to-shorten, with visual feedback loops. We evaluate this concept in a user study (N=14) that compares three feedback designs: no visualisation, text length indicator, and length + word indicator. The results demonstrate that touch-based control of LLMs is both feasible and user-friendly, with the length + word indicator proving most effective for managing text generation. This work lays the foundation for further research into gesture-based interaction with LLMs on touch devices.
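To make the two control mappings concrete, the following sketch shows one way a pinch/spread gesture on a text element could be translated into a "shorten" or "expand" request to a model. This is an illustrative sketch only, not the authors' implementation: it uses standard web TouchEvent handling, the 40-pixel threshold is an arbitrary assumption, and rewriteWithLLM is a hypothetical placeholder for whatever model call the editor would actually make.

// TypeScript sketch: map two-finger spread/pinch on a text element to
// "expand" / "shorten" requests. Illustrative only; rewriteWithLLM is a
// hypothetical stand-in for a real LLM API call.

type Direction = "expand" | "shorten";

// Hypothetical LLM call: would return a rewritten version of the text.
async function rewriteWithLLM(text: string, direction: Direction): Promise<string> {
  const instruction =
    direction === "expand"
      ? "Continue and elaborate on this text."
      : "Shorten this text while keeping its meaning.";
  console.log(`[LLM request] ${instruction}`);
  return text; // no-op in this sketch
}

// Euclidean distance between two touch points.
function distance(a: Touch, b: Touch): number {
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}

function attachPinchToRewrite(el: HTMLElement): void {
  let startDist = 0;
  let lastDist = 0;

  el.addEventListener("touchstart", (e: TouchEvent) => {
    if (e.touches.length === 2) {
      startDist = lastDist = distance(e.touches[0], e.touches[1]);
    }
  });

  el.addEventListener(
    "touchmove",
    (e: TouchEvent) => {
      if (startDist > 0 && e.touches.length === 2) {
        lastDist = distance(e.touches[0], e.touches[1]);
        e.preventDefault(); // keep the browser from zooming the page
      }
    },
    { passive: false }
  );

  el.addEventListener("touchend", async () => {
    if (startDist === 0) return;
    const delta = lastDist - startDist;
    startDist = 0;
    if (Math.abs(delta) < 40) return; // ignore small, accidental movements
    const direction: Direction = delta > 0 ? "expand" : "shorten";
    el.innerText = await rewriteWithLLM(el.innerText, direction);
  });
}

// Example usage: attach the gesture layer to an editable text region.
attachPinchToRewrite(document.getElementById("editor")!);

A visual feedback loop, as compared in the study (no visualisation, text length indicator, length + word indicator), would hook into the touchmove handler to preview the predicted text length before the gesture is committed.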

View on arXiv
@article{zindulka2025_2502.07629,
  title={Exploring Mobile Touch Interaction with Large Language Models},
  author={Tim Zindulka and Jannek Sekowski and Florian Lehmann and Daniel Buschek},
  journal={arXiv preprint arXiv:2502.07629},
  year={2025}
}