STRAUSS: Sonification Tools & Resources for Analysis Using Sound Synthesis

2 April 2025
J. Trayford, Samantha Youles, C. M. Harrison, Rose Shepherd, Nicolas Bonne
Abstract

Sonification, or conveying data using non-verbal audio, is a relatively niche but growing approach for presenting data across multiple specialist domains, including astronomy, climate science, and beyond. The STRAUSS Python package provides such a tool, building upon previous approaches to offer a powerful means of exploring different ways of expressing data, with fine control over the output audio and its format. STRAUSS is a free, open source (FOSS) package, designed to allow flexible and effective sonification to be integrated into data workflows, in analogy to widely used visualisation packages. The remit of STRAUSS is broad; it is intended to bridge between ad-hoc solutions for sonifying very particular datasets and highly technical compositional and sound-design tools that are not optimised for sonification or may have a steep learning curve. The code offers a range of approaches to sonification for a variety of contexts (e.g. science education, science communication, and technical data analysis). To this end, STRAUSS is packaged with a number of examples of different sonification approaches, as well as preset configurations, to support a "low-barrier, high-ceiling" approach. STRAUSS has been used to produce both educational resources and analysis tools.
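The core idea behind sonification, mapping data values onto properties of sound such as pitch, loudness, or timing, can be illustrated with a short, self-contained sketch. The snippet below does not use the STRAUSS API; it is a minimal, hypothetical illustration of a data-to-pitch mapping, written with only NumPy and the Python standard library (names such as data_to_tone and write_wav are invented here).

# Minimal, self-contained illustration of the parameter-mapping idea behind
# sonification (data values -> pitch). This is NOT the STRAUSS API; the
# function names below are hypothetical.
import wave
import numpy as np

def data_to_tone(values, duration=0.25, f_lo=220.0, f_hi=880.0, rate=44100):
    """Map each data value to a short sine tone whose pitch scales with the value."""
    values = np.asarray(values, dtype=float)
    # Normalise the data to [0, 1], then map linearly onto the frequency range.
    norm = (values - values.min()) / (values.ptp() or 1.0)
    freqs = f_lo + norm * (f_hi - f_lo)
    t = np.linspace(0.0, duration, int(rate * duration), endpoint=False)
    # Short linear fade in/out (10 ms) on each tone to avoid clicks.
    fade = np.minimum(1.0, np.minimum(t, duration - t) / 0.01)
    tones = [np.sin(2 * np.pi * f * t) * fade for f in freqs]
    return np.concatenate(tones), rate

def write_wav(path, signal, rate):
    """Write a mono float signal in [-1, 1] to a 16-bit PCM WAV file."""
    pcm = np.int16(np.clip(signal, -1.0, 1.0) * 32767)
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

if __name__ == "__main__":
    # Example: sonify a noisy sine curve so rising values are heard as rising pitch.
    data = np.sin(np.linspace(0, 4 * np.pi, 32)) + 0.1 * np.random.randn(32)
    signal, rate = data_to_tone(data)
    write_wav("sonification_demo.wav", signal, rate)

In STRAUSS itself, mappings like this are configured through the package's bundled examples and preset configurations rather than written from scratch; the sketch above only conveys the underlying principle.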

View on arXiv: https://arxiv.org/abs/2504.01660
@article{trayford2025_2504.01660,
  title={STRAUSS: Sonification Tools \& Resources for Analysis Using Sound Synthesis},
  author={James W. Trayford and Samantha Youles and Chris Harrison and Rose Shepherd and Nicolas Bonne},
  journal={arXiv preprint arXiv:2504.01660},
  year={2025}
}