Collecting Information Needs for Egocentric Visualizations while Running
Abstract
We investigate research challenges and opportunities for visualization in motion during outdoor physical activities via an initial corpus of real-world recordings that pair egocentric video, biometrics, and think-aloud observations. With the increasing use of tracking and recording devices, such as smartwatches and head-mounted displays, more and more data are available in real time about a person's activity and its context. However, not all data will be relevant all the time. Instead, athletes have information needs that change throughout their activity depending on the context and their performance. To address this challenge, we describe the collection of a diverse corpus of information needs paired with contextualizing audio, video, and sensor data. Next, we propose a first set of research challenges and design considerations that explore the difficulties of visualizing these real data needs in context, and we demonstrate a prototype tool for browsing, aggregating, and analyzing this information. Our ultimate goal is to understand and support embedding visualizations into outdoor contexts with changing environments and varying data needs.
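The corpus described above hinges on time-aligning heterogeneous streams: egocentric video, biometric samples, and transcribed think-aloud remarks. As a rough illustration of that pairing step only, here is a minimal Python sketch; the field names, sample values, and the `sample_at` helper are hypothetical and are not taken from the paper's actual tooling.

```python
# Minimal sketch (not the authors' tool): pairing timestamped biometric
# samples with think-aloud annotation segments from one recording session.
# All field names, data values, and helpers below are hypothetical.
from dataclasses import dataclass
from bisect import bisect_right

@dataclass
class Annotation:
    start_s: float  # segment start, seconds from session start
    end_s: float    # segment end
    text: str       # transcribed think-aloud utterance

def sample_at(timestamps: list[float], values: list[float], t: float) -> float:
    """Return the most recent sensor value at or before time t."""
    i = bisect_right(timestamps, t) - 1
    return values[max(i, 0)]

# Hypothetical session data: heart rate sampled once per second,
# plus two think-aloud segments voiced by the runner.
hr_t = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
hr_bpm = [132.0, 135.0, 141.0, 148.0, 151.0, 149.0]
annotations = [
    Annotation(1.2, 2.8, "How steep is the rest of this hill?"),
    Annotation(4.0, 5.0, "Am I still on my target pace?"),
]

# Pair each stated information need with the biometric context at that moment.
for a in annotations:
    mid = (a.start_s + a.end_s) / 2
    hr = sample_at(hr_t, hr_bpm, mid)
    print(f"[{a.start_s:.1f}-{a.end_s:.1f}s] HR~{hr:.0f} bpm: {a.text}")
```

In this sketch, annotation timestamps share the session clock with the sensor stream, so a segment's midpoint can index directly into the biometric record; a real pipeline would additionally need to synchronize clocks across devices and align the video track.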
Domains
Human-Computer Interaction [cs.HC]
Main file
1PVis2024_ImmersiveRunning.pdf (3.99 MB)
Elshabasi_2024_CollectingNeeds.png (416.2 KB)
Origin: Files produced by the author(s)
Format: Figure, Image