Working Group "Personalized Digital Health and Telemedicine"
Virtual Interaction for Supporting Children with Social Anxiety Disorders (VISAKI)
VISAKI is a research project funded by the German Federal Ministry of Education and Research (BMBF). It aims to develop a virtual reality (VR) platform for children aged 8 to 12 who have been diagnosed with social anxiety disorders. By combining immersive VR experiences with gamification elements, the platform creates a safe, controlled environment in which children can practice social interactions, strengthen their social skills, and build resilience. The platform supports therapeutic processes through interactive scenarios such as multi-user group sessions and guided exposure exercises. It addresses gaps in traditional treatment, especially in rural areas with limited access to therapy, and supports continuous therapeutic engagement even during treatment breaks.
Role of the Krüger group at UKB
The research group led by Prof. Dr. Björn Krüger at the University Hospital Bonn (UKB) focuses on designing, developing, and evaluating avatars and multi-user interaction models for the VR platform. Their tasks include:
Avatar Design and Personalization: Creating avatars tailored to children’s therapeutic needs with attention to emotional expression, personalization, and inclusivity.
Game Design and Social Interactions: Developing engaging game mechanics and scenarios that facilitate real-time social interactions in virtual environments.
Data Analysis and Evaluation: Conducting in-depth analysis of avatar interactions, user behavior, and system usability to evaluate the effectiveness of the platform (see the sketch below).
Scientific Contribution: Collaborating with clinical and technical partners to align the platform’s design with therapeutic goals, contributing to academic research and dissemination through scientific publications and conferences.
These contributions ensure that the VISAKI platform meets the highest standards of scientific and clinical relevance while promoting its real-world applicability in pediatric mental health care.
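To make the evaluation task above concrete, here is a minimal, hypothetical sketch of how avatar-interaction events might be logged for later offline analysis. The event types, field names, and file format are illustrative assumptions, not the project's actual schema.

# Hypothetical sketch only: field names and event types are assumptions,
# not the VISAKI project's actual logging schema.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class InteractionEvent:
    session_id: str   # anonymized therapy-session identifier
    user_id: str      # pseudonymized participant ID
    event_type: str   # e.g. "gaze_contact", "utterance", "gesture"
    target: str       # avatar or object the event was directed at
    timestamp: str    # ISO 8601 timestamp, UTC

def log_event(path: str, event: InteractionEvent) -> None:
    """Append one event as a JSON line for later offline analysis."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(event)) + "\n")

# Example: record a child making eye contact with the therapist avatar.
log_event("session_log.jsonl", InteractionEvent(
    session_id="s-001",
    user_id="p-017",
    event_type="gaze_contact",
    target="avatar_therapist",
    timestamp=datetime.now(timezone.utc).isoformat(),
))

An append-only JSON-lines log like this keeps each session's record self-contained and easy to load into standard analysis tooling afterwards.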
Funded by: German Federal Ministry of Education and Research (BMBF)
Publications:
2025
Jansen, Anna; Morev, Nikita; Steininger, Melissa; Müllers, Johannes; Krüger, Björn
Synthetic Hand Dataset Generation: Multi-View Rendering and Annotation with Blender
Proceedings IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Forthcoming.
@conference{jansen2025c,
title = {Synthetic Hand Dataset Generation: Multi-View Rendering and Annotation with Blender},
author = {Anna Jansen and Nikita Morev and Melissa Steininger and Johannes Müllers and Björn Krüger},
year = {2025},
date = {2025-10-06},
booktitle = {Proceedings IEEE International Symposium on Mixed and Augmented Reality (ISMAR)},
abstract = {Pose estimation is a common method for precise hand tracking, which is important for natural interaction in virtual reality (VR). However, training those models requires large-scale datasets with accurate 3D annotations. Those are difficult to obtain due to the time-consuming data collection and the limited variety in captured scenarios. We present a work-in-progress Blender-based pipeline for generating synthetic multi-view hand datasets. Our system simulates Ultraleap Stereo IR 170-style images and extracts joint positions directly from a rigged hand model, eliminating the need for manual labeling or external tracking processes. The current pipeline version supports randomized static poses with per-frame annotations of joint positions, camera parameters, and rendered images. While extended hand variation, animation features, and different sensor-type simulations are still in progress, our pipeline already provides a flexible foundation for customizable dataset generation and reproducible hand-tracking model training.},
keywords = {},
pubstate = {forthcoming},
tppubtype = {conference}
}
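To picture the pipeline this abstract describes, here is a minimal, hypothetical Blender (bpy) sketch of the core loop: for each frame, joint positions are read from a rigged hand armature and one image is rendered per camera, with the camera parameters saved alongside. The object names ("HandRig", "Camera_L", "Camera_R") and the JSON output layout are illustrative assumptions, not the authors' actual implementation.

# Hypothetical sketch of the annotation/render loop; object names and the
# output layout are assumptions, not the authors' pipeline.
import json
import bpy

scene = bpy.context.scene
rig = bpy.data.objects["HandRig"]  # assumed name of the rigged hand armature

def joint_world_positions(armature):
    """World-space head position of every pose bone (one entry per joint)."""
    return {bone.name: list(armature.matrix_world @ bone.head)
            for bone in armature.pose.bones}

def camera_parameters(cam_obj):
    """Intrinsics and pose needed to reproject 3D joints into each view."""
    return {"focal_length_mm": cam_obj.data.lens,
            "sensor_width_mm": cam_obj.data.sensor_width,
            "matrix_world": [list(row) for row in cam_obj.matrix_world]}

annotations = []
for frame in range(scene.frame_start, scene.frame_end + 1):
    scene.frame_set(frame)
    record = {"frame": frame, "joints": joint_world_positions(rig), "cameras": {}}
    for cam_name in ("Camera_L", "Camera_R"):  # assumed stereo pair, IR-sensor style
        cam = bpy.data.objects[cam_name]
        scene.camera = cam
        scene.render.filepath = f"//renders/{cam_name}_{frame:04d}.png"
        bpy.ops.render.render(write_still=True)
        record["cameras"][cam_name] = camera_parameters(cam)
    annotations.append(record)

with open(bpy.path.abspath("//annotations.json"), "w") as f:
    json.dump(annotations, f, indent=2)

Saving the camera matrix per frame allows the 3D joint annotations to be reprojected into 2D keypoints for each rendered view during dataset post-processing.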
Steininger, Melissa; Jansen, Anna; Welle, Kristian; Krüger, Björn
Optimized Sensor Position Detection: Improving Visual Sensor Setups for Hand Tracking in VR
2025 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 2025.
@conference{steininger2025b,
title = {Optimized Sensor Position Detection: Improving Visual Sensor Setups for Hand Tracking in VR},
author = {Melissa Steininger and Anna Jansen and Kristian Welle and Björn Krüger},
year = {2025},
date = {2025-03-12},
urldate = {2025-03-12},
booktitle = {2025 IEEE Conference Virtual Reality and 3D User Interfaces (VR)},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}