<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Allgemein &#8211; Digital Health Bonn</title>
	<atom:link href="https://digital-health-bonn.de/category/allgemein/feed/" rel="self" type="application/rss+xml" />
	<link>https://digital-health-bonn.de</link>
	<description>Working Group &#34;Personalized Digital Health and Telemedicine&#34;</description>
	<lastBuildDate>Mon, 09 Mar 2026 15:51:15 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://digital-health-bonn.de/wp-content/uploads/2024/02/cropped-cropped-uni-ukb-logo-32x32.png</url>
	<title>Allgemein &#8211; Digital Health Bonn</title>
	<link>https://digital-health-bonn.de</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>Seven Contributions from Our Group at the DGfE 2026 Annual Meeting</title>
		<link>https://digital-health-bonn.de/seven-contributions-from-our-group-at-the-dgfe-2026-annual-meeting/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 09 Mar 2026 15:34:35 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[AI in epilepsy]]></category>
		<category><![CDATA[computer vision in healthcare]]></category>
		<category><![CDATA[contactless vital sign monitoring]]></category>
		<category><![CDATA[DGfE 2026]]></category>
		<category><![CDATA[digital biomarkers]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[eye tracking]]></category>
		<category><![CDATA[human pose estimation]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[seizure detection]]></category>
		<category><![CDATA[video-EEG monitoring]]></category>
		<category><![CDATA[wearable health monitoring]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=770</guid>

					<description><![CDATA[Researchers from the Personalized Digital Health and Telemedicine Group at the University Hospital Bonn (UKB) will present seven scientific contributions at the 64th Annual Meeting of the German Society for Epileptology (DGfE) in Würzburg. The accepted contributions highlight our group’s [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Researchers from the <strong>Personalized Digital Health and Telemedicine Group</strong> at the <strong>University Hospital Bonn (UKB)</strong> will present <strong>seven scientific contributions at the 64th Annual Meeting of the German Society for Epileptology (DGfE)</strong> in Würzburg.</p>



<p>The accepted contributions highlight our group’s interdisciplinary research at the intersection of <strong>artificial intelligence, digital health, computer vision, wearable sensing, and clinical epileptology</strong>. Our work focuses on the development of <strong>digital biomarkers and AI-based analysis methods</strong> to improve the monitoring, diagnosis, and understanding of epilepsy.</p>



<h3 class="wp-block-heading">Research Topics</h3>



<p>The presented studies cover several emerging directions in <strong>AI-driven epilepsy research</strong>, including:</p>



<ul class="wp-block-list">
<li><strong>Contactless measurement of vital parameters</strong> using camera-based sensing technologies</li>



<li><strong>3D human pose estimation for video-EEG monitoring</strong> to analyze seizure-related body movements</li>



<li><strong>Eye-tracking metrics for cognitive assessment in epilepsy</strong>, including contextualized analysis of visual search strategies</li>



<li><strong>Effects of anti-seizure medication on eye-tracking parameters</strong> in patients with epilepsy</li>



<li><strong>Eye tracking during the Trail Making Test</strong> and its association with depressive symptoms</li>



<li><strong>Secure and trustworthy AI models for seizure detection using wearable devices</strong></li>



<li><strong>Electrocardiographic changes under cenobamate therapy</strong> based on extended ECG recordings</li>
</ul>



<p>These contributions reflect ongoing collaborations between <strong>computer scientists, physicists, neurologists, and clinical researchers</strong> at the University Hospital Bonn and partner institutions.</p>



<p>A large part of the work has been carried out by <strong>PhD students and early-career researchers</strong>, demonstrating the strong integration of young scientists in our research activities.</p>



<p>We look forward to presenting our work in Würzburg and discussing these results with colleagues from the international <strong>epilepsy research and digital medicine community</strong>.</p>


<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><h3 class="tp_h3" id="tp_h3_2026">2026</h3><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Simsek, Koray;  Müllers, Johannes;  Surges, Rainer;  Krüger, Björn</p><p class="tp_pub_title">Kontaktloses kamerabasiertes Messen von Vitalparametern <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_123" class="tp_show" onclick="teachpress_pub_showhide('123','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_123" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{simsek2026a,<br />
title = {Kontaktloses kamerabasiertes Messen von Vitalparametern},<br />
author = {Koray Simsek and Johannes Müllers and Rainer Surges and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('123','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Jansen, Anna;  Steininger, Melissa;  Mustafa, Sarah Al-Haj;  Bouzan, Nataly;  Surges, Rainer;  Helmstaedter, Christoph; von Wrede, Randi;  Krüger, Björn</p><p class="tp_pub_title">Kontextualisierte Eye-Tracking-Metriken zur Charakterisierung von Suchstrategien bei Personen mit Epilepsie und Kontrollen <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_124" class="tp_show" onclick="teachpress_pub_showhide('124','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_124" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{nokey,<br />
title = {Kontextualisierte Eye-Tracking-Metriken zur Charakterisierung von Suchstrategien bei Personen mit Epilepsie und Kontrollen},<br />
author = {Anna Jansen and Melissa Steininger and Sarah Al-Haj Mustafa and Nataly Bouzan and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('124','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Greß, Hannah;  Daryakenari, Nazila Ahmadi;  Bungartz, Christian;  Viola, Felix;  Markwald, Marco;  Brüll, Gabriela;  Kumar, Uttam;  Ohm, Marc;  Surges, Rainer;  Meier, Michael;  Demidova, Elena;  Krüger, Björn</p><p class="tp_pub_title">Anforderungen an sichere KI-Modelle zur Anfallsdetektion mit Wearables in der Epileptologie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_125" class="tp_show" onclick="teachpress_pub_showhide('125','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_125" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{gress2026a,<br />
title = {Anforderungen an sichere KI-Modelle zur Anfallsdetektion mit Wearables in der Epileptologie},<br />
author = {Hannah Greß and Nazila Ahmadi Daryakenari and Christian Bungartz and Felix Viola and Marco Markwald and Gabriela Brüll and Uttam Kumar and Marc Ohm and Rainer Surges and Michael Meier and Elena Demidova and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('125','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Vetter, Jonas;  Müllers, Johannes;  Spurio, Federico;  Surges, Rainer;  Gall, Juergen;  Krüger, Björn</p><p class="tp_pub_title">Kontaktlose 3D-Human-Pose-Estimation im Video-EEG-Monitoring von Epilepsiepatienten <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_126" class="tp_show" onclick="teachpress_pub_showhide('126','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_126" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{vetter2026,<br />
title = {Kontaktlose 3D-Human-Pose-Estimation im Video-EEG-Monitoring von Epilepsiepatienten},<br />
author = {Jonas Vetter and Johannes Müllers and Federico Spurio and Rainer Surges and Juergen Gall and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('126','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Pukropski, Jan;  Keßler, Lisa; von Wrede, Randi;  Surges, Rainer;  Krüger, Björn</p><p class="tp_pub_title">Elektrokardiographische Veränderungen unter Cenobamat – eine retrospektive Prä-Post-Analyse aus verlängerten EKG-Ableitungen bei Patient*innen mit Epilepsie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_127" class="tp_show" onclick="teachpress_pub_showhide('127','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_127" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{Pukropski2026a,<br />
title = {Elektrokardiographische Veränderungen unter Cenobamat – eine retrospektive Prä-Post-Analyse aus verlängerten EKG-Ableitungen bei Patient*innen mit Epilepsie},<br />
author = {Jan Pukropski and Lisa Keßler and Randi von Wrede and Rainer Surges and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('127','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Steininger, Melissa;  Jansen, Anna;  Mustafa, Sarah Al-Haj;  Bouzan, Nataly;  Surges, Rainer;  Helmstaedter, Christoph; von Wrede, Randi;  Krüger, Björn</p><p class="tp_pub_title">Zusammenhänge zwischen Anfallssuppressiva und kontextualisierten Eye-Tracking-Metriken bei Menschen mit Epilepsie <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_128" class="tp_show" onclick="teachpress_pub_showhide('128','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_128" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{steininger2026c,<br />
title = {Zusammenhänge zwischen Anfallssuppressiva und kontextualisierten Eye-Tracking-Metriken bei Menschen mit Epilepsie},<br />
author = {Melissa Steininger and Anna Jansen and Sarah Al-Haj Mustafa and Nataly Bouzan and Rainer Surges and Christoph Helmstaedter and Randi von Wrede and Björn Krüger},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('128','tp_bibtex')">Close</a></p></div></div></div><div class="tp_publication tp_publication_conference"><div class="tp_pub_info"><p class="tp_pub_author"> Mustafa, Sarah Al-Haj;  Jansen, Anna;  Steininger, Melissa;  Müllers, Johannes;  Surges, Rainer;  Helmstaedter, Christoph;  Krüger, Björn; von Wrede, Randi</p><p class="tp_pub_title">Wer suchet, der findet: Eye Tracking beim Trail Making Test bei Epilepsie – Zusammenhänge mit depressiver Symptomatik <span class="tp_pub_type tp_  conference">Conference</span> <span class="tp_pub_label_status forthcoming">Forthcoming</span></p><p class="tp_pub_additional"><span class="tp_pub_additional_booktitle">64. Jahrestagung der Deutschen Gesellschaft für Epileptologie, </span>Forthcoming.</p><p class="tp_pub_menu"><span class="tp_bibtex_link"><a id="tp_bibtex_sh_129" class="tp_show" onclick="teachpress_pub_showhide('129','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_129" style="display:none;"><div class="tp_bibtex_entry"><pre>@conference{mustafa2026a,<br />
title = {Wer suchet, der findet: Eye Tracking beim Trail Making Test bei Epilepsie – Zusammenhänge mit depressiver Symptomatik},<br />
author = {Sarah Al-Haj Mustafa and Anna Jansen and Melissa Steininger and Johannes Müllers and Rainer Surges and Christoph Helmstaedter and Björn Krüger and Randi von Wrede},<br />
year  = {2026},<br />
date = {2026-06-13},<br />
urldate = {2026-06-13},<br />
booktitle = {64. Jahrestagung der Deutschen Gesellschaft für Epileptologie},<br />
keywords = {},<br />
pubstate = {forthcoming},<br />
tppubtype = {conference}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('129','tp_bibtex')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>New Publication: Activity-Based Emotion Detection Using Wearable Sensors and Transfer Learning</title>
		<link>https://digital-health-bonn.de/new-publication-activity-based-emotion-detection-using-wearable-sensors-and-transfer-learning/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Wed, 25 Feb 2026 15:40:24 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[activity recognition]]></category>
		<category><![CDATA[affective computing]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[emotion detection]]></category>
		<category><![CDATA[emotion recognition from movement]]></category>
		<category><![CDATA[human activity sensing]]></category>
		<category><![CDATA[machine learning]]></category>
		<category><![CDATA[transfer learning]]></category>
		<category><![CDATA[wearable IoT]]></category>
		<category><![CDATA[wearable machine learning]]></category>
		<category><![CDATA[wearable sensors]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=773</guid>

					<description><![CDATA[Understanding human emotions from everyday behavior is an important goal in digital health, affective computing, and wearable sensing. In our latest publication in the IEEE Internet of Things Journal, we investigate how movement data from wearable sensors can be used [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Understanding human emotions from everyday behavior is an important goal in <strong>digital health, affective computing, and wearable sensing</strong>. In our latest publication in the <strong>IEEE Internet of Things Journal</strong>, we investigate how <strong>movement data from wearable sensors</strong> can be used to infer emotional states.</p>



<p>The paper <strong>“From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems”</strong> explores how <strong>machine learning models trained for activity recognition can be adapted to detect emotional states</strong>. Using a <strong>cross-domain transfer learning approach</strong>, the study demonstrates how knowledge learned from physical activity data can be transferred to emotion recognition tasks.</p>
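<p><em>Illustrative sketch:</em> the transfer step described above, pretraining a model on activity data and then swapping in a fresh head that is fine-tuned on emotion labels, can be sketched as follows. This is a minimal toy example on synthetic data with hypothetical layer sizes and class counts; it is not the Jazbat-Net architecture, which is described in the paper itself.</p>

```python
# Toy sketch of cross-domain transfer learning for wearable signals.
# All names, shapes, and data here are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class TinyNet:
    """Two-layer net: a feature layer (transferable) plus a task head."""
    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0, 0.1, (n_hidden, n_out))

    def forward(self, X):
        self.H = relu(X @ self.W1)          # shared feature representation
        return softmax(self.H @ self.W2)    # task-specific head

    def fit(self, X, y, epochs=200, lr=0.5, freeze_features=False):
        Y = np.eye(self.W2.shape[1])[y]     # one-hot labels
        for _ in range(epochs):
            P = self.forward(X)
            G2 = self.H.T @ (P - Y) / len(X)            # head gradient
            if not freeze_features:
                GH = (P - Y) @ self.W2.T * (self.H > 0)
                self.W1 -= lr * (X.T @ GH / len(X))     # feature gradient
            self.W2 -= lr * G2

# 1) Pretrain on a (synthetic stand-in for a) large activity dataset:
#    6 inertial channels, 2 activity classes.
X_act = rng.normal(size=(512, 6))
y_act = (X_act[:, 0] > 0).astype(int)
net = TinyNet(n_in=6, n_hidden=16, n_out=2)
net.fit(X_act, y_act)

# 2) Transfer: keep the learned feature layer, attach a fresh head for
#    3 emotion classes, and fine-tune on a much smaller emotion dataset.
net.W2 = rng.normal(0, 0.1, (16, 3))
X_emo = rng.normal(size=(64, 6))
y_emo = rng.integers(0, 3, 64)
net.fit(X_emo, y_emo, freeze_features=True)

probs = net.forward(X_emo)
print(probs.shape)  # (64, 3)
```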



<p>Wearable devices continuously capture <strong>movement patterns, step dynamics, and physical activity signals</strong>, which provide valuable information about human behavior in everyday environments. By leveraging these signals, the proposed approach opens new opportunities for <strong>non-invasive emotion monitoring using wearable IoT systems</strong>.</p>



<p>Such technologies have potential applications in <strong>mental health monitoring, digital phenotyping, and personalized healthcare</strong>, where unobtrusive sensing and intelligent data analysis can support early detection of behavioral and emotional changes.</p>



<p>This publication also continues a <strong>long-standing research collaboration between Qaiser Riaz and Björn Krüger</strong>, focusing on <strong>machine learning methods for human movement analysis and wearable sensor systems</strong>.</p>



<p>The full publication can be accessed via IEEE Xplore:<br><a href="https://ieeexplore.ieee.org/document/11404152">https://ieeexplore.ieee.org/document/11404152</a></p>


<div class="teachpress_pub_list"><form name="tppublistform" method="get"><a name="tppubs" id="tppubs"></a></form><div class="teachpress_publication_list"><h3 class="tp_h3" id="tp_h3_2026">2026</h3><div class="tp_publication tp_publication_article"><div class="tp_pub_info"><p class="tp_pub_author"> Imran, Hamza Ali;  Riaz, Qaiser;  Hamza, Kiran;  Muhammad, Shaida;  Krüger, Björn</p><p class="tp_pub_title"><a class="tp_title_link" onclick="teachpress_pub_showhide('122','tp_links')" style="cursor:pointer;">From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems</a> <span class="tp_pub_type tp_  article">Journal Article</span> </p><p class="tp_pub_additional"><span class="tp_pub_additional_in">In: </span><span class="tp_pub_additional_journal">IEEE Internet of Things Journal, </span><span class="tp_pub_additional_pages">pp. 1-1, </span><span class="tp_pub_additional_year">2026</span>.</p><p class="tp_pub_menu"><span class="tp_abstract_link"><a id="tp_abstract_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_abstract')" title="Show abstract" style="cursor:pointer;">Abstract</a></span> | <span class="tp_resource_link"><a id="tp_links_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_links')" title="Show links and resources" style="cursor:pointer;">Links</a></span> | <span class="tp_bibtex_link"><a id="tp_bibtex_sh_122" class="tp_show" onclick="teachpress_pub_showhide('122','tp_bibtex')" title="Show BibTeX entry" style="cursor:pointer;">BibTeX</a></span></p><div class="tp_bibtex" id="tp_bibtex_122" style="display:none;"><div class="tp_bibtex_entry"><pre>@article{Imran2026a,<br />
title = {From Steps to Sentiments: Cross-Domain Transfer Learning for Activity-Based Emotion Detection in Wearable IoT Systems},<br />
author = {Hamza Ali Imran and Qaiser Riaz and Kiran Hamza and Shaida Muhammad and Björn Krüger},<br />
doi = {10.1109/JIOT.2026.3666469},<br />
year  = {2026},<br />
date = {2026-02-20},<br />
urldate = {2026-02-20},<br />
journal = {IEEE Internet of Things Journal},<br />
pages = {1-1},<br />
abstract = {Context-aware, gait-based sentiment analysis and emotion perception is an emerging research area within Internet of Things (IoT), aiming to make smart systems more intuitive and responsive. Recognizing emotions from wearable inertial sensor data is challenging due to subtle and compound emotional cues, variability across individuals and contexts, and limited, imbalanced datasets. To address these challenges, we propose Jazbat-Net, a lightweight neural network that leverages Transfer Learning (TL). The model is first trained on a large-scale, publicly available multi-activity dataset collected using wearable inertial sensors, and then retrained on a multi-class emotion dataset, effectively transferring knowledge from the pretraining phase. We evaluate Jazbat-Net with and without TL, across both smartwatch and smartphone based data, and for input dimensions ranging from 1D to 6D. The best results are achieved when pretrained on smartphone-based activity data and retrained on smartphone-based emotion data using a 1D input size. The proposed model attains an average classification accuracy of 95%, with a precision score of 95%, a recall score of 97%, and an F1-score of 96%. Moreover, Jazbat-Net achieves a low theoretical time complexity and requires only ≈ 6.96 M Multiply–Accumulate Operations (MACs), which is about 95% fewer computations than the previous State-of-the-Art (SOTA) model. Its space complexity is also low, with a model size of only ≈ 110 KB and peak activation memory of ≈ 0.35 MB. On-device evaluation on a Xiaomi 13T smartphone demonstrates that Jazbat-Net achieves a median inference latency of only ≈ 90.96 ms with a TFLite 32-bit floating point precision (FP32) model size of just ≈ 0.158 MB, making it ≈ 20× smaller and ≈ 20% faster than the previous SOTA model while maintaining comparable accuracy.<br />
},<br />
keywords = {},<br />
pubstate = {published},<br />
tppubtype = {article}<br />
}<br />
</pre></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_bibtex')">Close</a></p></div><div class="tp_abstract" id="tp_abstract_122" style="display:none;"><div class="tp_abstract_entry">Context-aware, gait-based sentiment analysis and emotion perception is an emerging research area within Internet of Things (IoT), aiming to make smart systems more intuitive and responsive. Recognizing emotions from wearable inertial sensor data is challenging due to subtle and compound emotional cues, variability across individuals and contexts, and limited, imbalanced datasets. To address these challenges, we propose Jazbat-Net, a lightweight neural network that leverages Transfer Learning (TL). The model is first trained on a large-scale, publicly available multi-activity dataset collected using wearable inertial sensors, and then retrained on a multi-class emotion dataset, effectively transferring knowledge from the pretraining phase. We evaluate Jazbat-Net with and without TL, across both smartwatch and smartphone based data, and for input dimensions ranging from 1D to 6D. The best results are achieved when pretrained on smartphone-based activity data and retrained on smartphone-based emotion data using a 1D input size. The proposed model attains an average classification accuracy of 95%, with a precision score of 95%, a recall score of 97%, and an F1-score of 96%. Moreover, Jazbat-Net achieves a low theoretical time complexity and requires only ≈ 6.96 M Multiply–Accumulate Operations (MACs), which is about 95% fewer computations than the previous State-of-the-Art (SOTA) model. Its space complexity is also low, with a model size of only ≈ 110 KB and peak activation memory of ≈ 0.35 MB. 
On-device evaluation on a Xiaomi 13T smartphone demonstrates that Jazbat-Net achieves a median inference latency of only ≈ 90.96 ms with a TFLite 32-bit floating point precision (FP32) model size of just ≈ 0.158 MB, making it ≈ 20× smaller and ≈ 20% faster than the previous SOTA model while maintaining comparable accuracy.<br />
</div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_abstract')">Close</a></p></div><div class="tp_links" id="tp_links_122" style="display:none;"><div class="tp_links_entry"><ul class="tp_pub_list"><li><i class="ai ai-doi"></i><a class="tp_pub_list" href="https://dx.doi.org/10.1109/JIOT.2026.3666469" title="Follow DOI:10.1109/JIOT.2026.3666469" target="_blank">doi:10.1109/JIOT.2026.3666469</a></li></ul></div><p class="tp_close_menu"><a class="tp_close" onclick="teachpress_pub_showhide('122','tp_links')">Close</a></p></div></div></div></div></div>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Abstracts Accepted for International AI in Epilepsy Conference</title>
		<link>https://digital-health-bonn.de/abstracts-accepted-for-international-ai-in-epilepsy-conference/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Mon, 22 Dec 2025 16:14:09 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=711</guid>

					<description><![CDATA[We are pleased to share some good news at the end of the year: two abstracts from our group have been accepted for the 4th International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders, which will take place in [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We are pleased to share some good news at the end of the year: two abstracts from our group have been accepted for the <strong>4th International Conference on Artificial Intelligence in Epilepsy and Neurological Disorders</strong>, which will take place in Puerto Rico from <strong>March 16–19, 2026</strong>.</p>



<p>The accepted contributions are:</p>



<ul class="wp-block-list">
<li><em>Linking Higher-Level Eye Tracking Metrics to High-Impact Antiseizure Medication in Epilepsy Patients</em></li>



<li><em>Higher-Level Eye Tracking Metrics Reveal Search Behaviour Differences in Persons with Epilepsy vs. Healthy Controls</em></li>
</ul>



<p>Anna Jansen and Melissa Steininger will present this work at the conference.</p>



<p>We look forward to continuing our research on digital health and AI in epilepsy in the coming year and wish everyone a restful holiday season.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Björn Krüger gave a keynote at the Symposium on AI in Medicine &#038; Food Science</title>
		<link>https://digital-health-bonn.de/bjorn-kruger-gave-a-keynote-at-the-symposium-on-ai-in-medicine-food-science/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Thu, 11 Dec 2025 08:22:29 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=691</guid>

					<description><![CDATA[Prof. Björn Krüger was invited to deliver a keynote at the Symposium on AI in Medicine &#38; Food Science in Bonn (https://albarqouni.github.io/funded/symposium/).His talk, titled “Seeing, Sensing, Securing: AI in Modern Medicine,” explored current and emerging applications of AI in clinical [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Prof. <a href="/univ-prof-dr-rer-nat-bjorn-kruger">Björn Krüger</a> was invited to deliver a keynote at the <em>Symposium on AI in Medicine &amp; Food Science</em> in Bonn (<a href="https://albarqouni.github.io/funded/symposium/">https://albarqouni.github.io/funded/symposium/</a>).<br>His talk, titled <strong>“Seeing, Sensing, Securing: AI in Modern Medicine,”</strong> explored current and emerging applications of AI in clinical practice and highlighted ongoing research activities within his group.</p>



<figure class="wp-block-image size-large"><img fetchpriority="high" decoding="async" width="1024" height="768" src="https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c-1024x768.jpg" alt="" class="wp-image-692" style="aspect-ratio:1;object-fit:cover" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c-1024x768.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c-300x225.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c-768x576.jpg 768w, https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c-1536x1152.jpg 1536w, https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c-360x270.jpg 360w, https://digital-health-bonn.de/wp-content/uploads/2025/12/WhatsApp-Bild-2025-12-11-um-06.10.53_b28abb5c.jpg 1600w" sizes="(max-width: 1024px) 100vw, 1024px" /></figure>



<p>The symposium brought together an international community of researchers, offering a rich setting for scientific exchange. Prof. Krüger particularly valued the discussions that followed his keynote, which touched on both technical developments and broader questions surrounding trustworthy AI in medicine.</p>



<p>The event was organized by <a href="https://albarqouni.github.io/authors/admin/">Shadi Albarqouni</a>, whose efforts created an inspiring environment for interdisciplinary dialogue.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The VIRTOSHA Project at MEDICA 2025</title>
		<link>https://digital-health-bonn.de/the-virtosha-project-at-medica-2025/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Wed, 19 Nov 2025 07:15:02 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[medica]]></category>
		<category><![CDATA[virtosha]]></category>
		<category><![CDATA[virtual reality]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=669</guid>

					<description><![CDATA[Yesterday, Anna Jansen and Kristian Welle presented our VIRTOSHA project in the Innovation#Area at the NRW state booth at MEDICA. In a short talk, Kristian outlined the project’s goals and technical approach. A particular highlight was the drilling simulation demo [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Yesterday, Anna Jansen and Kristian Welle presented our <a href="https://digital-health-bonn.de/virtosha/" data-type="page" data-id="109">VIRTOSHA </a>project in the Innovation#Area at the NRW state booth at MEDICA.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img decoding="async" width="2190" height="1643" data-id="673" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-2.webp" alt="" class="wp-image-673" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-2.webp 2190w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-2-300x225.webp 300w" sizes="(max-width: 2190px) 100vw, 2190px" /></figure>



<figure class="wp-block-image size-large"><img decoding="async" width="768" height="1024" data-id="670" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1-768x1024.webp" alt="" class="wp-image-670" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1-768x1024.webp 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1-225x300.webp 225w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1.webp 1232w" sizes="(max-width: 768px) 100vw, 768px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="768" height="1024" data-id="671" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-768x1024.webp" alt="" class="wp-image-671" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-768x1024.webp 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-225x300.webp 225w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3-1152x1536.webp 1152w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-3.webp 1349w" sizes="auto, (max-width: 768px) 100vw, 768px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="672" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1024x768.webp" alt="" class="wp-image-672" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/image-1024x768.webp 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-300x225.webp 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-768x576.webp 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/image-360x270.webp 360w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<p>In a short talk, Kristian outlined the project’s goals and technical approach. A particular highlight was the drilling simulation demo with a haptic arm, which provided highly realistic tactile feedback and sparked strong interest among visitors.</p>



<p>VIRTOSHA fits seamlessly into the HealthTech.NRW focus area of high-tech surgery – an environment where robots, lasers, and VR are already part of everyday clinical practice in NRW, significantly enhancing patient safety and treatment quality.</p>



<p>Many thanks to all visitors for the inspiring conversations and positive feedback!</p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Panel “AI for Health and Health for AI”</title>
		<link>https://digital-health-bonn.de/panel-ai-for-health-and-health-for-ai/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Tue, 28 Oct 2025 10:52:00 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[conference]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=676</guid>

					<description><![CDATA[Prof. Björn Krüger was invited to participate in the panel “AI for Health and Health for AI” at the University of Saskatchewan. In his talk, “AI for Health: From Targeted Algorithms to Real-Life Impact,” he presented current research from the [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Prof. Björn Krüger was invited to participate in the panel <em>“AI for Health and Health for AI”</em> at the University of Saskatchewan. In his talk, <strong>“AI for Health: From Targeted Algorithms to Real-Life Impact,”</strong> he presented current research from the Department of Epileptology at the University Hospital Bonn, where his group develops AI models to understand human motion and cognition in real-world healthcare contexts.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="678" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-1024x768.jpeg" alt="" class="wp-image-678" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-1024x768.jpeg 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-300x225.jpeg 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-768x576.jpeg 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-1536x1152.jpeg 1536w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793-360x270.jpeg 360w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412793.jpeg 1696w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="677" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-1024x768.jpeg" alt="" class="wp-image-677" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-1024x768.jpeg 1024w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-300x225.jpeg 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-768x576.jpeg 768w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939-360x270.jpeg 360w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1761326412939.jpeg 2000w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



<p>His contribution highlighted ongoing projects ranging from wearable-based motion analysis—using lightweight models for activity and emotion recognition—to eye-tracking studies in epilepsy that investigate how cognitive changes and medication side effects manifest in eye movements.</p>



<p>The event offered a valuable opportunity for exchange with fellow panelists <strong>Stephen Lee</strong> (College of Medicine, University of Saskatchewan) and <strong>Raymond Ng</strong> (Data Science Institute / AI &amp; Health Network, University of British Columbia), as well as <strong>Daniel Fuller</strong> (University of Saskatchewan), who did an excellent job moderating and organizing the panel.</p>



<p>The discussion underscored a central theme of his work: <strong>AI in healthcare is not only about accuracy—its true impact lies in improving quality of life, supporting clinicians, and empowering patients.</strong></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>University of Bonn’s “F*ckUp Night”</title>
		<link>https://digital-health-bonn.de/university-of-bonns-fckup-night/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Tue, 23 Sep 2025 13:17:00 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[outreach]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=681</guid>

					<description><![CDATA[Prof. Björn Krüger recently spoke at the F*ckUp Night hosted at the International Club of the University of Bonn, sharing personal experiences of failures, unexpected career turns, and rejected papers—topics that rarely appear in academic presentations but resonate strongly with [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Prof. Björn Krüger recently spoke at the <strong>F*ckUp Night</strong> hosted at the International Club of the University of Bonn, sharing personal experiences of failures, unexpected career turns, and rejected papers—topics that rarely appear in academic presentations but resonate strongly with researchers at all stages.</p>



<figure class="wp-block-image size-full"><img loading="lazy" decoding="async" width="800" height="800" src="https://digital-health-bonn.de/wp-content/uploads/2025/11/1760354411637.jpeg" alt="" class="wp-image-682" srcset="https://digital-health-bonn.de/wp-content/uploads/2025/11/1760354411637.jpeg 800w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1760354411637-150x150.jpeg 150w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1760354411637-300x300.jpeg 300w, https://digital-health-bonn.de/wp-content/uploads/2025/11/1760354411637-768x768.jpeg 768w" sizes="auto, (max-width: 800px) 100vw, 800px" /></figure>



<p>In his talk, he emphasized three key insights drawn from these experiences:</p>



<ol class="wp-block-list">
<li><strong>Peer review is unpredictable.</strong> Individual reviews often reflect randomness; they should not define one’s self-worth.</li>



<li><strong>Failures can spark better ideas.</strong> Rejections frequently lead to rethinking, refining, and ultimately creating stronger work.</li>



<li><strong>Career detours are valuable.</strong> Unexpected paths can lead to the places where one truly belongs.</li>
</ol>



<p>The event featured a series of inspiring contributions from speakers who shared their own stories of setback, learning, and resilience. </p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>🎉 Exciting News to Close the Year! 🎉</title>
		<link>https://digital-health-bonn.de/%f0%9f%8e%89-exciting-news-to-close-the-year-%f0%9f%8e%89/</link>
		
		<dc:creator><![CDATA[Björn Krüger]]></dc:creator>
		<pubDate>Thu, 19 Dec 2024 13:22:02 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<category><![CDATA[Publications]]></category>
		<category><![CDATA[digital health]]></category>
		<category><![CDATA[epilepsy]]></category>
		<category><![CDATA[machine learning]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=526</guid>

					<description><![CDATA[As the year comes to an end, we’re thrilled to share some updates from our research group: several of our submissions have been accepted for conferences in 2025! Accepted Contributions – Dreiländertagung Epilepsie 2025 in Salzburg:✅ Digital Transformation of Blood [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>As the year comes to an end, we’re thrilled to share some updates from our research group: several of our submissions have been accepted for conferences in 2025!</p>



<p>Accepted Contributions – Dreiländertagung Epilepsie 2025 in Salzburg:<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Digital Transformation of Blood Sample Management<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> EpiEye – Effects of Seizure-Suppressive Medications on Eye Movements and Autonomic Changes in Epilepsy<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Establishing an Epileptological Teleconsultation Between Siegen Hospital and University Hospital Bonn<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> ANNE – Eye-Tracking for Detecting Side Effects in Epilepsy<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Semiological Characteristics of Temporal Lobe Epilepsy with GAD65 Antibodies</p>



<p>Accepted Contributions – International Conference on Artificial Intelligence in Epilepsy and Other Neurological Disorders 2025:<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Eye-Tracking Reveals Search Behaviour in Epilepsy Patients<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Prediction Models on Eye-Tracking Data in Epilepsy</p>



<p>Accepted Contribution – Wissenschaftliche Tagung Autismus-Spektrum (WTAS):<br><img src="https://s.w.org/images/core/emoji/17.0.2/72x72/2705.png" alt="✅" class="wp-smiley" style="height: 1em; max-height: 1em;" /> Multi-Stream Analysis for Robust Head Gesture Classification in Natural Social Interaction: Leveraging High-Resolution Sensor Data for Optimized Visual Classification</p>



<p>We sincerely thank all contributors for their dedication and hard work. We look forward to engaging discussions, inspiring exchanges, and exciting collaborations at these upcoming conferences!</p>



<p>Wishing everyone a successful and inspiring year ahead!</p>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Eye-Tracking Workshop at the UKB</title>
		<link>https://digital-health-bonn.de/eye-tracking-workshop-at-the-ukb/</link>
		
		<dc:creator><![CDATA[Hannah Greß]]></dc:creator>
		<pubDate>Mon, 02 Dec 2024 13:20:27 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=501</guid>

					<description><![CDATA[On November 22 we held an interdisciplinary eye-tracking workshop at the UKB to share experience with eye-tracking technology from Pupil Labs, Tobii, and Eyelink. We had talks from Philip Büchel, Tim Guth and Laura Nett (AG Kunz, UKB), Zuzanna Laudańska [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>On November 22 we held an interdisciplinary eye-tracking workshop at the UKB to share experience with eye-tracking technology from Pupil Labs, Tobii, and EyeLink. <br>We had talks from Philip Büchel, Tim Guth, and Laura Nett (<a href="https://www.ukbonn.de/epileptologie/arbeitsgruppen/ag-kunz-kognitive-und-translationale-neurowissenschaften/" data-type="link" data-id="https://www.ukbonn.de/epileptologie/arbeitsgruppen/ag-kunz-kognitive-und-translationale-neurowissenschaften/">AG Kunz</a>, UKB), Zuzanna Laudańska and <a href="https://www.klinikum.uni-heidelberg.de/personen/dr-rer-nat-martin-schulte-ruether-14102" data-type="link" data-id="https://www.klinikum.uni-heidelberg.de/personen/dr-rer-nat-martin-schulte-ruether-14102">Martin Schulte-Rüther</a> (University Hospital Heidelberg), Anna Jansen, Johannes Müllers, and Melissa Steiniger (our group), Sanna Stroth (<a href="https://www.uni-marburg.de/de/fb20/bereiche/zpg/kjp/forschung/ag-as/copy_of_wie-alles-begann" data-type="link" data-id="https://www.uni-marburg.de/de/fb20/bereiche/zpg/kjp/forschung/ag-as/copy_of_wie-alles-begann">AG Autismus-Spektrum</a>, Philipps-Universität Marburg), <a href="https://www.thm.de/iem/dennis-m-poepperl" data-type="link" data-id="https://www.thm.de/iem/dennis-m-poepperl">Dennis Pöpperl</a> (Technische Hochschule Mittelhessen &#8211; University of Applied Sciences, THM), and Berkan Koyak (<a href="https://www.dzne.de/en/research/research-areas/clinical-research/research-groups/aziz/group-members/" data-type="link" data-id="https://www.dzne.de/en/research/research-areas/clinical-research/research-groups/aziz/group-members/">Population &amp; Clinical Neuroepidemiology</a>, DZNE). In the afternoon, our speakers and attendees had the opportunity to experiment in three hands-on workshops focusing on live data retrieval from IMU sensors and eye trackers, analysis of eye-tracking data in Python, and data analysis with the Pupil Labs Cloud. </p>



<p>The full program of the workshop can be found <a href="https://digital-health-bonn.de/eye-tracking-workshop/" data-type="link" data-id="https://digital-health-bonn.de/eye-tracking-workshop/">here</a>.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-3 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="516" src="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-1024x768.jpg" alt="" class="wp-image-516" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-1024x768.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-1536x1152.jpg 1536w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-2048x1536.jpg 2048w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-360x270.jpg 360w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-300x225.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_112209-1-768x576.jpg 768w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="518" src="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-1024x768.jpg" alt="" class="wp-image-518" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-1024x768.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-1536x1152.jpg 1536w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-2048x1536.jpg 2048w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-360x270.jpg 360w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-300x225.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_131221-768x576.jpg 768w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="519" src="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-1024x768.jpg" alt="" class="wp-image-519" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-1024x768.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-1536x1152.jpg 1536w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-2048x1536.jpg 2048w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-360x270.jpg 360w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-300x225.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_121822-768x576.jpg 768w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>



<figure class="wp-block-image size-large"><img loading="lazy" decoding="async" width="1024" height="768" data-id="517" src="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-1024x768.jpg" alt="" class="wp-image-517" srcset="https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-1024x768.jpg 1024w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-1536x1152.jpg 1536w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-2048x1536.jpg 2048w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-360x270.jpg 360w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-300x225.jpg 300w, https://digital-health-bonn.de/wp-content/uploads/2024/12/IMG_20241122_105411-2-768x576.jpg 768w" sizes="auto, (max-width: 1024px) 100vw, 1024px" /></figure>
</figure>



]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>VIRTOSHA in the General-Anzeiger</title>
		<link>https://digital-health-bonn.de/virtosha-in-the-general-anzeiger/</link>
		
		<dc:creator><![CDATA[Hannah Greß]]></dc:creator>
		<pubDate>Sun, 17 Nov 2024 10:00:00 +0000</pubDate>
				<category><![CDATA[Allgemein]]></category>
		<guid isPermaLink="false">https://digital-health-bonn.de/?p=474</guid>

					<description><![CDATA[&#8220;Damit die Fehler nur in der Simulation passieren&#8221; (&#8220;So that mistakes only happen in the simulation&#8221;; German, paywall) was the headline of the General-Anzeiger in its recent weekend edition, providing insight into our VIRTOSHA project. Based on an interview with [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><a href="https://ga.de/news/wissen-und-bildung/regional/bonn-ukb-entwickelt-vr-training-fuer-chirurgen_aid-121126657" data-type="link" data-id="https://ga.de/news/wissen-und-bildung/regional/bonn-ukb-entwickelt-vr-training-fuer-chirurgen_aid-121126657">&#8220;Damit die Fehler nur in der Simulation passieren&#8221;</a> (&#8220;So that mistakes only happen in the simulation&#8221;; German, paywall) was the headline of the General-Anzeiger in its recent weekend edition, providing insight into our VIRTOSHA project. Based on an interview with <a href="https://www.ukbonn.de/en/epileptology/workgroups/wg-krueger-personlized-digital-health/">Prof. Björn Krüger</a> and <a href="https://www.ortho-unfall-bonn.de/klinik/team/oberarzte/">Dr. Kristian Welle</a> (both UKB), the article outlines the current state of the art as well as the goals and potential of the project, which are to be realized over the next three years.</p>



]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
