Estimating the nutritional content of meals from images is a promising but complex task in digital health and dietary monitoring. Manual food logging is burdensome and error-prone, which motivates automated systems that assess nutrition directly from visual input. In collaboration with the Department of Endocrinology and Diabetology at University Hospital Ulm, this thesis builds and evaluates models that estimate the nutritional value of meals from meal photographs, and validates those estimates physiologically against wearable sensor data, including continuous glucose monitoring (CGM), electrocardiography (ECG), and core body temperature measurements.
You will work with a dataset that pairs images of real meals with verified nutritional information and corresponding wearable data collected under free-living conditions. The project applies machine learning to the food images for item classification, portion estimation, and nutritional content prediction. In parallel, it analyzes the sensor data to model postprandial physiological responses and to assess how well the predicted nutrition aligns with the observed metabolic outcomes.
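To make the image branch concrete, the following is a minimal sketch of a multitask nutrition-prediction model in the spirit of the NutritionVerse-Direct reference, assuming a PyTorch/torchvision setup. The class name `MealNutritionNet`, the ResNet-18 backbone, the target list, and all hyperparameters are illustrative assumptions, not the project's actual design.

```python
# Hypothetical sketch (not the thesis implementation): a shared CNN backbone
# with one small regression head per nutritional target.
import torch
import torch.nn as nn
from torchvision import models


class MealNutritionNet(nn.Module):
    """Multitask regression from a meal photo to nutritional targets."""

    TARGETS = ("calories_kcal", "carbs_g", "protein_g", "fat_g")  # assumed targets

    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)  # pretrained weights optional
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()               # drop the classification head
        self.backbone = backbone
        # One head per target keeps the tasks decoupled on top of shared features
        self.heads = nn.ModuleDict(
            {t: nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))
             for t in self.TARGETS}
        )

    def forward(self, images: torch.Tensor) -> dict[str, torch.Tensor]:
        feats = self.backbone(images)
        return {t: head(feats).squeeze(-1) for t, head in self.heads.items()}


if __name__ == "__main__":
    model = MealNutritionNet()
    batch = torch.randn(4, 3, 224, 224)            # four dummy meal photos
    preds = model(batch)
    print({t: tuple(p.shape) for t, p in preds.items()})  # each head: (4,)
```

Whether to share a backbone across targets or train separate predictors is one of the design questions the thesis can explore empirically.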
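For the sensor branch, one common way to summarize a postprandial response is the incremental area under the CGM curve (iAUC) in a fixed window after the meal, which can then be compared against the image-predicted carbohydrate load. The sketch below is an illustrative assumption, not a prescribed method: it assumes 5-minute CGM sampling, a 2-hour window, and the first reading as the pre-meal baseline.

```python
# Hypothetical sketch: 2 h incremental AUC (iAUC) of a CGM trace above the
# pre-meal baseline, a standard postprandial glucose-response summary.
import numpy as np


def postprandial_iauc(glucose_mgdl: np.ndarray,
                      sample_interval_min: float = 5.0,
                      window_min: float = 120.0) -> float:
    """iAUC (mg/dL * min) over the window; readings start at meal time.

    The first reading serves as the pre-meal baseline; dips below the
    baseline are clipped to zero, following the usual iAUC convention.
    """
    n = int(window_min / sample_interval_min) + 1
    excursion = np.clip(glucose_mgdl[:n] - glucose_mgdl[0], 0.0, None)
    # Trapezoidal integration, written out to stay NumPy-version agnostic
    return float(sample_interval_min * (excursion[:-1] + excursion[1:]).sum() / 2.0)


if __name__ == "__main__":
    # Synthetic CGM trace: peak roughly 45 min after the meal, then decay
    t = np.arange(0.0, 125.0, 5.0)
    trace = 95.0 + 40.0 * np.exp(-((t - 45.0) / 30.0) ** 2)
    print(f"2 h iAUC: {postprandial_iauc(trace):.0f} mg/dL*min")
```

Correlating such response features with the predicted macronutrients is one possible way to operationalize the physiological validation described above.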
This thesis contributes to the development of automated, image-based dietary assessment tools that can support personalized nutrition and metabolic health monitoring with minimal user input.