About



I'm a research scientist at Meta - Reality Labs working on augmented and virtual reality technologies. During my Ph.D. at CMU - SCS - HCII, I developed mobile health interventions that changed dynamically with the patient's context, disease progression, and preferences.

My work has resulted in over 1,100 citations and 19 publications at top venues (CHI, IMWUT/UbiComp, PerCom). I have received several distinctions, including the 2019 Microsoft Dissertation Grant, the 2017 Digital Health Fellowship from the CMU Center for Machine Learning and Health, a Best Paper Award at UbiComp 2016, and being a finalist for the 2016 Facebook Fellowship.


Research Overview



I started my research by developing sensing and machine learning techniques to detect basic human behaviors such as walking and sitting, then moved on to identifying people's states of mind, such as stress and interruptibility. Currently, I use those sensing techniques to inform mobile health interventions, applying reinforcement learning and human feedback to adapt interventions so that they maximize adherence and health outcomes.

Arrows indicate how some projects informed, or are follow-up work in, the same area. This diagram was inspired by Gierad Laput's research overview map.

Projects


Most of the projects below have resulted in influential, highly cited work at venues such as UbiComp, IEEE Transactions on Human-Machine Systems, and CHI.


Mobile Health Interventions


Reinforcement learning methods that autonomously adapt mobile health interventions to the user, using sensor data and human feedback.
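
To illustrate the kind of adaptation involved, here is a minimal, hypothetical sketch of an epsilon-greedy bandit that picks among intervention options and updates its estimates from adherence feedback. The intervention names and the reward definition are assumptions for illustration only, not the published method.

```python
# Minimal epsilon-greedy bandit sketch: pick one of several intervention
# variants and update its value estimate from observed adherence feedback.
# The intervention names and reward definition are illustrative assumptions.
import random

class InterventionBandit:
    def __init__(self, interventions, epsilon=0.1):
        self.interventions = list(interventions)
        self.epsilon = epsilon
        self.counts = {a: 0 for a in self.interventions}
        self.values = {a: 0.0 for a in self.interventions}  # running mean reward

    def choose(self):
        # Explore occasionally; otherwise exploit the best-looking option.
        if random.random() < self.epsilon:
            return random.choice(self.interventions)
        return max(self.interventions, key=lambda a: self.values[a])

    def update(self, intervention, reward):
        # reward could be 1.0 if the user adhered to the prompt, else 0.0.
        self.counts[intervention] += 1
        n = self.counts[intervention]
        self.values[intervention] += (reward - self.values[intervention]) / n

bandit = InterventionBandit(["walk_reminder", "breathing_exercise", "no_prompt"])
action = bandit.choose()
bandit.update(action, reward=1.0)  # e.g., the user completed the suggested activity
```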


Interruptibility Detection


Detecting from smartphone sensor data when users are available to click on a notification.


Stress Recognition


Detection of stress episodes from physiological signals (e.g., heart rate, breathing rate) while stationary or exercising (e.g., walking, running, cycling) at different effort levels.
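
As a rough illustration of working with such signals, the sketch below computes two common heart-rate-variability features (RMSSD and mean heart rate) from RR intervals. It is an assumed, simplified example, not the detection pipeline used in the papers.

```python
# Illustrative sketch only (not the published method): compute two common
# heart-rate-variability features from RR intervals (seconds between beats).
# Lower short-term variability (RMSSD) is often associated with stress.
import math

def hrv_features(rr_intervals):
    """rr_intervals: list of RR intervals in seconds."""
    diffs = [b - a for a, b in zip(rr_intervals, rr_intervals[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    mean_hr = 60.0 / (sum(rr_intervals) / len(rr_intervals))  # beats per minute
    return {"rmssd": rmssd, "mean_hr": mean_hr}

# Example: a short, fairly regular RR series (~75 bpm)
print(hrv_features([0.80, 0.82, 0.78, 0.81, 0.79, 0.80]))
```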


Activity Recognition


Recognition of activities of daily living (e.g., walking, falling, sitting) from accelerometer data.
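
For a sense of a typical pipeline, the sketch below segments tri-axial accelerometer data into fixed windows, extracts simple statistical features, and trains an off-the-shelf classifier. The window size, features, classifier, and data here are illustrative assumptions rather than the published system.

```python
# Minimal sketch of a typical activity-recognition pipeline (not the published
# system): window tri-axial accelerometer data, compute simple statistical
# features per window, and train a standard classifier on labeled windows.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc, window=128):
    """acc: (n_samples, 3) array of x/y/z acceleration. Returns (n_windows, 6)."""
    feats = []
    for start in range(0, len(acc) - window + 1, window):
        w = acc[start:start + window]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0)]))
    return np.array(feats)

# Hypothetical labeled data: one activity label per window.
acc = np.random.randn(1280, 3)                         # stand-in for real readings
labels = np.random.choice(["walking", "sitting"], size=10)

X = window_features(acc)
clf = RandomForestClassifier(n_estimators=50).fit(X, labels)
print(clf.predict(X[:3]))
```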


Interaction Techniques


Novel techniques for pointing and gesture recognition.


Data Science


Understanding smartphone users and data scientists.

Publications

