Research

Traditional cognitive psychology often studies attention in a vacuum: static stimuli on a 2D screen, requiring simple button presses. While this approach has produced fundamental models of selection, it ignores a crucial reality: the real world is dynamic, multisensory, and requires active physical interaction.

My research program, Naturalistic Attention Dynamics, bridges this gap by combining the rigor of experimental control with the ecological validity of complex, real-world tasks.

The Core Question:

How do humans select and process information when the environment is dynamic, multisensory, and full of distractions?


Complexity & Interaction (Beyond the Screen)

In the real world, attention is not just about where we look, but how we interact. I move beyond standard computer tasks to investigate attention in Virtual Reality (VR) and through “Haptic Foraging” – searching for and interacting with real, physical objects.

  • 3D & Depth: How does attention change when we search in a 3D space rather than a 2D plane?
  • Physical Effort: How does the cost of physical movement influence our decision to attend to or ignore information?
  • The Goal: To establish a “baseline” for healthy attentional behavior in complex environments, which is essential for understanding deviations in clinical or applied settings.

Psychophysiology as a Marker

Behavior (e.g., reaction time) tells us what happened; physiology tells us how, and at what cost. My lab uses advanced Eye-Tracking and Pupillometry to quantify internal cognitive states.

  • The Pupil as a Window: I utilize pupillometry as a non-invasive marker for activity in the Locus Coeruleus-Norepinephrine system.
  • Cognitive Load & Arousal: By analyzing pupil dilation and gaze patterns, we can objectively measure mental effort and stress in real time, independently of subjective self-reports.
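As an illustration of how task-evoked pupil dilation can be turned into a load marker, the sketch below applies subtractive baseline correction, a standard preprocessing step in pupillometry. The sampling rate, window length, and the trace itself are illustrative assumptions, not data from my lab:

```python
# Minimal sketch: subtractive baseline correction for a pupil trace.
# Sampling rate, baseline window, and the synthetic trace are assumptions.
import numpy as np

def baseline_corrected_dilation(pupil, sample_rate_hz=60, baseline_ms=500):
    """Subtract the mean pupil size in a pre-stimulus baseline window
    from the whole trace; larger positive values indicate stronger
    task-evoked dilation (a common proxy for effort/arousal)."""
    n_baseline = int(sample_rate_hz * baseline_ms / 1000)
    baseline = np.nanmean(pupil[:n_baseline])  # nan-safe: blinks are often NaN
    return pupil - baseline

# Illustrative trace: 0.5 s baseline at ~3.0 mm, then a slow dilation to 3.4 mm.
trace = np.concatenate([np.full(30, 3.0), np.linspace(3.0, 3.4, 60)])
corrected = baseline_corrected_dilation(trace)
peak_dilation = np.nanmax(corrected)  # ~0.4 mm above baseline
```

Baseline correction matters because absolute pupil size varies strongly between people and lighting conditions; only the change relative to a pre-stimulus baseline is interpretable as a cognitive signal.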

Application: Digital Health & Human-Tech Interaction

Understanding the fundamental dynamics of attention allows us to design better systems and diagnostics. My research translates these basic insights into applied domains:

  • E-Learning & Focus: Analyzing eye-movement synchronicity to predict learning success and optimize digital education formats.
  • Digital Phenotyping: Developing algorithms that use gaze data in VR to identify cognitive states like fatigue or overload, paving the way for adaptive assistance systems.
  • Safety & Performance: Investigating “Multiple Target Search” scenarios to reduce errors in high-stakes environments, such as medical monitoring or surveillance.
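A simple way to operationalize eye-movement synchrony, as used in research on attention during digital lectures, is to correlate one viewer's gaze trace with the group average: viewers who look where most others look tend to be more engaged. The sketch below uses synthetic gaze data and a plain Pearson correlation; both are illustrative assumptions, not my lab's actual pipeline:

```python
# Minimal sketch of gaze synchrony: correlate an individual's horizontal
# gaze trace with the group-average trace. All data here are synthetic.
import numpy as np

def gaze_synchrony(gaze_x, group_mean_x):
    """Pearson correlation between one viewer's horizontal gaze trace
    and the group-average trace; values near 1 mean the viewer looks
    where most others look at the same moments."""
    return float(np.corrcoef(gaze_x, group_mean_x)[0, 1])

rng = np.random.default_rng(0)
group = np.sin(np.linspace(0, 10, 300))        # shared, stimulus-driven signal
attentive = group + rng.normal(0, 0.2, 300)    # follows the group closely
distracted = rng.normal(0, 1.0, 300)           # gaze uncoupled from stimulus

# The attentive viewer scores much higher than the distracted one.
high = gaze_synchrony(attentive, group)
low = gaze_synchrony(distracted, group)
```

In practice, such a per-viewer synchrony score can then feed a classifier or regression model predicting learning outcomes or flagging states like fatigue and overload.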

By integrating behavioral paradigms with mobile psychophysiology, my goal is to build a more complete model of human cognition – one that holds true not just in the lab, but in the complexity of everyday life.