
Ayca Aygun

MA, USA

🎓 Ph.D. in Computer Science (Tufts University, 2025) with 10+ years of applied research experience at the intersection of Human-Computer Interaction (HCI), cognitive science, and physiological sensing. My work is driven by a core goal: to better understand human physiological and mental states through computational methods and real-time sensing technologies.

🧠 I specialize in physiological signal processing and multimodal cognitive state tracking, leveraging signals such as PPG, ECG, EEG, eye gaze, respiration, and skin conductance. These capabilities support applications in adaptive robotics, human-system interaction, and intelligent user interfaces.

🧪 I have led the end-to-end design and execution of human-participant studies, including IRB protocol development, participant recruitment, experimental design, data collection, and analysis. My research has been conducted in collaboration with interdisciplinary teams and funded by major organizations such as AFOSR, NSF, NIH, and PATHS-UP.

📊 Technically proficient in Python, R, C#, SQL, PL/SQL, JavaScript, and MATLAB, with deep experience in time-series modeling, statistical analysis, data visualization, machine learning, and deep learning. I also bring strong skills in UX research, sensor-based algorithm development, and cross-functional project management.

👩‍🏫 In parallel to my research, I have served as a teaching assistant for undergraduate and graduate-level courses in Electrical Circuits and Machine Learning, and as a guest lecturer for Introduction to Cognitive & Brain Science. I have also mentored students across multiple labs on both theoretical concepts and practical research methods.

📚 My work has been published in top venues including IEEE JBHI, IEEE EMBS BHI, ICMI, Sensors, and CogSci.

About

Ph.D. in Computer Science (Tufts University, 2025) with 10+ years of applied research experience at the intersection of HCI, cognitive science, and physiological sensing. Led end-to-end human-participant studies, including IRB protocols, recruitment, experimental design, data collection, and cross-functional project management. Developed real-time multimodal systems for cognitive state tracking using PPG, ECG, EEG, eye gaze, respiration, and skin conductance, with applications in adaptive robotics and human-system interaction. Extensive experience in Python, R, MATLAB, C#, SQL, and JavaScript for time-series modeling, statistical analysis, visualization, machine learning, and deep learning. Skilled in study design, UX research, and sensor-based algorithm development. Published in top venues (IEEE JBHI, ICMI, CogSci) and led projects funded by AFOSR, NSF, NIH, and PATHS-UP, regularly presenting findings to senior stakeholders and funding agencies.

Experience

Graduate Research/Teaching Assistant

Human-Robot Interaction Lab., Tufts University

Jun 2021 – Jun 2025 · Medford, MA · Full-Time

Real-Time Cognitive Workload Estimation in VR HRI Teams: Led the design and development of a real-time multimodal HRI platform with two humans and two robots working within a Unity3D VR space station. Developed eye-gaze–based workload prediction integrated into the DIARC robotic architecture to enable adaptive robot behavior. Demonstrated that online cognitive-state monitoring improves team coordination and performance during high-demand tasks. Related paper submitted to AAMAS 2026, currently under review.

Multimodal Cognitive State Estimation in Driving Simulation: Conducted detailed analysis of eye-gaze, EEG, and arterial blood pressure signals to estimate workload during multitasking driving scenarios. Showed pupil diameter to be the strongest standalone workload predictor, outperforming multimodal signal combinations. Built ML and statistical pipelines to evaluate classification accuracy under varied task conditions. Results published in Sensors, CogSci, and ACM ICMI, with the dataset paper accepted to Nature Scientific Data (first author, in press).

Modeling Interactions Between Systemic Cognitive States: Investigated how workload, sense of urgency, and mind wandering interact during multitasking. Using eye-gaze–based PCPS metrics, demonstrated that consecutive urgency events significantly elevate workload. Provided foundational evidence of dynamic cognitive-state interactions relevant to human–robot teaming and neurodiverse populations.

VACP-Based Workload Modeling from Eye Gaze: Designed a Visual–Auditory–Cognitive–Psychomotor (VACP) workload model derived from eye-gaze behavior in driving simulations. Quantified how pupil dilation corresponds to VACP levels and revealed strong correlations between pupil metrics and cognitive workload, supporting VACP as a viable framework for multitasking environments.
Proactive Robot Behaviors for Enhanced Team Performance: Explored how proactive, active, and reactive robot behaviors influence human workload in HRI teams. Developed eye-gaze–based workload evaluation algorithms, applied advanced preprocessing, and performed statistical analyses showing that proactive robot behaviors reduce human workload more effectively than shared mental models alone.

Single-Trial ERP-Based Workload Prediction: Developed methods for extracting single-trial ERPs (N1, N2, P3) from EEG signals during dual-task driving experiments. Implemented bandpass filtering, ICA-based blink removal, and Kalman smoothing to isolate trial-specific neural responses, enabling fine-grained, real-time cognitive workload modeling.

Robot-Assisted Cognitive Workload and Learning: Designed a robot-assisted sign-language learning system to study how gesture-based interactions influence cognitive workload and memory. Found that well-structured robot feedback enhances learning and supports the development of personalized robotic tutoring approaches, including applications for neurodiverse learners.

Graduate Teaching Assistant, Introduction to Machine Learning: Supported course delivery for two semesters (Spring 2024 and Spring 2025) for 220+ undergraduate and graduate students. Collaborated with the instructor to prepare assignments, projects, and exams; graded assessments; developed the course website and syllabus; and held weekly office hours to help students grasp core ML/DL concepts.
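As a small illustration of the ERP preprocessing mentioned above, a zero-phase bandpass filter is the usual first step before artifact removal and smoothing. The cutoff frequencies, sampling rate, and test signal below are illustrative assumptions, not the study's actual settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs, low=0.5, high=30.0, order=4):
    """Zero-phase Butterworth bandpass, a common first step in ERP extraction.

    filtfilt applies the filter forward and backward so the ERP peak
    latencies (N1, N2, P3) are not shifted by filter delay.
    """
    nyquist = fs / 2.0
    b, a = butter(order, [low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, signal)

# Illustrative input: a 10 Hz component buried in broadband noise.
fs = 250.0
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
clean = bandpass(x, fs)
```

In a real pipeline this would be followed by ICA to remove blink components and per-trial smoothing, as described above.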

Graduate Research Assistant

Texas A&M University

Jan 2018 – May 2021 · College Station, TX · Full-Time

Networked Multiagent Systems Lab., Modeling Infectious Disease Transmission in Meta-population Networks: Developed and analyzed a structured meta-population model to study how local preparedness and awareness levels affect outbreak size across sequentially connected localities. Showed that when each locality adapts based on its immediate neighbor, rather than the origin, outbreak impacts are minimized and distant localities experience significantly fewer cases.

Embedded Signal Processing Lab., Robust IBI and HRV Estimation from Wearable Cardiac Sensors: Developed a robust method to accurately predict interbeat interval (IBI) and heart rate variability (HRV) from wearable cardiac signals affected by motion artifacts. The approach combines two key algorithmic components: (1) a shortest-path heartbeat detection algorithm leveraging morphological features and time-continuity of heartbeats, and (2) a fusion technique that integrates estimates from multiple features for improved accuracy. This work was supported by NIH, NSF, and PATHS-UP, and validated on PPG and ECG data under exercise conditions.

Integrated Neuro-Prosthesis Lab., Biomedical Signal Feature Extraction and Classification of Bowel Sounds: Recorded long-duration bowel sounds using a specialized stethoscope and analyzed the signals to identify mass peristalsis events and their timing. Developed methods to extract key features and classify bowel sounds, enabling non-invasive monitoring of gastrointestinal activity. This work provides valuable insights for bedridden patients, supporting better clinical assessment and patient care.

Senior Software Developer

ING Bank

Jun 2016 – Aug 2017 · Istanbul, Turkiye · Full-Time

Focus: Design and development of TIBCO/Oracle web services, user interfaces, and database-side stored procedures used in intranet applications.

Senior Middleware Systems Engineer

Akbank

Mar 2015 – May 2016 · Istanbul, Turkiye · Full-Time

Focus: Administration of TIBCO/Oracle web services. Development of shell scripts used to monitor application servers.

Middleware Systems Engineer

Turk Telekom

Jun 2012 – Mar 2015 · Istanbul, Turkiye · Full-Time

Focus: Administration of WebLogic Server 11g/8.1 and SOA Suite 11g, including development and deployment of applications.

Software Developer

I2I Systems

Feb 2011 – Nov 2011 · Istanbul, Turkiye · Full-Time

Focus: Development of software packages related to billing and rating systems.

Software Developer

Ericsson

Jul 2009 – Feb 2011 · Istanbul, Turkiye · Full-Time

Focus: Development of software packages related to customer relationship management (CRM) and human resources management (HRM) systems.

Education

Ph.D. in Computer Science

Tufts University

Jun 2021 – Jun 2025

Dissertation: Multimodal Physiological Signal Processing for Cognitive State Estimation in Real-Time HRI.

M.S. in Electrical and Computer Engineering

Texas A&M University

Jan 2018 – May 2021

Interbeat Interval and Heart Rate Variability Estimation Method from Various Morphological Features using Wearable Sensors

M.S. in Biomedical Engineering

Istanbul Technical University

Sep 2010 – May 2015

Analysis of Matching Media Effect on Microwave Brain Stroke Imaging via a Spherically Symmetrical Head Model

B.S. in Mathematical Engineering

Istanbul Technical University

Sep 2005 – May 2010

Model Parameter Estimates from Dynamic PET Data

Skills

Programming

Python (Expert) · MATLAB (Expert) · R (Advanced) · C/C++ (Advanced) · C# (Advanced) · SQL (Advanced) · Java (Advanced) · Git (Expert) · Docker (Advanced)

Sensor Algorithms & Signal Processing

Resampling (Expert) · Spectral/Wavelet Features (Expert) · SQI Estimation (Expert) · Motion-Artifact Suppression (Expert) · Peak Detection (Expert) · Segmentation (Expert) · Kalman/Particle Filtering Fusion (Expert) · HRV Metrics (RMSSD, LF/HF) (Expert) · Morphological Feature Extraction Techniques (Expert)

ML & Modeling

SVM (Expert) · kNN (Expert) · RF (Expert) · HMM (Expert) · CNN/RNN/LSTM (Expert) · Self-Supervised Time-Series Learning (Advanced) · Statistical Validation (ANOVA, Bland-Altman, Tukey's HSD Multiple Pairwise Analysis, t-test) (Expert)

Real-Time Systems

ROS (Expert) · Unity (Advanced) · SteamVR (Advanced) · Kaldi ASR (Advanced) · NAOqi (Advanced) · Data Streaming (Expert) · Timestamp Synchronization with LSL (Advanced)

Deployment

MATLAB/Simulink Integration (Expert) · Python Package Development (Expert) · Real-time Data Streaming (Expert) · ROS-based Testing (Advanced) · Version Control (Git) (Expert)

Research Methodologies

Study Design (Expert) · UX Research (Expert) · Sensor-based Algorithm Development (Expert)

Database/Middleware

Oracle (Advanced) · TIBCO (Advanced)

Architecture

Distributed Systems (Advanced)

Projects

Real-Time Cognitive Workload in Multi-Agent HRI

VR · Eye Gaze Tracking · Robot Control · Multimodal HRI · Machine Learning · Real-time Systems

Led a 2-year study developing a real-time multimodal HRI platform in a VR space station with two humans and two robots. Implemented cognitive-workload prediction from eye gaze integrated into robot control, enabling adaptive behaviors that improved team performance under dynamic, high-demand tasks. Related paper submitted to AAMAS 2026, currently under review.

Multi-Modal Cognitive State Assessment in Driving

EEG · Arterial Blood Pressure · Eye-Gaze Signals · Driving Simulation · Machine Learning · Signal Processing

Analyzed EEG, arterial blood pressure, and eye-gaze signals to predict human cognitive workload in a multitasking driving simulation. Demonstrated that pupil diameter alone was the most reliable predictor, validating efficient real-time workload estimation approaches. Results published in Sensors, CogSci, and ACM ICMI, with the dataset paper accepted to Nature Scientific Data (first author, in press).

Robot-Assisted Sign-Language Learning

NAO Robot · Gesture Recognition · Cognitive Workload · Adaptive Systems

Designed a NAO-robot tutoring platform to study gesture-based interactions and cognitive workload during sign language training. Results showed adaptive robot feedback can optimize learning and memory, informing personalized robotic tutor design.

Motion-Robust HRV and IBI Estimation

PPG · ECG · Signal Processing · Algorithm Development · HRV · IBI · Machine Learning

Developed a robust algorithm for estimating IBI and HRV from motion-corrupted wearable cardiac signals using a shortest-path heartbeat detection and multi-feature fusion approach. Validated on PPG and ECG data under exercise conditions; funded by NIH, NSF, and PATHS-UP. Results published in IEEE JBHI and presented at IEEE EMBS BHI.
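To illustrate the HRV side of this work (the metric being estimated, not the shortest-path or fusion algorithms themselves), RMSSD can be computed directly from a sequence of interbeat intervals. The interval values below are made-up examples:

```python
import numpy as np

def rmssd(ibi_ms):
    """Root mean square of successive differences (RMSSD), a standard
    time-domain HRV metric computed from interbeat intervals in ms."""
    ibi = np.asarray(ibi_ms, dtype=float)
    diffs = np.diff(ibi)          # beat-to-beat changes
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical interbeat intervals (milliseconds) from a beat detector.
print(rmssd([800, 810, 790, 805, 795]))  # → 14.3614...
```

Motion artifacts corrupt the detected beat positions, which is why robust IBI estimation matters: errors of even a few milliseconds per beat propagate directly into metrics like this one.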

Disease Transmission Dynamics Modeling

MATLAB · Python · Disease Modeling · Simulation · Network Analysis

Modeled disease transmission dynamics over meta-population networks; implemented in MATLAB/Python, validated on simulated mobility datasets to evaluate control-policy efficiency.
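A minimal sketch of the meta-population idea, assuming a discrete-time SIR model over a chain of localities with spillover only from the immediate neighbor; all parameter values are hypothetical, not those of the study:

```python
import numpy as np

def simulate_chain_sir(n_localities=3, beta=0.3, gamma=0.1,
                       coupling=0.01, steps=500):
    """Discrete-time SIR over sequentially connected localities.

    Each locality has its own S/I/R fractions; a small fraction of
    infection pressure spills over from the locality to its left.
    Returns the final epidemic size (recovered fraction) per locality.
    """
    S = np.ones(n_localities)
    I = np.zeros(n_localities)
    R = np.zeros(n_localities)
    I[0] = 0.01            # seed the outbreak in the first locality
    S[0] -= 0.01
    for _ in range(steps):
        # local transmission plus spillover from the left neighbor
        pressure = beta * I + coupling * np.roll(I, 1)
        pressure[0] = beta * I[0]   # locality 0 has no left neighbor
        new_inf = np.minimum(S, pressure * S)
        new_rec = gamma * I
        S -= new_inf
        I += new_inf - new_rec
        R += new_rec
    return R

final_sizes = simulate_chain_sir()
```

The study's actual model additionally varies preparedness and awareness per locality; this sketch only shows the sequential-coupling structure that makes downstream outbreaks depend on upstream dynamics.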

Bowel-Sound Event Detection Pipeline

Machine Learning · Signal Processing · Stethoscope Recordings

Developed a machine-learning pipeline for detecting bowel-sound events, such as mass peristalsis, from stethoscope recordings.

Languages

English

Expert · Professional

Turkish

Expert · Native

French

Beginner · Elementary

Awards

CogSci Family Grant

Tufts/CogSci

Jan 2024

Departmental Travel Grant

Tufts

Jan 2024

Graduate Merit Scholarship

TAMU

Jan 2020

NSF Registration Award

NSF

Jan 2019

IEEE EMBS Travel Grant

IEEE EMBS

Jan 2019

Selected Publications

Assessment of Multiple Systemic Human Cognitive States using Eye Gaze

A. Aygun, T. Nguyen, M. Scheutz, CogSci, 2024.

Estimating Systemic Cognitive States from a Mixture of Physiological and Brain Signals

M. Scheutz, S. Aeron, A. Aygun, et al., Topics in Cognitive Science, 2023.

Cognitive Workload Assessment via Eye Gaze and EEG in an Interactive Driving Task

A. Aygun, B. Lyu, T. Nguyen, S. Aeron, M. Scheutz, ICMI, 2022.

Robust Interbeat Interval and HRV Estimation from Motion-Distorted PPG/ECG

A. Aygun, H. Ghasemzadeh, R. Jafari, IEEE JBHI, 2019.

Professional Service

Reviewer

HCI Conference, HRI Conference, THRI, IEEE JBHI, Nature Scientific Reports, IEEE TBioCAS, PLOS ONE, CogSci 2025, Psychophysiology

Guest Lecturer

Introduction to Cognitive & Brain Science, Tufts University (2025)

Mentor

Sophia Gu, Helena Fu, Acelya Deniz Gungordu, Peiman Mohseni (2019–Present)
