Data labelling with OpenFace for emotion recognition
Created a multimodal engagement dataset for analyzing student behavior during online learning sessions. Designed an experimental setup in which students watched curated video clips selected to elicit specific emotional and engagement responses. Collected synchronized signals including facial expressions, gaze direction, and temporal behavior patterns. Used OpenFace to automatically extract facial action units, gaze vectors, and head pose features from the session recordings (see the sketches below). Implemented annotation pipelines to label engagement levels and emotional states over temporal video segments. Structured the dataset so that facial features, gaze signals, and timestamps are aligned per segment for training engagement prediction models.
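The feature-extraction step can be reproduced by driving OpenFace's FeatureExtraction binary over each session recording. Below is a minimal sketch assuming a standard OpenFace 2.x build; the binary location and directory layout are hypothetical and need adjusting to the local setup:

```python
import subprocess
from pathlib import Path

# Hypothetical paths; point these at your OpenFace build and data layout.
OPENFACE_BIN = Path("/opt/OpenFace/build/bin/FeatureExtraction")
VIDEO_DIR = Path("data/session_videos")
OUT_DIR = Path("data/openface_output")

def extract_features(video_path: Path, out_dir: Path) -> None:
    """Run OpenFace on one session video, writing a per-frame CSV of
    action units (-aus), gaze vectors (-gaze), and head pose (-pose)."""
    out_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        [str(OPENFACE_BIN), "-f", str(video_path),
         "-out_dir", str(out_dir),
         "-aus", "-gaze", "-pose"],
        check=True,
    )

for video in sorted(VIDEO_DIR.glob("*.mp4")):
    extract_features(video, OUT_DIR)
```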
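OpenFace writes one per-frame CSV per video, from which the signals used for labelling can be filtered before annotation. A sketch of the loading step, assuming OpenFace 2.x column names and a hypothetical confidence threshold of 0.8:

```python
import pandas as pd

def load_openface_csv(csv_path: str) -> pd.DataFrame:
    """Load one OpenFace per-frame CSV and keep the signals used for
    labelling: timestamps, gaze angles, head rotation, and action-unit
    intensities (the AU*_r columns)."""
    df = pd.read_csv(csv_path)
    df.columns = df.columns.str.strip()  # OpenFace pads header names with spaces

    # Drop frames where the face tracker failed or was unreliable
    # (the 0.8 cutoff is an assumed value, not from the original pipeline).
    df = df[(df["success"] == 1) & (df["confidence"] >= 0.8)]

    keep = (["timestamp", "gaze_angle_x", "gaze_angle_y",
             "pose_Rx", "pose_Ry", "pose_Rz"]
            + [c for c in df.columns if c.startswith("AU") and c.endswith("_r")])
    return df[keep].reset_index(drop=True)
```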
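Segment-level annotations can then be joined to the per-frame features by windowing on the timestamp. The schema below (segment start/end in seconds, engagement on a 0-3 scale, 5-second windows) is a hypothetical illustration of the alignment step, not the dataset's actual label format:

```python
import pandas as pd

# Hypothetical annotation format: one row per labelled segment.
annotations = pd.DataFrame({
    "start": [0.0, 30.0, 60.0],
    "end":   [30.0, 60.0, 90.0],
    "engagement": [2, 3, 1],
})

def label_windows(features: pd.DataFrame, annotations: pd.DataFrame,
                  window: float = 5.0) -> pd.DataFrame:
    """Aggregate per-frame OpenFace features into fixed-length windows and
    attach the engagement label whose segment contains the window midpoint,
    yielding one training example per window."""
    features = features.copy()
    features["window"] = (features["timestamp"] // window).astype(int)
    # Mean AU intensity, gaze angle, and head rotation per window.
    agg = features.groupby("window").mean().drop(columns="timestamp")
    agg["midpoint"] = agg.index * window + window / 2

    def lookup(t: float):
        hit = annotations[(annotations["start"] <= t) & (t < annotations["end"])]
        return hit["engagement"].iloc[0] if len(hit) else None

    agg["engagement"] = agg["midpoint"].map(lookup)
    return agg.dropna(subset=["engagement"]).drop(columns="midpoint")
```

Labelling each window by its midpoint is one simple policy; windows straddling a segment boundary could instead be split or discarded, depending on how noisy the annotations are.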