Facial Emotion Recognition System
Developed a real-time facial emotion recognition system that detects and classifies human emotions from webcam video. The system uses a fine-tuned MobileNetV2 for RGB input and a custom CNN for 48x48 grayscale input, both trained on annotated facial expression datasets. Built the entire data labeling pipeline, including facial ROI extraction, emotion class labeling, and dataset normalization, using Label Studio and OpenCV. Designed a modular, multi-file architecture for real-time inference, with separate components for model loading, preprocessing, webcam streaming, and emoji display. The system targets accurate classification of key emotions such as happy, sad, angry, and surprised, and demonstrates end-to-end skill in annotating, preprocessing, and applying emotion data for deep learning tasks.