Turkish Sign Language Gesture Recognition: Data Labeling and YOLO Training (İşaretSensin)
İşaretSensin is an accessibility-focused computer vision project that uses a YOLOv8-based model to detect and recognize Turkish Sign Language gestures in real time. I built a custom dataset by capturing original images and annotating them in Roboflow, with an emphasis on consistent, high-quality labels. To improve model performance and generalization, I merged this custom dataset with an existing public dataset and trained a YOLO object detection model on the combined data. The end-to-end process covered data collection, annotation, dataset optimization, and model training, providing hands-on experience in building and scaling AI training-data pipelines.