Senior Data Quality Annotator
As a Senior Data Quality Annotator at Cohere, I worked on a large-scale AI model evaluation and data labeling project aimed at improving natural language processing (NLP) model performance and safety. The project involved curating and optimizing high-quality datasets for machine learning (ML) training and inference. My primary responsibilities included:

- Data labeling and annotation: Produced precise annotations for NLP tasks such as intent recognition, sentiment analysis, and response ranking so that models could interpret human language effectively.
- Automation pipelines: Developed Python- and JavaScript-based automation scripts to streamline data validation, anomaly detection, and quality control, significantly reducing manual effort.
- Quality assurance: Applied data validation techniques to identify inconsistencies, ensuring high accuracy and integrity of AI-generated responses.
- Model safety and evaluation: Assessed AI model responses based on safety and quality criteria.
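The validation scripts themselves are internal, but as an illustrative sketch, a minimal Python quality-control pass over a batch of annotation records might look like the following (the record fields `text` and `label`, and the sentiment label set, are assumptions for the example):

```python
from collections import Counter

# Assumed label set for a sentiment-analysis task.
VALID_LABELS = {"positive", "negative", "neutral"}

def validate_annotations(records):
    """Flag common quality issues in a batch of annotation records.

    Each record is assumed to be a dict with 'text' and 'label' keys.
    Returns a list of (index, issue) tuples for human review.
    """
    issues = []
    seen_texts = set()
    for i, rec in enumerate(records):
        text = rec.get("text", "").strip()
        label = rec.get("label")
        if not text:
            issues.append((i, "empty text"))
        if label not in VALID_LABELS:
            issues.append((i, f"unknown label: {label!r}"))
        if text and text in seen_texts:
            issues.append((i, "duplicate text"))
        seen_texts.add(text)
    return issues

def label_distribution(records):
    """Simple anomaly check: per-label counts to spot class imbalance."""
    return Counter(r.get("label") for r in records)
```

Checks like these can run automatically both before a batch is assigned to annotators and again before the labeled data is ingested for training, so reviewers only inspect the flagged records.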