Independent Freelance Software Testing & AI Support (Data Annotation/AI Training)
Performed structured content evaluation and remote data annotation for global projects, adhering strictly to project guidelines and standard operating procedures. Maintained dataset consistency by documenting ambiguous cases and providing clear, reproducible reports to project leads. Conducted quality assurance reviews on labeled batches and offered actionable feedback to annotators.
• Executed data labeling with high accuracy in asynchronous remote workflows.
• Managed labeling batches, tracking progress via shared documentation tools.
• Ensured compliance with safety and bias-awareness standards in annotation.
• Delivered reliable results through clear communication in reporting and feedback cycles.