AI Data Annotator – Web Page Quality Assessment
As an active contributor to the UHRS platform via Clickworker, I participated in AI data labeling tasks focused on evaluating web page and content quality. My responsibilities included judging the relevance, usefulness, and accuracy of search results and online content for AI model training. I completed the required training qualifications and consistently applied objective evaluation criteria while annotating web-based information.
• Assessed web search results for relevance and accuracy
• Applied established checklists and annotation standards
• Followed global team guidelines for AI data work
• Maintained a reliable task completion record