Data Annotation Specialist – Egocentric Data
Annotated thousands of hours of egocentric video for AI perception models used in robotics and AR/VR research. Primary tasks included labeling objects, actions, hand-object interactions, and gaze data, ensuring high accuracy and consistency while collaborating with AI teams to refine annotation guidelines.

• Maintained annotation accuracy above 98% across diverse, large-scale datasets.
• Labeled data using CVAT, Label Studio, and custom annotation platforms.
• Contributed improvements to annotation guidelines and dataset quality.
• Supported multiple AI research teams in meeting their data curation objectives.