
Romanowski Dana

Data Annotator II — Aperture AI

Expert · CVAT · Labelbox · LabelImg

Key Skills

Software

CVAT
Labelbox
LabelImg

Top Subject Matter

Autonomous Driving
General Object Detection
E-commerce NER

Top Data Types

Image
Text
Audio
Document

Top Task Types

Bounding Box
Entity (NER) Classification
Transcription
Data Collection

Freelancer Overview

Data Annotator II — Aperture AI. Brings 10+ years of professional experience across legal operations, contract review, compliance, and structured analysis. Core strengths include CVAT, Labelbox, and LabelImg. Education includes Bachelor of Science, University of Washington (2018). AI-training focus includes data types such as Image, Text, and Audio and labeling workflows including Bounding Box, Entity (NER) Classification, and Transcription.

Labeling Experience

CVAT

Data Annotator II — Aperture AI

CVAT · Image · Bounding Box
I led annotation and quality assurance efforts for over 25 computer vision and multimodal projects, producing and validating more than 240,000 annotations, including bounding boxes, semantic segmentation, polygon masks, and key points. I designed and enforced annotation guidelines and edge-case rules, leading to measurable improvements in label quality and inter-annotator agreement. Working closely with ML engineers, I implemented relabeling strategies and built semi-automated pipelines to accelerate throughput and reproducibility.

• Produced, validated, and QA'd 240,000+ image and video annotations using a combination of bounding box, segmentation, and key-point labels.
• Developed and standardized annotation schemas and guidelines, reducing labeling disputes and increasing annotation agreement metrics.
• Built semi-automated preprocessing and labeling pipelines using CVAT, Python, and AWS S3.
• Improved autonomous driving dataset quality and model performance through targeted relabeling and comprehensive QA audits.

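Inter-annotator agreement on bounding boxes is commonly scored with intersection-over-union (IoU). The sketch below is purely illustrative of that kind of QA check; the function name and (x_min, y_min, x_max, y_max) box convention are assumptions, not the project's actual code:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x_min, y_min, x_max, y_max)."""
    # Overlap rectangle, clamped to zero when the boxes do not intersect
    inter_w = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    inter_h = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    inter = inter_w * inter_h
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

In a QA pass, box pairs from two annotators whose IoU falls below an agreed threshold (often 0.5) would be flagged for adjudication.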
2021 - Present
Labelbox

Data Annotator — PixelSense Labs

Labelbox · Audio · Transcription
I executed audio transcription QA projects, validating and correcting hundreds of hours of conversational audio for voice assistant and ASR model training. My responsibilities included verifying transcriptions, annotating speaker turns, and designing specialized tags for noise segments. My efforts directly contributed to reduced transcription error rates and improved model conditioning in challenging acoustic environments.

• Led QA for transcription and diarization across 200+ hours of conversational audio data.
• Developed annotation specifications for speaker turns and noise labeling.
• Provided feedback and flagged audio segments for targeted model retraining.
• Built reporting dashboards in Excel and SQL to track audio QA progress and trends.

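Transcription error rates of the kind referenced above are typically tracked as word error rate (WER): the word-level edit distance between reference and hypothesis, divided by the reference length. A minimal sketch, assuming simple whitespace tokenization (illustrative only, not the project's tooling):

```python
def wer(reference, hypothesis):
    """Word error rate: word-level Levenshtein distance / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # One row of the dynamic-programming edit-distance table at a time
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, 1):
            cost = 0 if r == h else 1  # substitution cost
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # match / substitution
        prev = curr
    return prev[-1] / max(len(ref), 1)
```

A QA workflow would compute this per audio segment and flag segments whose WER exceeds a project threshold for re-transcription.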
2018 - 2021
Labelbox

Data Annotator — PixelSense Labs

Labelbox · Text · Entity (NER) Classification
I managed high-volume text annotation projects, including developing comprehensive annotation specifications for named entity recognition and attribute extraction within e-commerce data. I trained annotators, conducted adjudication, and analyzed error patterns to improve model quality. My focus on documentation and QA led to significant increases in inter-annotator agreement and F1 scores for downstream NLP models.

• Defined NER schema and guidelines and created thousands of sentence-level text labels.
• Led batch QA, monitored label distributions, and iteratively updated guidelines to address error patterns.
• Achieved substantial improvements in F1 metrics through systematic guideline refinement and adjudication.
• Standardized and converted multiple text and annotation formats for downstream ML pipelines.

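Entity-level F1 of the kind mentioned above is usually computed over exact span matches: an entity counts as correct only if its start, end, and label all agree with gold. A minimal sketch, with hypothetical (start, end, label) tuples standing in for a real schema:

```python
def entity_f1(gold, pred):
    """Strict entity-level F1 over (start, end, label) tuples; exact matches only."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)  # exact-match true positives
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

For example, if gold has a BRAND and a COLOR entity and the prediction matches the span of both but mislabels COLOR, precision and recall are each 0.5 and F1 is 0.5.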
2018 - 2021

Annotation Specialist (Contract) — Freelance

Text · Data Collection
I automated repetitive cleaning and preparation of text corpora to facilitate efficient and accurate downstream annotation. My work ensured textual data was standardized and ready for team-based annotation projects. By employing regex and pandas, I delivered rapid project readiness for a variety of annotation tasks.

• Cleaned and preprocessed text datasets for annotation.
• Automated data wrangling tasks to save time.
• Collaborated across teams to ensure text readiness for NLP labeling.
• Delivered projects on 1–3-week timelines for startups and academic labs.

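Regex-driven cleanup of the sort described above often amounts to normalizing quotes, control characters, and whitespace before annotators ever see the text. A minimal sketch (the specific rules are illustrative, not the actual project's ruleset):

```python
import re

def clean_text(raw):
    """Normalize raw text for annotation: standardize quotes, strip control
    characters, and collapse runs of whitespace (a minimal, illustrative pass)."""
    text = raw.replace("\u201c", '"').replace("\u201d", '"').replace("\u2019", "'")
    text = re.sub(r"[\x00-\x08\x0b-\x1f\x7f]", " ", text)  # drop control characters
    text = re.sub(r"\s+", " ", text)                       # collapse whitespace runs
    return text.strip()
```

In a pandas workflow the same function would typically be applied column-wise, e.g. `df["text"].map(clean_text)`.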
2017 - 2018
LabelImg

Annotation Specialist (Contract) — Freelance

LabelImg · Image · Bounding Box
I provided contract annotation services for medical imaging and other startup projects, performing image labeling using bounding boxes and segmentation techniques. I developed lightweight QA checklists and automated repetitive preprocessing tasks to accelerate turnaround. My work ensured timely, accurate delivery of high-quality annotated data across diverse domains.

• Labeled 5,000+ medical images using bounding boxes and segmentation.
• Designed and executed rapid QA workflows for short project deadlines.
• Automated data cleaning for annotation readiness to speed up deliverables.
• Supported both medical and general startup imaging needs in a freelance capacity.

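Lightweight QA checklists for box labels typically begin with geometric sanity checks before any content review. A sketch of one such check (names and the (x_min, y_min, x_max, y_max) convention are assumptions for illustration):

```python
def validate_box(box, width, height):
    """Return a list of QA issues for a bounding box (x_min, y_min, x_max, y_max)
    against an image of the given size; an empty list means the box passes."""
    x_min, y_min, x_max, y_max = box
    issues = []
    if x_min >= x_max or y_min >= y_max:
        issues.append("degenerate box (non-positive width or height)")
    if x_min < 0 or y_min < 0 or x_max > width or y_max > height:
        issues.append("box extends outside image bounds")
    return issues
```

Running such checks over an export before delivery catches the most common labeling slips (zero-area boxes, off-canvas coordinates) in seconds.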
2017 - 2018

Education

University of Washington

Bachelor of Science, Computer Science

2014 - 2018

Work History

Aperture AI, Seattle, WA

Data Annotator II

2021 - Present

PixelSense Labs (Remote)

Data Annotator

2018 - 2021