AI/LLM Data Normalization Pipeline Developer
As a Senior Software Developer, I integrated AI/LLM-based normalization pipelines to standardize heterogeneous provider metadata. My work focused on improving semantic search accuracy and converting unstructured provider data into uniform, structured formats ready for downstream applications. I integrated AI services to normalize social media marketing provider records, which contributed to notable platform conversion improvements.

• Designed, implemented, and evaluated pipelines for AI-driven metadata normalization
• Improved search and recommendation accuracy by standardizing textual provider records
• Collaborated on data preparation and AI workflow tuning for production SaaS environments
• Leveraged Java, Spring, Elasticsearch, and Python for AI-related backend development
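The normalization flow described above can be sketched minimally in Python. This is an illustrative assumption, not the production implementation: the `ProviderRecord` fields, the sample input, and the `llm_extract` stub (a deterministic stand-in for the actual LLM call) are all hypothetical.

```python
from dataclasses import dataclass, asdict
import json
import re

@dataclass
class ProviderRecord:
    # Hypothetical uniform schema for a normalized provider record.
    name: str
    category: str
    channels: list

def llm_extract(text: str) -> str:
    # Stub standing in for the real LLM call: a deterministic,
    # rule-based extractor so this sketch runs without external services.
    # In production, this would prompt a model to emit the same JSON shape.
    name = text.split(",")[0].strip()
    channels = re.findall(r"(Instagram|TikTok|YouTube|Facebook)", text, re.I)
    return json.dumps({
        "name": name,
        "category": "social_media_marketing",
        "channels": sorted({c.lower() for c in channels}),
    })

def normalize(raw: str) -> ProviderRecord:
    # Parse the model's JSON output into a validated, structured record
    # ready for indexing (e.g. into Elasticsearch) downstream.
    data = json.loads(llm_extract(raw))
    return ProviderRecord(**data)

record = normalize("Acme Social, runs Instagram and TikTok campaigns")
print(asdict(record))
```

The key design point is that the LLM's free-text understanding is forced through a fixed JSON schema, so every heterogeneous input lands in the same typed record before it reaches search or recommendation indexes.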