Job Description:
- Build and enhance a Databricks-based analytics platform that delivers timely, reliable data for high-impact product analysis.
- Develop and optimize scalable data pipelines, including Kafka streaming, batch ETL, and API ingestion, to ensure high-quality data availability for customer and business needs.
- Automate workflows and consolidate siloed data to improve data accessibility and accelerate customer insights.
- Improve data infrastructure performance, reliability, and scalability to support mission-critical decisions.
- Collaborate with cross-functional teams in an agile environment to translate customer and product requirements into effective data engineering solutions.
- Provide architectural guidance and technical leadership to ensure solutions align with customer outcomes and long-term system health.

This is a hybrid position.
Experience and Qualifications:
- U.S. citizenship
- Bachelor’s degree, or four years of experience in lieu of a degree
- 12+ years of relevant experience delivering data engineering solutions that drive measurable customer or user value
- Expertise with Databricks, Kafka, ETL development, workflow automation, and large-scale data processing
- Strong understanding of data architecture, pipeline design, and infrastructure optimization for high performance and reliability
- Experience working in agile, customer-focused environments with cross-functional teams
- Excellent written and verbal communication skills, with the ability to explain technical decisions in terms of customer impact