Job Description:
We are seeking a highly skilled Senior Data Engineer with 8+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift).
This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.
The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.
Key Responsibilities:
Data Pipeline & Orchestration
- Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines
- Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
- Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads

dbt Core & Data Modeling
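As an illustration of the retry and SLA responsibilities described for this role, a minimal sketch of Airflow-style DAG default arguments (all values are hypothetical examples, not requirements from this posting):

```python
from datetime import timedelta

# Hypothetical default arguments for an Airflow DAG; in a real project this
# dict would be passed to airflow.DAG(default_args=...). Values are illustrative.
default_args = {
    "owner": "data-engineering",
    "retries": 3,                          # retry transient failures
    "retry_delay": timedelta(minutes=5),   # back off between attempts
    "retry_exponential_backoff": True,     # grow the delay on each retry
    "sla": timedelta(hours=2),             # flag task runs that exceed 2 hours
    "email_on_failure": True,              # alert on terminal failure
}

print(default_args["retries"])
```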
- Lead dbt Core implementation, including project structure, environments, and CI/CD integration
- Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
- Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
- Optimize dbt query performance for large-scale datasets and downstream reporting needs

Cloud, Kubernetes & OpenShift
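As a sketch of the incremental-model work this role involves, a minimal dbt Core incremental model (the model, source, and column names are hypothetical):

```sql
-- models/marts/fct_ledger_entries.sql (hypothetical model name)
{{ config(
    materialized='incremental',
    unique_key='entry_id'
) }}

select
    entry_id,
    account_id,
    amount,
    posted_at
from {{ ref('stg_ledger_entries') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than the current target table
  where posted_at > (select max(posted_at) from {{ this }})
{% endif %}
```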
- Deploy and manage data workloads on Kubernetes / OpenShift platforms
- Design strategies for workload distribution, horizontal scaling, and resource optimization
- Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
- Troubleshoot container-level performance issues and resource contention

Performance & Reliability
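The CPU/memory requests-and-limits responsibility can be sketched as a Kubernetes pod spec fragment (names, image, and resource values are hypothetical):

```yaml
# Hypothetical pod spec fragment for a containerized data workload
apiVersion: v1
kind: Pod
metadata:
  name: dbt-run
spec:
  containers:
    - name: dbt
      image: example.registry/dbt-runner:latest   # placeholder image
      resources:
        requests:
          cpu: "500m"      # scheduling baseline guaranteed to the container
          memory: "1Gi"
        limits:
          cpu: "2"         # throttle CPU usage above 2 cores
          memory: "4Gi"    # container is OOM-killed above 4Gi
```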
- Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
- Identify bottlenecks in query execution, orchestration, and infrastructure
- Implement observability solutions (logs, metrics, alerts) for proactive issue detection
- Ensure high availability, fault tolerance, and resiliency of data pipelines

Collaboration & Governance
- Work closely with data architects, platform engineers, and business stakeholders
- Support financial reporting, accounting, and regulatory data use cases
- Enforce data engineering standards, security best practices, and governance policies

Required Skills & Qualifications:
Experience
- 10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
- Proven experience designing and supporting enterprise-scale data platforms in production environments

Must-Have Technical Skills
- Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
- Expert-level dbt Core (data modeling, testing, macros, implementation)
- Strong proficiency in Python for data engineering and automation
- Deep understanding of Kubernetes and/or OpenShift in production environments
- Extensive experience with distributed workload management and performance optimization
- Strong SQL skills for complex transformations and analytics

Cloud & Platform Experience
- Experience running data platforms on cloud environments
- Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows

Preferred Qualifications
- Experience supporting financial services or accounting platforms
- Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
- Experience with data warehouses (Oracle)
This is a remote position.