Please note: NO C2C, NO C2H, and NO 1099. This is a full-time opportunity on Infinite payroll (W2 only).

Overview:
The Data & ETL Architect is responsible for defining the end-to-end data architecture, establishing data governance standards, and designing scalable ETL/ELT pipelines across enterprise systems. This role ensures seamless data integration, high performance, and efficient data modeling to support analytics, reporting, and operational needs.

Key Responsibilities:
- Define the enterprise-wide data architecture strategy, including data modeling, storage, ETL/ELT frameworks, and data integration patterns.
- Architect and optimize scalable data pipelines using modern ETL/ELT tools and cloud-native services.
- Establish strong data governance practices covering data quality, security, lineage, metadata, and compliance.
- Collaborate with business stakeholders, solution architects, and development teams to convert business requirements into technical designs.
- Design logical and physical data models to support analytics, BI, ML, and transactional systems.
- Lead the modernization of legacy ETL workloads into cloud-based architectures (GCP, AWS, Azure, etc.).
- Ensure data scalability, performance tuning, and optimization of ETL pipelines and data stores.
- Define integration strategies across systems (batch, streaming, API-based ingestion, event-driven architectures).
- Evaluate and recommend modern data technologies, ETL tools, and best practices.
- Review code, perform architecture governance, and mentor data engineering teams.
- Develop standards, patterns, and reusable components for enterprise data platforms.
- Ensure end-to-end data security, including encryption, access controls, and compliance with organizational policies.

Required Skills & Experience:
- 8+ years of experience in data engineering, ETL development, or data architecture.
- Strong proficiency with ETL/ELT tools (Informatica, Talend, DataStage, Matillion, dbt, Glue, Dataflow, etc.).
- Hands-on experience with cloud data platforms (GCP BigQuery; AWS Redshift/S3/Glue; Azure Synapse/Data Factory).
- Expertise in SQL, data modeling (Kimball, Inmon, Data Vault), and schema design for analytical and transactional systems.
- Experience designing scalable batch and streaming pipelines (Kafka, Pub/Sub, Kinesis, Spark Streaming).
- Solid understanding of data governance, metadata management, lineage, master data concepts, and security.
- Strong knowledge of Python or Java for ETL automation and data processing.
- Familiarity with containerization and orchestration (Docker, Kubernetes, Airflow, Cloud Composer).
- Ability to translate business requirements into data architecture designs.
- Excellent communication and documentation skills.

Preferred Qualifications:
- Cloud certifications (GCP Professional Data Engineer, AWS Data Analytics, Azure Data Engineer).
- Experience with ML feature store architectures, data lakehouse models, or real-time data processing.
- Experience working in telecom, BFSI, retail, or large-scale enterprise environments.