About the position
Responsibilities
• Design, build, and manage ETL data pipelines, using Airflow for orchestration and Python for scripting.
• Implement and maintain data synchronization between on-premises Oracle databases and Snowflake using CDC tools.
• Assist in developing and optimizing data models to support real-time decision-making and insights.
• Optimize data flows and analytics to ensure efficient and effective data processing.
• Work closely with cross-functional teams to ensure seamless integration and operation of data systems.
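To illustrate the kind of extract-transform-load flow these responsibilities describe, here is a minimal sketch in plain Python. It uses sqlite3 as a lightweight stand-in for the Oracle source and Snowflake target, and the table names (`orders`, `orders_clean`) are invented for the example, not part of the role:

```python
import sqlite3

def extract(conn):
    """Pull raw rows from the source (sqlite3 stands in for Oracle here)."""
    return conn.execute("SELECT id, amount FROM orders").fetchall()

def transform(rows):
    """Keep only positive amounts and convert cents to dollars."""
    return [(oid, amt / 100.0) for oid, amt in rows if amt > 0]

def load(conn, rows):
    """Write transformed rows into the target (stand-in for Snowflake)."""
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)

# Demo with in-memory databases
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount INTEGER)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 250), (2, -50), (3, 1000)])

dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")

load(dst, transform(extract(src)))
result = dst.execute("SELECT id, amount FROM orders_clean ORDER BY id").fetchall()
print(result)  # [(1, 2.5), (3, 10.0)]
```

In production, each of these three steps would typically be a separate Airflow task so that failures can be retried independently.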
Requirements
• 10+ years of total experience as a data engineer, including 5+ years working with Snowflake.
• Strong SQL skills, including writing complex queries and performance tuning, plus hands-on experience with Oracle, Snowflake, and ETL/ELT tools.
• Experience with Airflow for managing data pipeline workflows.
• Proficiency in Python and SQL for data processing tasks.
• Knowledge of cloud infrastructure and services, preferably Azure, as it relates to Snowflake integration.
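As one example of the "complex queries" the requirements mention, a window-function query like the following ranks each customer's orders by amount. It runs here on sqlite3 purely for illustration; the `orders` table and its columns are invented, and the same SQL pattern works in Oracle and Snowflake:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 30.0), ("b", 20.0)])

# Rank each customer's orders by amount, largest first
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()
print(rows)  # [('a', 30.0, 1), ('a', 10.0, 2), ('b', 20.0, 1)]
```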
Apply Now