For one of our clients, we're looking for an experienced Data Engineer to join a dynamic team working on large-scale data systems (50M+ transactions/day).
What you'll do
- Build, maintain, and optimize scalable data pipelines for large transaction systems
- Develop ETL processes in Python and with modern ETL tools
- Design and manage workflows with Apache Airflow
- Work with the Hadoop ecosystem (HDFS, Hive, Spark) to ensure performance and reliability
- Collaborate with data scientists and stakeholders to ensure data quality and availability
- Monitor, troubleshoot, and document data pipelines and architecture
What we're looking for
- 8+ years of experience in Data Engineering
- Strong Python skills for data processing
- Hands-on experience with ETL tools (Talend, Informatica, NiFi, etc.)
- Solid knowledge of Apache Airflow
- Experience with big data platforms (Hadoop, Spark, Hive)
- Proven experience handling high-volume transaction systems
- Cloud experience (AWS / Azure / GCP) is a plus
- Strong problem-solving and communication skills