
Ahamove

Lead Data Engineer

  • Posted 12 hours ago

Job Description

You will lead Ahamove's Data Engineering team, which builds and scales our terabyte-scale data warehouse. Our systems handle nearly 200,000 daily orders across both real-time streaming and batch processing pipelines. Your mission is to ensure the reliability, scalability, and performance of our data infrastructure, which powers:

- Real-time dashboards for operational visibility

- Machine Learning services to power intelligent decision-making

- Robust query experiences for internal teams and external stakeholders

Data Infrastructure & Pipeline Development

- Build, maintain, and optimize in-house data infrastructure including databases, data warehouse, orchestration systems, and real-time/batch data pipelines.

- Develop data ingestion pipelines using CDC, streaming, and ETL/ELT frameworks.

- Ensure high data availability, integrity, and consistency across multi-environment systems.
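The CDC ingestion work described above can be illustrated with a toy change-data-capture applier. The event shape, operation codes, and table names here are hypothetical illustrations (loosely modeled on common CDC conventions), not Ahamove's actual schema:

```python
from typing import Any

def apply_cdc_event(store: dict[str, dict[str, Any]], event: dict[str, Any]) -> None:
    """Apply one CDC change event to an in-memory key/value 'table'.

    Hypothetical event shape: {"op": "c"|"u"|"d", "key": str, "row": dict}.
    """
    op, key = event["op"], event["key"]
    if op in ("c", "u"):          # create or update: upsert the row
        store[key] = event["row"]
    elif op == "d":               # delete: drop the row if present
        store.pop(key, None)
    else:
        raise ValueError(f"unknown op: {op}")

# Replaying a change stream in order keeps the target consistent with the source.
orders: dict[str, dict[str, Any]] = {}
stream = [
    {"op": "c", "key": "o1", "row": {"status": "created"}},
    {"op": "u", "key": "o1", "row": {"status": "delivered"}},
    {"op": "c", "key": "o2", "row": {"status": "created"}},
    {"op": "d", "key": "o2", "row": {}},
]
for ev in stream:
    apply_cdc_event(orders, ev)
# orders now holds only "o1", with status "delivered"
```

In production this logic would sit behind a log-based CDC tool and a durable sink rather than a Python dict, but the ordering and upsert/delete semantics are the same.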

Platform Leadership

- Own the technical architecture, tech stack, and cost optimization of Ahamove's data platform.

- Establish benchmarks, monitoring, alerting, logging, and auditing for system reliability and scalability.

- Evaluate and integrate emerging data technologies where appropriate.

Cross-functional Collaboration

- Work closely with Product Owners, Software Engineers, Business teams, Data Analysts, and Machine Learning Engineers to solve data-related challenges.

- Design APIs and services to expose data for internal & external use cases.
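A data-exposure API of the kind described above can be sketched with Python's standard library alone. The `/orders/<id>` route and the in-memory dataset are illustrative assumptions, not an actual Ahamove service:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical dataset; a real service would query the warehouse instead.
ORDERS = {"o1": {"status": "delivered"}, "o2": {"status": "created"}}

class OrderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route: GET /orders/<id> -> JSON row, or 404 if unknown
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "orders" and parts[1] in ORDERS:
            body = json.dumps(ORDERS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), OrderHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/orders/o1"
with urllib.request.urlopen(url) as resp:
    row = json.loads(resp.read())
server.shutdown()
# row == {"status": "delivered"}
```

A production service would add authentication, pagination, and a proper framework, but the contract is the same: a stable HTTP interface in front of the data platform.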

Team Leadership

- Lead, mentor, and grow the Data Engineering team.

- Drive technical excellence, coding standards, and best practices.

REQUIREMENTS

Must-have

- Bachelor's degree in Computer Science, Software Engineering, Information Systems, or related fields.

- 5+ years of experience in Data Engineering and building scalable data platforms.

- Strong proficiency in Python.

- Excellent SQL skills across OLTP/OLAP systems.

- Hands-on experience with cloud platforms (AWS, GCP) and distributed systems.

- Deep understanding of OLTP & OLAP databases such as MongoDB, PostgreSQL, BigQuery, ClickHouse, MotherDuck, etc.

- Experience with streaming platforms: Kafka, Redpanda, RabbitMQ, or similar.

- Knowledge of orchestration and pipeline tooling: Airflow, dbt, Airbyte, etc.

- Strong understanding of version control (GitHub/GitLab).
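The core idea behind the orchestration tools listed above is dependency-ordered task execution, which can be shown with the standard library's `graphlib`. The pipeline and task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: task name -> set of upstream dependencies,
# the same shape an Airflow DAG expresses with operators and >> edges.
pipeline = {
    "extract_orders": set(),
    "extract_drivers": set(),
    "transform": {"extract_orders", "extract_drivers"},
    "load_warehouse": {"transform"},
}

run_order = list(TopologicalSorter(pipeline).static_order())
# Every task appears after all of its upstreams: "transform" runs after
# both extracts, and "load_warehouse" runs last.
```

Real orchestrators add scheduling, retries, and backfills on top, but they all reduce to executing a DAG in a valid topological order like this one.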

Nice-to-have

- Experience with big data ecosystems: Hadoop, Spark, Databricks.

- Experience with Kubernetes, Linux, Networking, or DevOps practices.

- Ability to build APIs using Python, Go, or Node.js.

- Familiarity with visualization tools (Metabase, Looker Studio, Power BI).

- Exposure to emerging open-source data technologies.

Job ID: 139402415
