ESSENTIAL DUTIES & RESPONSIBILITIES
1. Data Architecture & Pipeline Engineering
- Design, develop, and maintain ETL/ELT pipelines across diverse data sources (ERP, OMS, logistics, supplier, supply chain, and market data).
- Build and manage data warehouse layers (Bronze/Silver/Gold) to ensure clean, reusable, and analytics-ready data models.
- Define data modeling standards (fact/dim models, schema design, partitioning) aligned with global analytics needs.
- Build and manage data models that enable scalable analytics and automation.
- Partner with IT and digital teams to ensure smooth data flow and integration across systems (ERP, Azure, Snowflake, Databricks, Power BI).
2. Data Quality, Governance & Reliability
- Establish and monitor data quality rules, validation checks, and error-handling frameworks within data pipelines.
- Manage data warehouse operations, ensuring availability, scalability, and cost optimization.
- Maintain comprehensive documentation of datasets, lineage, and transformation logic.
- Support the implementation of data governance frameworks and metadata cataloging across systems.
3. Collaboration & Business Partnership
- Work closely with analytics, data science, and functional teams (Supply Chain, Retail, Transformation, etc.) to deliver well-structured, business-relevant datasets.
- Translate business and analytical requirements into technical specifications.
- Communicate complex technical concepts in a clear, actionable way to non-technical partners.
4. Innovation, Automation & Continuous Improvement
- Drive automation and modernization initiatives using cloud and open-source technologies (e.g., dbt, Airflow, Delta Lake) and CI/CD practices.
- Automate repetitive data workflows and continuously optimize pipeline performance.
- Drive innovation by evaluating new technologies and best practices in data warehousing and data lakehouse architecture.
- Mentor junior engineers and establish coding, testing, and documentation best practices.
MINIMUM QUALIFICATIONS
1. Education
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Advanced degree or certification (Azure, Microsoft Fabric, or AWS data analytics) is a plus.
2. Experience
- Minimum of 3 years of experience in data engineering, preferably in a supply chain, furniture, or manufacturing environment.
- Proven track record of leading data engineering or data warehouse management initiatives that deliver measurable business value.
- Proven hands-on experience designing, implementing, and maintaining data warehouse architectures (Bronze–Silver–Gold layers) and data lakehouse platforms.
- Experience in data modeling, data transformation, and performance optimization at global enterprise scale.
- Understanding of supply chain processes (demand/supply planning, logistics, inventory, and S&OP cycles) is a plus.
3. Knowledge, Skills and Abilities
- Strong business acumen and ability to connect data to operational and strategic decisions.
- Proficiency in SQL, Python, and ETL/orchestration frameworks (Airflow, dbt).
- Proficiency in cloud data platforms (Azure, Fabric, Snowflake, Databricks).
- Experience with APIs, data lakes, and CI/CD.
- Strong understanding of data warehouse architecture and multi-layered design (Bronze–Silver–Gold) for ingestion, curation, and presentation.
- Strong stakeholder management skills with the ability to influence across functions and geographies.
- Agile mindset, curiosity, and continuous improvement orientation.