About AGEST Vietnam
AGEST Vietnam (AGV), a world-class software testing, test automation, and software development provider, is an affiliate of AGEST Inc., Japan, and has been operating in Vietnam for more than 20 years. Our mission is to help build a safe, secure, and advanced digital society by providing next-generation Software Development, Big Data & AI, and Quality Assurance solutions.
============================
We're looking for a Senior Data Engineer to architect and maintain cutting-edge data systems that power analytics, AI, and operational decision-making. In this role, you'll take ownership of end-to-end data lifecycles, designing pipelines, models, and architectures that support real-time insights and machine learning at scale.
Responsibilities
- Design Scalable Data Architecture: Build modern, cloud-native data platforms (AWS, Snowflake, Databricks) supporting batch and streaming use cases.
- Develop Efficient Data Pipelines & Models: Automate ETL/ELT workflows, optimise data models, and enable self-serve analytics and AI.
- End-to-End Data Ownership: Manage ingestion, storage, processing, and delivery of structured and unstructured data.
- Performance & Cost Optimisation: Continuously tune infrastructure for high concurrency, low latency, and cost efficiency.
- Real-Time Integration & Analytics: Ingest telemetry, API, and application data in real time to power dashboards and AI-driven tools.
- ML & AI Enablement: Provision datasets for ML/AI workloads, integrating with SageMaker, Snowflake ML, and MLOps best practices.
- Data Governance & Security: Ensure robust data governance, compliance (GDPR, SOC 2), and enterprise-grade security.
- Collaboration & Strategy: Work closely with Product, Engineering, DevOps, and Analytics teams to align data solutions with business goals.
Requirements
- Experience: 5+ years in data engineering on real-time, scalable cloud platforms (AWS & Snowflake preferred), within a broader background in technology roles. Experience in SaaS/product companies managing large-scale IoT, telemetry, or digital datasets is highly desirable.
- Technical Expertise:
  - AWS (S3, Glue, Lambda, Athena, Kinesis), Snowflake (data pipelines, schema design, query optimisation)
  - Data modelling, ETL/ELT, real-time streaming (Kafka, Kinesis)
  - Big data processing (Spark, Airflow), SQL, Python, Java/Scala
  - BI & analytics platforms (Tableau, Looker)
  - ML/AI integration (SageMaker, TensorFlow, Snowflake ML, feature stores)
  - Data governance, security, and compliance frameworks
- Attributes: Strong communicator, collaborative, analytical, and strategic. Ability to balance multiple projects while driving innovation and operational excellence.
Qualifications
- Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or related field (Master's preferred)
- Relevant certifications (AWS Certified Data Analytics, Solutions Architect, SnowPro Core)