FINN is a fast-growing, globally remote fintech startup headquartered in Singapore, building solutions that transform financial services. Our founding team brings experience from multiple successful ventures.
We're on a mission to improve financial well-being for underbanked employees. Our product provides early wage access, financial education, budgeting tools, part-time job opportunities, and micro-insurance. We've reached product-market fit and seen explosive growth over the past 12 months.
- Flexible working hours (core overlap during daytime in your time zone; UTC+0 to UTC+8 preferred)
- Collaborative, developer-focused, high-ownership culture where technical excellence matters
About the Role
We're seeking our first Data Platform Engineer to establish and evolve our data infrastructure foundation. This is a unique opportunity to build our data platform from the ground up, starting with essential infrastructure improvements and progressing toward a modern data mesh architecture. You'll be instrumental in transforming how our 50-person fintech processes and democratizes data across the organization.
What You'll Do
Immediate Priorities (First 3-6 months)
- Implement and configure BigQuery policy tagging for data governance and access control - this means creating Terraform/code templates that let Data Engineers create tables with policy tags already configured
- Set up dbt infrastructure and establish transformation pipelines and best practices - enabling Data Engineers to create and manage their own transformations
- Design and implement core data pipeline infrastructure for reliable data ingestion and processing
- Establish monitoring, alerting, and observability for data platform components
- Create documentation and runbooks for platform operations
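To give a flavor of the policy-tagging work above, here is a minimal Terraform sketch of the kind of template involved. All project, region, dataset, and tag names are placeholders for illustration, not our actual configuration:

```hcl
# Hypothetical example: a Data Catalog taxonomy, one policy tag,
# and a BigQuery table whose PII column is bound to that tag.

resource "google_data_catalog_taxonomy" "pii" {
  project                = "finn-data-platform" # placeholder project ID
  region                 = "asia-southeast1"    # placeholder region
  display_name           = "PII"
  activated_policy_types = ["FINE_GRAINED_ACCESS_CONTROL"]
}

resource "google_data_catalog_policy_tag" "high_sensitivity" {
  taxonomy     = google_data_catalog_taxonomy.pii.id
  display_name = "high-sensitivity"
}

resource "google_bigquery_table" "payroll_events" {
  dataset_id = "analytics"      # placeholder dataset
  table_id   = "payroll_events" # placeholder table

  schema = jsonencode([
    {
      name = "employee_id"
      type = "STRING"
      # Column-level access control: only principals granted the
      # Fine-Grained Reader role on this tag can read this column.
      policyTags = {
        names = [google_data_catalog_policy_tag.high_sensitivity.id]
      }
    },
    {
      name = "event_ts"
      type = "TIMESTAMP"
    }
  ])
}
```

Wrapped in a reusable module, a template like this would let Data Engineers create governed tables without hand-configuring tags.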
Platform Building (6-12 months)
- Design and implement self-service data infrastructure enabling teams to own their data products
- Build tooling and automation to reduce friction in data pipeline development
- Establish data quality frameworks and automated testing infrastructure
- Create platform APIs and interfaces for seamless integration with AWS backend services
- Implement cost optimization strategies for BigQuery and GCP resources
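As one example of the BigQuery cost levers mentioned above, a hedged Terraform sketch (table, dataset, and field names are hypothetical) combining partitioning, clustering, and partition expiry:

```hcl
# Hypothetical example: a partitioned, clustered events table with a
# retention window, so queries scan less data and storage is capped.

resource "google_bigquery_table" "events" {
  dataset_id = "analytics" # placeholder dataset
  table_id   = "events"    # placeholder table

  time_partitioning {
    type          = "DAY"
    field         = "event_ts"
    expiration_ms = 90 * 24 * 60 * 60 * 1000 # drop partitions after 90 days
  }

  # Force queries to filter on event_ts so they prune partitions
  # instead of scanning (and billing for) the full table.
  require_partition_filter = true

  clustering = ["user_id", "event_type"]

  schema = jsonencode([
    { name = "event_ts",   type = "TIMESTAMP" },
    { name = "user_id",    type = "STRING" },
    { name = "event_type", type = "STRING" },
  ])
}
```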
Data Mesh Evolution (12+ months)
- Lead the transition to data mesh principles, starting with pilot domains
- Design federated computational governance model
- Build self-serve data platform capabilities
- Establish data product thinking and standards across engineering teams
Technical Requirements
Must Have
- 3+ years of experience building data infrastructure in cloud environments
- Strong expertise with GCP BigQuery (partitioning, clustering, policy tags, IAM)
- Hands-on experience with Infrastructure as Code (Terraform preferred)
- Proficiency in Python and SQL for data engineering
- Experience with orchestration tools (Airflow, Prefect, or similar)
- Understanding of data governance, security, and compliance requirements
Strongly Preferred
- Experience with dbt for data transformation and modeling
- Knowledge of streaming data pipelines (Pub/Sub, Dataflow)
- Familiarity with data mesh concepts and domain-driven design
- Experience in fintech or regulated industries
- Background working in startups or scaling environments
What Makes You Successful
- Builder Mindset: You're energized by creating foundations others will build upon
- Pragmatic Approach: You can balance perfect architecture with shipping working solutions
- Strong Communication: You can explain complex technical concepts to diverse stakeholders
- Self-Directed: You thrive in ambiguity and can define your own roadmap
- Collaborative Spirit: You enjoy enabling other teams and treating them as customers
Our Stack
- Data Infrastructure: GCP BigQuery, Cloud Storage
- Backend: AWS services
- Orchestration: [To be determined by you]
- Transformation: dbt (to be implemented)
- Languages: Python, SQL
- IaC: Terraform
- Methodology: Agile/Scrum