We are seeking a Senior Data Engineer to drive data infrastructure and analytics for our supply chain and logistics operations. You will design and implement scalable data pipelines, optimize data processing, and ensure seamless data integration across systems. You will also lead the development of core financial data pipelines and models. This role is focused on building trustworthy, granular datasets that support billing systems, financial reconciliations, and business-critical metrics like revenue and gross margin. You will serve as a key partner to the Finance team by ensuring data completeness, auditability, and alignment with financial reporting standards. This role will report to our engineering organization but will have a dotted-line reporting relationship to our finance organization.
Key Responsibilities
1. Design, develop, and maintain data pipelines that integrate data from internal billing systems and third-party vendors (e.g., payment processors), transforming raw events into structured, finance-grade datasets using BigQuery, Dataflow, and other tools in the ***** APAC data stack as deemed necessary.
2. Build and manage MLOps workflows to support machine learning models.
3. Design and evolve schemas that power internal tools and external integrations, ensuring compatibility with accounting and compliance requirements.
4. Architect and implement serverless data solutions using JavaScript and Python.
5. Implement validation and reconciliation frameworks to guarantee consistency between internal systems and external platforms (e.g., invoice data vs. processed payments).
6. Partner closely with Finance and Accounting teams to produce reliable data inputs for month-end close, revenue analytics, and key financial metrics like Gross Margin and Net Revenue Retention (NRR).
7. Stay up to date with emerging data engineering and cloud technologies.
Qualifications
1. 5+ years of experience in data engineering, preferably in supply chain or logistics.
2. Fluency in both English and Chinese (spoken and written) is required.
3. Expertise in Google Cloud Platform (GCP), especially BigQuery and Dataflow, plus other tools in the ***** APAC data stack.
4. Proficiency in Python and JavaScript for serverless data processing.
5. Experience with MLOps and deploying machine learning models.
6. Strong knowledge of ETL/ELT processes, data modeling, and orchestration.
7. A deep appreciation for data correctness, integrity, and traceability, especially in regulated environments.
8. An understanding of common patterns in financial reconciliation, anomaly detection, and structured data modeling.
9. Enthusiasm for partnering with Finance and Engineering stakeholders to translate operational requirements into clean, auditable datasets.
10. Excellent problem-solving skills and the ability to work in a fast-paced environment.
11. Nice to have: experience with real-time data processing and streaming technologies.
12. Nice to have: familiarity with CI/CD for data pipelines and infrastructure as code (IaC).