Company Overview:
Join a boutique Data, Analytics, and AI consulting firm with offices in Sydney and Hanoi. We specialise in delivering tailored, high-quality data solutions across sectors such as manufacturing, software, wholesale, and retail. Leveraging Azure's suite of services, we help clients harness the power of data to drive strategic decision-making and business transformation.

Role Overview:
We're looking for a Data Engineer to support our client's Microsoft Fabric platform. This is a hands-on role focused on building reliable, production-grade data pipelines and curated datasets. While there may be some ad hoc work, the core of the role is structured, scalable data engineering, not data science. You'll work with Dataflows, Pipelines, Notebooks (for ETL, not ML), and Lakehouses to deliver trusted data to analysts and business users.

We also welcome programmers with strong SQL and Python skills who are looking to transition into data-focused roles, especially given the growing demand for data infrastructure alongside AI. Experience with data integration, API ingestion, or working with Apache Spark or Pandas is highly valued.

Key Responsibilities:
• Build and maintain data pipelines using Microsoft Fabric (Pipelines, Dataflows, Lakehouse)
• Write efficient SQL for transformations, aggregations, and data quality checks
• Use Python in notebooks for ETL workflows using Pandas or Spark
• Design datasets across Bronze, Silver, and Gold layers following the medallion architecture (a brief illustrative sketch follows this list)
• Implement data lineage, monitoring, and validation
• Handle some ad hoc data requests while focusing on reusable, maintainable solutions
• Apply basic governance practices (e.g. row-level security (RLS), access controls, metadata tagging)
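
For candidates less familiar with the medallion pattern, the following is a minimal, illustrative sketch of the kind of Bronze-to-Silver notebook step this role involves. It assumes a PySpark environment such as a Fabric notebook; the table names (bronze_sales, silver_sales) and columns are hypothetical examples, not a client schema.

    # Minimal sketch of a Bronze -> Silver step, as it might look in a Fabric notebook.
    # Table names ("bronze_sales", "silver_sales") and columns are hypothetical examples.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # A SparkSession is normally provided in a Fabric notebook; getOrCreate() also works locally.
    spark = SparkSession.builder.getOrCreate()

    # Read raw data from the Bronze layer (ingested as-is from the source system).
    bronze = spark.read.table("bronze_sales")

    # Clean and conform: drop duplicates, enforce types, apply a basic quality filter.
    silver = (
        bronze
        .dropDuplicates(["order_id"])
        .withColumn("order_date", F.to_date("order_date"))
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("amount") > 0)
    )

    # Write the curated result to the Silver layer as a Delta table.
    silver.write.format("delta").mode("overwrite").saveAsTable("silver_sales")

In practice, the same pattern repeats for the Silver-to-Gold step, where cleaned data is aggregated into business-ready datasets for analysts and report builders.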