Job Description
Global Fashion Group is the leading fashion and lifestyle destination in growth markets across LATAM, SEA and ANZ. Our three e-commerce platforms: THE ICONIC, ZALORA and Dafiti connect an assortment of international, local and own brands to millions of consumers from diverse cultures and lifestyles. Powered by our best-in-class operational infrastructure, which is fashion-specific and tailored to local market needs, our platforms provide an inspiring and seamless end-to-end customer experience. We stand for benchmark-setting customer service, delivery options, returns policies, and curation of brands.
About the function
At GFG, Technology is driven by innovation and quality is highly valued. Our data team is the driving force behind our business strategies and decisions. We are integrated into all departments to ensure every employee at THE ICONIC has access to high-quality, timely data.
Our Data Engineering team solves complex problems and delivers data to propel our business forward, powering the insights that are used in every decision we make. We are the engineers, the builders, the maintenance people for all our business data.
Key tech you’ll play with in this role
- Both AWS and GCP
- BigQuery, SQL Server and Redshift
- Docker & Kubernetes
- Airflow, Cloud Composer and Pentaho
- Cloud Dataflow/Apache Beam
- High-velocity streaming data and behavioural data, as well as structured and unstructured data
What you will do
- Develop and support our enterprise data warehouses, analytical databases and infrastructure
- Work with the team to re-platform our existing data architecture to next generation tooling and data architecture
- Build and maintain our data pipelines in Python and SQL to ensure data is delivered in a timely manner
- Work closely with our Data Scientists and Data Analysts to implement new insights and statistical models
- Assist in developing tools and processes that enable our business to self-serve
Job Requirements
What we are looking for
- Extensive knowledge of data processing in Python:
  - DataFrames
  - Pandas
  - Dependency management within Python
  - Iterators, producers and consumers
  - Airflow DAG building
- CI/CD experience (Bamboo deployment/delivery)
- Strong SQL coding skills
- Advanced data engineering design skills
- Excellent communication skills
- Significant experience in a similar data engineering role
- Strong data warehousing/data engineering experience
- Experience with data modelling and complex ETL solutions
- Test and QA experience in data pipelines
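For candidates wondering what "iterators, producers and consumers" with Pandas DataFrames looks like in practice, here is a minimal, hypothetical sketch (the record schema and function names are illustrative, not part of the role):

```python
import pandas as pd


def order_producer(n):
    """Producer: lazily yield raw order records one at a time."""
    for i in range(n):
        yield {"order_id": i, "amount": 10.0 * (i + 1)}


def load_orders(records):
    """Consumer: drain the iterator into a DataFrame and derive a column."""
    df = pd.DataFrame(list(records))
    df["amount_with_tax"] = df["amount"] * 1.1
    return df


orders = load_orders(order_producer(3))
```

The producer/consumer split keeps extraction decoupled from transformation, the same shape a typical Airflow task pair would take.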
Ways to stand out from the crowd
- GCP experience
- BigQuery/Redshift
- Cube/SSAS experience
- DevOps/CI/CD
- Docker
- Airflow
- R
Benefits
- Annual leave: 15 days
- Sick leave/mental health days: 30 days
- Occasion leave: 1 day
- 13th-month salary
- Annual bonus: up to 3 months' base salary (depending on company performance)
- Home workspace support: up to 10 million/person
- Learning budget: up to EUR 500/person
- Free LinkedIn Learning and Udemy accounts
- Medical benefits: social insurance, medical insurance and AON insurance
- MacBook provided
- Support for English classes
- Gym membership support
- Additionally, we offer a flat hierarchy and a fun, flexible work environment