Job Description
About the Role
The ideal candidate is an experienced data pipeline builder and data wrangler who will act as a technical data engineering expert on an international, multi-location team. He/she will actively participate in optimizing existing data systems or building them from the ground up for our overseas customers.
Responsibilities:
The Data Engineer will be responsible for:
- Develop and maintain data pipelines using ETL processes.
- Work closely with the data science team to implement data analytics pipelines.
- Maintain security and data privacy, working closely with the data protection officer.
- Implement scalable architectural models for data processing and storage.
- Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode.
- Help in scoping, estimation, and planning for various projects in the enterprise.
- Provide technical support to project teams as needed.
Job Requirements
Must-have Technical Requirements / Qualifications
- B.S. in Computer Science or related fields, or commensurate work experience.
- 5+ years of experience in software development, including 3+ years of relevant data engineering experience with technologies such as Spark, PySpark, Hive, HDFS, Pig… and ETL with large volumes of data.
- Solid knowledge of and experience with data processing languages such as SQL, Python, and/or Scala.
- Hands-on experience with real-time data streaming platforms such as Kafka and Spark Streaming.
- Knowledge of and experience with both relational databases (e.g. Oracle) and NoSQL databases (e.g. MongoDB); strong SQL querying and performance-tuning skills are required.
- Experience with complex regulatory data integration projects.
- Knowledge of Agile-based delivery.
- Excellent English communication – verbal, written, and presentation skills.
- Strong teambuilding skills and teamwork orientation.
- Strong creative problem-solving skills.
Nice-to-have Technical Requirements / Qualifications
- DP-203 – Data Engineering on Microsoft Azure certificate.
- Knowledge of at least one cloud environment (Azure, GCP, AWS, IBM)
- Experience with data warehouse technologies such as Teradata SQL, Informatica, Unix, and Control-M.
- Experience with data visualisation tools (e.g. Tableau, Quantexa, and SAS).
- Experience creating Slowly Changing Dimension (SCD) tables in Hive using the Spark framework.
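For candidates unfamiliar with the last point, the core idea of an SCD Type 2 table is that when a tracked attribute changes, the current row is closed out and a new row with fresh effective dates is appended, so full history is preserved. Below is a minimal sketch of that pattern in plain Python; in a real pipeline this would typically be a Spark merge over Hive tables, and the record layout here is hypothetical.

```python
from datetime import date

# Sentinel "open" end date for the current version of a row.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension, updates, today):
    """Apply SCD Type 2 logic to a dimension table.

    dimension: list of dicts with keys id, attrs, valid_from, valid_to, is_current.
    updates:   dict mapping id -> new attrs.
    today:     effective date of the change.
    """
    out = []
    for row in dimension:
        key = row["id"]
        if row["is_current"] and key in updates and updates[key] != row["attrs"]:
            # Close the old version by setting its end date...
            out.append({**row, "valid_to": today, "is_current": False})
            # ...and append a new current version with the changed attributes.
            out.append({"id": key, "attrs": updates[key],
                        "valid_from": today, "valid_to": HIGH_DATE,
                        "is_current": True})
        else:
            out.append(row)  # unchanged or historical rows pass through
    return out
```

For example, updating a customer's city produces two rows for that id: the closed-out historical version and a new current one.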
Employment Type
Benefits
- Competitive salary; health insurance covered for the employee and dependents.
- Working on international projects in a professional and dynamic environment.
- Gaining valuable experience across a variety of projects, new technologies, and hundreds of talented colleagues.
- Training opportunities, including technical seminars and soft-skill courses.
- Good opportunities for promotion through a regular performance review system.
- Hybrid work