Job Description
Job Purpose
- Responsible for designing and developing programs, algorithms, and automated processes to cleanse, integrate, and evaluate large datasets from disparate sources, and for implementing complex business logic as needed with the available data processing tools.
- Responsible for integrating new data sources to increase the throughput of existing systems, managing data pipelines that facilitate robust analysis, and sourcing and preparing data to ensure data completeness on metadata platforms.
Responsibilities
Data Architecture
- Deliver the functionality required by business and data analysts, data scientists, and other business roles to advance the bank's overall analytic performance and strategy.
- Build best practices and strategies for data infrastructure to fulfill the business's data analytics and utilization needs, leveraging emerging technologies and capabilities.
- Proactively identify opportunities to manage data and provide solutions for complex data feeds within the bank.
- Evaluate various data architectures in the bank and utilize them to develop data solutions to meet business requirements.
- Drive the delivery of data products and services into systems and business processes in compliance with internal regulatory requirements.
- Oversee the review of internal and external business and product requirements for data operations and activity, and suggest changes and upgrades to systems and storage to accommodate ongoing needs.
Data Integration
- Strategically obtain and integrate data and information from various sources into the firm’s platforms, solutions and statistical models.
- Lead discussions with data scientists to understand their data requirements and create reusable data assets that enable them to build and deploy machine learning models faster.
- Design, build, and maintain optimized data pipelines and ETL solutions that support analysis and a real-time analytics platform for critical decision making.
- Ensure data assets are organized and stored efficiently so that information is high quality, reliable, and flexible.
Project Management
- Manage project conflicts, challenges and dynamic business requirements to keep operations running at high performance.
- Work with team leads to resolve people problems and project roadblocks; conduct post-mortem and root cause analyses to help squads continuously improve their practices and ensure maximum productivity.
Talent Development
- Mentor and coach junior team members into fully competent Data Engineers.
- Identify and encourage areas for growth and improvement within the team.
Job Requirements
Qualifications
- Bachelor's or Master’s degree in Statistics, Mathematics, Quantitative Analysis, Computer Science, Software Engineering or Information Technology
- 5+ years of relevant experience developing, debugging, scripting, and working with big data technologies (e.g. Hadoop, Spark, Flink, Kafka, Arrow, Tableau), database technologies (e.g. SQL, NoSQL, graph databases), and programming languages (e.g. Python, R, Scala, Java, Rust, Kotlin), with a preference for functional/trait-oriented languages
- English proficiency requirements are pursuant to Techcombank's policy
- Deep experience designing and building dimensional data models, ETL processes, and optimized data pipelines, applying data warehouse concepts and methodologies; has worn the architect hat in the past or worked extensively with one
- Deep experience monitoring complex systems and solving data and systems issues, with a consistent and algorithmic approach to resolving them
- Deep understanding of Information Security principles to ensure compliant handling and management of all data
- Experience working in Agile teams to lead successful digital transformation projects, having mastered Agile principles, practices and Scrum methodologies
- Has the know-how and the scripting and coding skills to set up, configure, and maintain a machine learning model development environment
- Experience architecting, coding, and delivering high-performance microservices and/or recommenders serving (tens of) millions of users