Our client is a global business information company. They offer a full range of services, including Business Credit Report, Employment Screening Report and Individual Search, for different business sectors and industries.
• Collaborate with technology teams in defining the data technology strategy, solutions, and architecture.
• Establish and maintain core data models, information standards, and data management lifecycle controls.
• Work closely with business sponsors to identify opportunities, analyze, and interpret trends or patterns.
• Take ownership of analytical projects end to end, from extracting and exploring data and generating hypotheses to building structured analyses and evaluating results.
• Apply industry best practices to create a scalable data platform that supports big data development, deployment, operation, and management.
• Establish and execute the mechanism of data governance: develop data standards and specifications, and monitor their implementation across relevant projects.
• Develop data quality standards and control plans; conduct quality management, tracking, and monitoring; and establish data quality evaluation systems.
• Lead the architectural design of enterprise-scale, end-to-end solutions for major projects; guide projects in their technology and vendor selections.
• Ensure business alignment by translating business strategy and direction into technical vision, strategy and roadmap.
• Lead the resolution of technically complex, cross-system, or long-term issues that impact business operations.
• Lead the development of the IT architecture and facilitate its adoption; review technology selection and system designs for compliance with established architecture.
• A university degree in Engineering, Computer Science, or related disciplines
• Have 10+ years of experience in technology, with at least 5 years of architectural responsibilities (application or technical architecture)
• Have strong, current development skills, enabling architecture definitions to be carried through to actual implementation
• Have excellent knowledge of, and experience with, Scrum
• Have good stakeholder management, influencing, and communication skills at all levels in the organization
• The candidate is expected to have experience architecting large-scale data lake systems. Ideally, the candidate will have experience with the following platforms, tools, and processes:
- Data Science Languages / Tools: R, Python, TensorFlow, Keras
- Visual Analytics Tools: SAS, Tableau, Power BI, Metabase
- ETL Tools: IBM DataStage, Informatica, Talend
- Data Warehouse Technology: Oracle Exadata, SQL Server, Teradata, Redshift, IBM Data Warehouse
- Database Technology: MySQL, PostgreSQL, MongoDB, Oracle DB, IBM DB2, Microsoft SQL Server
- New Data Lake Technology: Hadoop, Spark, Kafka, Hive, MapReduce, Data Lake Analytics, Athena
- Messaging Technology: IBM MQ, TIBCO, RabbitMQ
- Integration Platforms: MuleSoft, SnapLogic
- Cloud Technology: Aliyun, AWS
- Container Technology: Kubernetes, Docker, Swarm
- Serverless Technology: Alibaba Function Compute, AWS Lambda, Kubeless
• The custom software is built using iterative development processes, leveraging DevOps principles at scale to achieve Continuous Integration (CI) and Continuous Delivery (CD). The DevOps tools used for deployment include Bitbucket, Jenkins, UrbanCode, Jira, and associated add-ons and tools.
• A proactive and can-do attitude. Excellent communication skills in English, Cantonese, and Mandarin.