Our client is a fintech company focused on payments and customer loyalty programs. To strengthen their technical capacity, they are currently looking for a Data Developer specializing in big data and data warehousing.
The role's responsibilities include, but are not limited to:
- Design and deploy big data applications within a streaming architecture to ingest, transform, and store data for machine learning models
- Participate in prototype development in a professional and timely manner
- Develop ETL applications and scripts across a range of data sources and storage systems
- Support data processing and reporting
- Design and build the data warehouse (using Ralph Kimball's dimensional modelling technique)
- Manage multiple petabyte-scale clusters and develop easy-to-use systems for security, disaster recovery, and replication
Our ideal candidate meets the following requirements:
- Have at least 2 years of relevant experience.
- Have a Bachelor's Degree in Computer Science, Software Engineering or related fields.
- Experience integrating data from multiple data sources.
- Experience with an object-oriented language (Python, Java).
- Experience with SQL and NoSQL databases (e.g., Oracle, MongoDB).
- Experience building and optimizing big data pipelines, architectures, and data sets.
- Experience with big data tools (Hadoop, Spark, Kafka, etc.) is a plus.
- Experience with cloud services (AWS, Google Cloud, etc.) is a plus.
An attractive career development plan and compensation package, as well as a professional working environment, are reserved for the right candidate.
Please send us a detailed resume in English, together with daytime contact numbers, as soon as possible. Applications will be treated in strict confidence.