Big Data Engineer, Group Operations & Technology
The candidate will be responsible for working with a portfolio of internal users to gather requirements, design and propose enhancement solutions, and deliver the approved solutions in conformance with the bank's technology architecture guidelines, technology information security standards and IT management processes.
- Work closely with key stakeholders to gather requirements, design and propose solutions with follow-up to deliver application enhancements.
- Perform system impact assessments for given enhancements and ensure compliance with the organization's technology architecture standards, security policies and standard management processes.
- Conduct application testing, investigate and resolve technical issues reported during testing, and work with the Technology Information Security Officer on application vulnerability testing and closure.
• Expert skills in SQL, Scala and Python.
• Good experience in Data Management and ETL.
• Experience with the Cloudera Distribution of Hadoop (CDH).
• Proficient understanding of distributed computing principles.
• Management of a Hadoop cluster with all of its included services.
• Experience with Oracle Big Data Appliance and ODI is an added advantage.
• Proficiency with Hadoop v2, MapReduce and HDFS.
• Good knowledge of Big Data querying tools, such as Pig, Hive and Impala.
• Experience with integration of data from multiple data sources.
• Experience with NoSQL databases, such as HBase, Cassandra and MongoDB.
• Knowledge of various ETL techniques and frameworks, such as Flume.
• Experience with building stream-processing systems using solutions such as Spark Streaming, Kafka, Storm or Kinesis.
- Knowledge of the application development life cycle.
- Understanding of Agile methodologies.
- Familiarity with DevOps operating practices.
- Knowledge of MAS technology risk management guidelines.