11644en
- Contract
- English
- Mid-Senior level
- Digital & Technology
- Zurich
Skills
Big Data, Kafka, Hadoop, Cloudera, data modelling, SQL
This vacancy has now expired
Our client is looking for an experienced Big Data Engineer to join their team on a six-month contract with a high chance of extension. If you are currently looking for a new opportunity, please review the required skills and responsibilities below and click apply if this is the right role for you.
Responsibilities:
• Develop software components that ingest end-of-day and intraday data and implement data flows involving joins between these data sets
• Actively participate in the design and technology review of the software components developed in the team
• Evolve the overall architecture of the solution using the latest technologies available in the bank
• Work to streamline the development process and improve software performance
• Contribute to automated and manual integration testing efforts as required
• Collaborate with platform management and other team members on requirements, preparing releases and delivering applications to production
• Assist in resolving production incidents (third-level support)
Skills required:
• Experience building data warehousing and analytics solutions using one of the major Hadoop distributions and various ecosystem components (e.g. HDFS, Impala, Spark, Flink, Flume, Kafka)
• 4+ years of experience with Python and/or Scala
• Experience with data modelling and SQL
• Experience building production data pipelines using Spark, Spark Streaming and Flink
• Experience with security in Hadoop environments
• Bash scripting experience
Desired skills:
• Practical experience (minimum one year) with one of the following Big Data platforms: Cloudera or Hortonworks
• Experience with the Elastic stack (Elasticsearch / Logstash / Kibana)
• Experience working with agile methodology, plus basic project management skills
Due to the high number of applicants, we will only contact those who are most suitable for the position.