This is a rolling 12-month contract starting in September, with a strong chance of extension (up to two years) or even internalization.
You would join an international team of 50+ engineers on a program tasked with developing a Big Data Analytics Platform. The project aims to build a scalable platform for processing data, enriching it per business requirements, and serving it to business users and other IT applications.
The challenges are mostly of a technical and implementation nature. These include providing robust solutions for the volumes of data the software is expected to process, contributing to the re-architecture of the division's software landscape, and developing appropriate software solutions despite challenging timeframes.
Working with multiple teams, you will be responsible for the following:
• Develop software components responsible for sourcing intraday and end-of-day data and extracting analytical insights from this data
• Actively participate in the design and technology review of the software components developed in the team
• Evolve the overall architecture of the solution using the latest technologies available in the bank
• Work to streamline the development process and improve software performance
• Contribute to integration testing (automated and manual) efforts as required
• Collaborate with platform management and other team members on requirements, release preparation, and delivery of applications to production
• Assist in resolving incidents involving the production system (3rd-level support)
As the ideal candidate, you are a senior software engineer capable of picking up tasks across the entire product development lifecycle, and you possess the following skills:
• Expert-level skills in software development using Scala and Java languages (including experience working as architect or technical lead)
• Experience in building data-intensive IT solutions using NoSQL and Big Data databases as well as relational databases
• Experience in building distributed data processing pipelines using Apache Flink and/or Apache Spark (ideally both), including work with data streaming technologies such as Apache Kafka or Kinesis
• Experience with major Hadoop distributions (Cloudera, Hortonworks) and Cloud-based technologies
• Ability to shape the software architecture and experience in successfully integrating emerging open-source technologies
• Familiarity with Test Driven Development (TDD)
• Excellent English communication skills
The following skills would be nice to have:
• Expertise in Python programming
• Experience with Apache Druid
• Experience with the Elastic stack (Elasticsearch / Logstash / Kibana)
• Experience working with agile methodology and basic project management skills
Please note that due to COVID, interviews will be conducted remotely; however, the role is based 100% in Switzerland.
Are you interested in working in an international environment at one of the leading and most reputable financial institutions worldwide? We look forward to receiving your application.