Data Engineer

Job reference: 13454en
• Contract
• English
• Associate
• Digital & Technology
• Zurich
• Financial Services

Skills

Data Engineer, Python, Banking, Big Data

This vacancy has now expired
As the premier supplier to our client, one of the most established financial institutions worldwide, Swisslinx are looking for an experienced Data Engineer with strong knowledge of Python and libraries such as PySpark and Pandas to support a global compliance project.

This is a rolling 6-month contract based in Zurich, starting ASAP, with the intention to extend (up to two years) or convert to a permanent position.

In this role you will be part of a global team responsible for the maintenance and enhancement of a data lake platform, which organizes data collected from various front-office businesses and prepares it for processing by Compliance and Risk projects.

The role sits within the Data Tooling team of five engineers, which is responsible for developing and maintaining an in-house data integration and management application.

Responsibilities will include:

• Working with a global team to build our data platform and expand the number of systems we have data connectivity to, collaborating across multiple IT and Business teams.
• Owning and driving the delivery of solutions for processing, integrating and managing data on the data lake.
• Capturing and implementing requirements from various execution and governance teams.

As the ideal candidate for this position, you will possess the following skills:

• 5+ years of experience with Python (PySpark, pandas)
• 5+ years (ideally 5-7 years) of experience with Big Data technologies such as Spark, Hive and Impala
• Solid understanding of relational database concepts, especially composite primary keys and surrogate keys, in Oracle, MySQL or any relational database (see the sketch after this list)
• Organized and comfortable switching between contexts and tasks
• Fluency in English, both written and verbal
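
For illustration only, here is a minimal PySpark sketch of deriving a single surrogate key from a composite primary key, the concept referred to in the list above. The table and column names (trade_ref, book_id, trade_date) are hypothetical placeholders, not taken from the client's environment.

# Illustrative sketch only: surrogate key derived from a composite primary key.
# All names below are hypothetical and not part of the actual role.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("surrogate-key-sketch").getOrCreate()

trades = spark.createDataFrame(
    [("T-001", "BK1", "2024-01-02", 100.0),
     ("T-002", "BK1", "2024-01-02", 250.0)],
    ["trade_ref", "book_id", "trade_date", "notional"],
)

# The composite primary key is (trade_ref, book_id, trade_date);
# hash it into one deterministic surrogate key column.
trades_with_key = trades.withColumn(
    "trade_sk",
    F.sha2(F.concat_ws("||", "trade_ref", "book_id", "trade_date"), 256),
)

trades_with_key.show(truncate=False)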

The following skills are nice to have:

• Knowledge of Java/Scala or GraphQL is a strong plus
• Experience with ETL and building complex pipelines (see the pipeline sketch after this list)
• 5-10 years of experience with Big Data technologies and data lakes, and an understanding of how these differ from traditional database systems
• Understanding of Data Management, Data-as-a-Service, Data Lake/Mesh and Kappa/Lambda architectures is a strong plus
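
For illustration only, a minimal sketch of the kind of ETL pipeline the role involves, assuming PySpark over a file-based data lake; the paths and column names are hypothetical placeholders, not the client's actual setup.

# Illustrative ETL sketch only: extract, transform and load with PySpark.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw front-office extracts from a landing zone.
raw = spark.read.option("header", True).csv("/landing/front_office/trades/*.csv")

# Transform: basic deduplication and typing before exposing the data
# to downstream compliance and risk consumers.
clean = (
    raw.dropDuplicates(["trade_ref"])
       .withColumn("notional", F.col("notional").cast("double"))
       .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
)

# Load: write a partitioned Parquet table into the curated layer of the lake.
clean.write.mode("overwrite").partitionBy("trade_date").parquet("/curated/trades")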

Please note that due to COVID, interviews will be conducted remotely; however, the role is based 100% in Switzerland. The team is currently split between home office and the office. Post-COVID, home office will be permitted 2-3 days per week.

Are you interested in taking the next step in your career as a data professional, as part of a business-critical team with fantastic long-term career prospects? If so, please send your application ASAP!
