15029en
- Permanent
- English
- Mid-Senior level
- Digital & Technology
- Bern, St. Gallen
- Healthcare & Life Sciences
Skills
SQL, Python, Data modeling, Azure, Azure DevOps, Data Lake
This vacancy has now expired
On behalf of one of our clients, a global company specializing in leading-edge tools, technologies, software and services in the pharma/health industry, Swisslinx is looking for a talented Data Engineer.
This is a permanent position with our client in the pharma/healthcare space. In this role you will collaborate on designing and developing an Azure Cloud data platform: creating data models, structuring data, and building integration workflows. In this culturally diverse company you will have the opportunity to build relationships with people from different backgrounds and disciplines, and to work on complex, challenging projects that call for creative problem-solving and innovative thinking.
Tasks and Responsibilities:
• Collaborate on the design and development of a new data platform in the Azure Cloud.
• Create data models based on requirements and use cases for data analytics.
• Organize and structure data sets using various modern cloud and big data technologies.
• Connect data sources and develop transformation and data integration workflows.
• Work closely in an agile environment, taking responsibility for specific areas such as EDW, Data Lake, or data architecture.
Qualifications:
• Bachelor's or Master's degree in Computer Science or a related field.
• Extensive experience with cloud technologies, preferably in the Azure Cloud (including Data Lake Storage, Data Factory, Event Hub, and Azure DevOps).
• Strong knowledge of data modelling, storage, and access techniques with experience in big data architectures and integration models.
• Proficient in programming languages such as Python, Java, or Scala, and SQL.
• Solid working experience with distributed systems and familiarity with tools like Apache Spark, Apache Flink, or Apache Beam.
• Proficiency in English is required; German or French is a strong asset.
Are you looking for a new challenge where you have the opportunity to pursue your interests across functions and geographies, and where a job title is not a final destination but a starting point? Then apply now! We look forward to receiving your application!