Job reference: 16112en
- Contract
- English
- Mid-Senior level
- Digital & Technology
- Basel
- Healthcare & Life Sciences
Skills
Data engineering, data integration, warehouses, data lakes, database modelling, Python, SQL, automated extraction
This vacancy has now expired
On behalf of our client Roche (Pharmaceutical Research and Early Development organization), we are looking for a highly motivated Data Engineer to focus on developing and maintaining data workflows.
Profile:
The ideal candidate has four years of experience in data and workflow engineering. Additionally, they are experienced in data integration using warehouses, data lakes and database modelling. Experience with Python and SQL is required. Lastly, the candidate communicates fluently in English.
• You love working in a high-urgency environment and enjoy writing code
• You enjoy the operational side of things, such as managing cloud-based infrastructure, workflow orchestration (e.g., GitLab CI/CD) and containerization (e.g., Docker)
• You are interested in understanding biomedical or preclinical research, or have worked in pharma
• You have excellent communication skills in English
Main responsibilities:
• Developing a robust, lightweight framework for creating data products, based on an existing prototype that extracts data from multiple sources and produces data visualizations as output
• Creating automated extraction and loading processes from key databases
• Creating easy-to-maintain transformations using dbt
• Providing a data storage solution and enabling users to consume the data via an API or GUI
• Developing, maintaining and documenting a solid codebase that we can turn into a self-service framework
Must-Have:
• 3-6 years of experience in data and workflow engineering, including ETL/ELT principles
• Experience with data integration using warehouses, data lakes and database modelling (Snowflake, Postgres, etc.)
• Experience with Python for data extraction scripts
• Experience with SQL for data extraction from internal databases
• Experience with PostgreSQL for persistence of final product data, AWS S3 as a staging layer, and GitLab for CI, task and code tracking (a minimal sketch of this extract/stage/load flow follows this list)
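For illustration only, here is a minimal Python sketch of the extract/stage/load pattern this stack implies: SQL extraction from an internal database, AWS S3 as the staging layer, and PostgreSQL for persistence of the final data. Every connection string, bucket name and table name below is a hypothetical placeholder, not a detail of the actual role.

```python
import csv
import io

import boto3      # AWS SDK for Python; used here for the S3 staging layer
import psycopg2   # PostgreSQL driver; used for both extraction and persistence

S3_BUCKET = "example-staging-bucket"   # hypothetical bucket name
S3_KEY = "staging/products.csv"        # hypothetical object key


def extract_to_s3(source_dsn: str) -> None:
    """Extract rows from an internal database with SQL and stage them in S3 as CSV."""
    with psycopg2.connect(source_dsn) as conn, conn.cursor() as cur:
        cur.execute("SELECT id, name, updated_at FROM products")  # hypothetical table
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow([desc[0] for desc in cur.description])    # header row
        writer.writerows(cur.fetchall())
    boto3.client("s3").put_object(
        Bucket=S3_BUCKET, Key=S3_KEY, Body=buf.getvalue().encode("utf-8")
    )


def load_from_s3(target_dsn: str) -> None:
    """Load the staged CSV from S3 into the PostgreSQL persistence layer."""
    body = boto3.client("s3").get_object(Bucket=S3_BUCKET, Key=S3_KEY)["Body"].read()
    with psycopg2.connect(target_dsn) as conn, conn.cursor() as cur:
        cur.copy_expert(
            "COPY products_staging FROM STDIN WITH (FORMAT csv, HEADER true)",
            io.StringIO(body.decode("utf-8")),
        )


if __name__ == "__main__":
    extract_to_s3("postgresql://user:pass@source-host/source_db")    # hypothetical DSN
    load_from_s3("postgresql://user:pass@warehouse-host/warehouse")  # hypothetical DSN
```

In practice, the GitLab CI/CD pipelines named above would schedule steps like these, and the downstream transformations would live in dbt models; both are outside the scope of this sketch.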
Key elements:
Ideal start date: 15.08.2024 – 6-month contract (extension possible) – Basel