Location: Zurich, Hybrid
Duration: 12 months
Workload: 41.25 hours per week
Reference: SL-001038
About the client
Top 3 must-have skills
- PySpark (Python alone is not enough — Spark is essential)
- Databricks & Delta Lake
- Azure fundamentals
Job description
- Design and implement production-ready data pipelines using PySpark on Databricks (an illustrative sketch follows this list)
- Work from specifications provided by Business Analysts
- Collaborate closely with Data Engineers and Solution Architects
- Ensure solutions meet both functional and non-functional requirements
- Develop applications in an IDE such as VS Code or PyCharm
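For context, the following is a minimal, illustrative sketch of the kind of PySpark pipeline on Databricks this role involves. The input path, table name, and column names are hypothetical placeholders, not part of the client's specification.

```python
# Illustrative sketch only: a small PySpark batch pipeline writing to Delta.
# The input path, table name, and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_pipeline").getOrCreate()

# Read raw landing-zone data (hypothetical JSON drop location).
raw = spark.read.format("json").load("/mnt/landing/orders/")

# Basic cleansing and typing, as would be defined in a BA specification.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Persist as a partitioned Delta table for downstream consumers.
(
    cleaned.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .saveAsTable("analytics.orders_cleaned")
)
```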
Requirements
- Bachelor’s or Master’s degree in Computer Science (or related field)
- 5+ years of experience developing complex software systems
- Strong software engineering mindset (design patterns, best practices, clean code)
- Solid background in data engineering and analytical frameworks
- Strong core Python software engineering experience
- 3+ years of hands-on experience with modern data platforms
- Experience working with Delta Lake and optimizing Spark workloads that run on it (see the maintenance sketch after this list)
- Good understanding of, and significant project experience with, relational data models and SQL
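As an illustration of the Delta Lake optimization work mentioned above, the sketch below shows common table-maintenance steps (file compaction with Z-ordering and vacuuming). The table and column names are hypothetical, and the delta Python package is assumed to be available, as it is on Databricks runtimes.

```python
# Illustrative sketch only: routine Delta Lake maintenance to keep Spark
# workloads fast. Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable  # bundled with Databricks runtimes

spark = SparkSession.builder.appName("delta_maintenance").getOrCreate()

table = DeltaTable.forName(spark, "analytics.orders_cleaned")

# Compact small files and co-locate rows on a common filter column so
# queries can skip unneeded data files.
table.optimize().executeZOrderBy("customer_id")

# Drop data files no longer referenced by the table (default retention).
table.vacuum()
```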

