Software Scientist (Data Analysis and Visualization Services)
Your tasks
You will join the Experiment IT Development and Operations Group in the new AWI department, playing a central role in a collaboration with the Photon Science Division aimed at improving and developing data analysis and visualization solutions. As part of the SLS 2.0 upgrade, exciting efforts are underway at this internationally recognised facility to advance both scientific discovery and data science, computing research and development.
Your main tasks will include:
- Design and develop data analysis and visualization solutions within a heterogeneous system to support efficient operation. These tools will help steer running experiments on a beamline and monitor system status across software, hardware and infrastructure
- Design and develop data analysis and visualization solutions to assist beamline staff with further algorithm development and benchmark testing. These tools will support workflow integration, monitoring and improvement, as well as job submission and tracking on on-premises and cloud-based HPC clusters
- Identify and assess data analysis needs and requirements for SLS 2.0, evaluating current best practices and performance optimisation on existing SLS beamlines and in related scientific domains
- Contribute to ongoing efforts such as the overarching Controls and Science IT services and infrastructure for SLS 2.0
- Collaborate with colleagues with similar functions at the PSI facilities along with leading national and international institutions and consortia
Your profile
- Experience in developing and deploying modern GUI and visualization tools for improved usability and user experience (e.g. using desktop and/or web frontend frameworks)
- Prior experience in the analysis and reduction of large-scale experiment data, and/or image processing
- Experience in managing and organizing parameters and results from diverse data sources
- Experience with job schedulers (e.g. SLURM) and message-queueing middleware for workflows
- Experience in integrating and deploying workflows and software on HPC clusters
- PhD degree (or equivalent practical experience) in computer science, data science or natural science
- You are a good listener and a strong team player with excellent communication skills and a strong sense of responsibility. You are fluent in English (spoken and written); German is an advantage but not a must, provided you are willing to learn it
To apply for this job please visit www.psi.ch.