We are looking for an MLOps Engineer to join the Risk & Compliance Data Science department at Danske Bank.
About the company:
Danske Bank is a Nordic bank with strong local roots and bridges to the rest of the world. For more than 145 years, they have helped people and businesses in the Nordics realise their ambitions. They want to help their customers become financially confident and build their lives and businesses on a solid financial foundation. They aim to create long-term value for all their stakeholders – their customers, shareholders and the societies they are part of – and their vision is to be recognised as the most trusted financial partner.
You will become part of the Data Engineering CoE, working in the Risk & Compliance Advanced Analytics department. The focus will be on building data pipelines for the Risk & Compliance models developed by the Data Scientists. The purpose of the Risk & Compliance Data Science initiative is to use data science to take compliance and risk management to the next level. The project is seeking a contractor to speed up delivery of developed solutions across various business segments.
Contractors will be part of the Risk & Compliance department, working across multiple countries. The main responsibilities are to:
1. Help deploy and operate prioritised models to PROD
2. Educate the CoE in MLOps best practices and help draw up internal guidelines
3. Assist in pipeline development tailored to Risk & Compliance CoE needs, together with the infrastructure department:
   a. CI/CD pipelines
All this to ensure the Risk & Compliance department reaches its goals in 2021. Your main responsibility will be to build the pipelines for the statistical models, utilising the existing tech stack as well as bringing your experience into play to leverage new technologies. All tasks will be solved as part of multifunctional DevOps squads with broad capabilities, delivering end-to-end Data Science solutions. You will therefore work closely with experienced Data Scientists and Machine Learning Operations Engineers, as well as the business units consuming the data science services.

Required experience:
- Version control (Git/Bitbucket)
- CI/CD tools (Azure DevOps, Jenkins, or similar)
- Code analysis and unit testing toolkit (e.g., PyLint & PyTest, SonarQube)
- Model registry (e.g., Artifactory, MLflow)
- Workflow orchestration using Airflow or similar
- Monitoring (e.g., Prometheus, Grafana, AppDynamics)
- Experience with public cloud (Azure, AWS, or GCP)
- Container technologies (Docker/Kubernetes/OpenShift)
- Strong Software Engineering Background
- Linux/Bash
- Strong Python knowledge (incl. PySpark)