DevOps

Details

Main Tasks & Responsibilities:

• Advising data scientists on the development of algorithms and machine learning models for search result personalization;

• Implementation of batch data pipelines to prepare, provision, and version training data for the machine learning models;

• Implementation of real-time data pipelines that provide the data for creating and updating a search index;

• Implementation of microservices providing REST APIs, including tracing and monitoring;

• Deployment of the microservices to the production cloud environment, taking scalability and high-availability requirements into account;

• Implementation of an MLOps process for the continuous improvement of the models used in production.

Main Technical Requirements:

• Minimum of 3 years of relevant professional experience;

• Experience with Google Cloud Platform, Terraform, Kubernetes, Docker, Argo CD/Workflows;

• Experience with Google Dataflow, Apache Beam; 

• Experience with BigQuery, Bigtable;

• Knowledge of Python, Java, REST APIs, GraphQL.

Other Requirements:

• EU citizenship;

• Ability to work well under pressure and within tight timeframes;

• Good work ethic and high levels of motivation;

• Fluent in written and spoken English.
