Data Platform Operations Engineer - Databricks/Airflow
- Location: Netherlands, Western Europe
- Salary: EUR450 - EUR500 per day
- Technology: Data & Analytics
- Type: Contract
- Workplace: Remote
Data Platform Operations Engineer - Databricks/Airflow / Remote EU / 6 months + / Start ASAP
RESPONSIBILITIES / TASKS:
As a Data Engineer in Data Platform, you will support 1,000+ stakeholders, focusing on data transformation and pipeline orchestration technologies. By constantly challenging the status quo, you will enable stakeholder teams to deliver high-performance data-processing solutions that are efficient and reliable at scale.
Where your experience is needed:
* Automation of support activities (e.g. cluster creation, configuration changes)
* Improving documentation to address user inquiries
* Prioritising customer queries and escalating serious technical issues to 2nd-level support
* Operational Excellence
* Stakeholder support through chat and support tickets
* Troubleshooting user issues related to Databricks (Spark) / Airflow and suggesting optimisations for long-running or resource-intensive jobs (an illustrative sketch follows this list)
* Guiding users on best practices of Databricks and Airflow cluster management
* Consulting with and helping our diverse teams to develop, implement and maintain sustainable, high-performance, growth-ready data-processing and data-integration systems
* Maintaining a documentation focus to improve the user journey
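For context, the sketch below shows the kind of Databricks/Airflow orchestration this role supports. It assumes Airflow 2.x with the apache-airflow-providers-databricks package installed; the DAG id, connection id, cluster spec and script path are hypothetical placeholders, not details from the posting.

```python
# Illustrative only: a minimal Airflow DAG that submits a one-off Databricks run.
# Assumes Airflow 2.x with the apache-airflow-providers-databricks package installed.
# The DAG id, connection id, cluster spec and script path are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="example_databricks_transform",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule=None,                          # triggered manually in this sketch
    catchup=False,
) as dag:
    submit_run = DatabricksSubmitRunOperator(
        task_id="submit_transform",
        databricks_conn_id="databricks_default",  # placeholder Airflow connection
        new_cluster={
            "spark_version": "13.3.x-scala2.12",  # placeholder Databricks runtime
            "node_type_id": "i3.xlarge",          # placeholder node type
            "num_workers": 2,
        },
        spark_python_task={"python_file": "dbfs:/scripts/transform.py"},  # placeholder script
    )
```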
SKILL SET - Technical:
Mandatory:
* Basic understanding of distributed data processing frameworks like Apache Spark
* Understanding of relational database management systems
* Hands-on experience with Databricks
* Hands-on experience with Apache Airflow/Amazon Managed Workflows for Apache Airflow
* Hands-on experience with Python combined with SQL knowledge (see the brief example after the skills lists)
* Hands-on experience with cloud technologies (AWS services such as S3, EC2, IAM, CloudFormation)
* Hands-on experience in container/orchestration tools (Kubernetes, Docker, Podman)
* Engineering craftsmanship, with experience in software development processes focusing on testing, continuous integration/continuous delivery (CI/CD), monitoring and writing documentation
Nice to have:
* Knowledge of AWS Step Functions, CloudWatch and Lambda
Must have skills:
* Amazon Web Services
* Apache Airflow
* Databricks
* Python
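For context, a minimal sketch of the Python-plus-SQL Spark work referenced in the mandatory skills, as it might run on a Databricks cluster. The bucket, paths and view name are hypothetical placeholders and do not come from the posting.

```python
# Illustrative only: Python combined with SQL on a Spark/Databricks cluster.
# The bucket, paths and view name are placeholders, not taken from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("example_daily_counts").getOrCreate()

# Read raw events from S3 and expose them to SQL as a temporary view.
events = spark.read.parquet("s3://example-bucket/raw/events/")
events.createOrReplaceTempView("events")

# Aggregate with plain SQL, then write the result back to S3 as Parquet.
daily_counts = spark.sql(
    """
    SELECT event_date, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date
    """
)
daily_counts.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_counts/")
```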

Reference
CR/111482_1699435824
