
Databricks Developer

Hybrid
Full-time
Contractor
up to 7000 CZK/MD
Development

Overview

This project focuses on the design and implementation of an end-to-end ETL process and secure data migration via a proxy server to an SAP environment hosted in Canada. We are looking for an experienced Data Engineer who will play a key role in designing the integration architecture and implementing scalable data pipelines using Snowflake and Databricks.

Mission

  • Design and implementation of end-to-end ETL processes
  • Data migration via secure proxy infrastructure to SAP environment in Canada
  • Development and orchestration of data pipelines in Databricks and Snowflake
  • Programming of transformation logic in Python
  • Implementation of data quality checks, validation and sanity check mechanisms
  • Logging, monitoring and auditability of data transfers
  • Performance optimization
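The data quality, validation and sanity-check work listed above might, in its simplest form, look something like the sketch below. This is an illustrative example only, not part of the project specification; the names (`REQUIRED_FIELDS`, `validate_batch`) are hypothetical.

```python
# Hypothetical sanity-check helper for a batch of rows staged for migration.
# REQUIRED_FIELDS and validate_batch are illustrative names, not from the project.

REQUIRED_FIELDS = ("id", "amount", "currency")

def validate_batch(rows):
    """Return (valid_rows, errors) after basic data-quality checks:
    required fields present and non-empty, and no duplicate ids."""
    valid, errors = [], []
    seen_ids = set()
    for i, row in enumerate(rows):
        missing = [f for f in REQUIRED_FIELDS if row.get(f) in (None, "")]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        if row["id"] in seen_ids:
            errors.append((i, f"duplicate id: {row['id']}"))
            continue
        seen_ids.add(row["id"])
        valid.append(row)
    return valid, errors

batch = [
    {"id": 1, "amount": 10.0, "currency": "CZK"},
    {"id": 1, "amount": 5.0, "currency": "CZK"},   # duplicate id -> rejected
    {"id": 2, "amount": None, "currency": "CZK"},  # missing amount -> rejected
]
valid, errors = validate_batch(batch)
```

In a production pipeline, checks like these would typically run inside Databricks jobs, with the `errors` list fed into the logging and auditability mechanisms the role also covers.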

Skills

  • At least 2 years of experience with Databricks
  • Demonstrable experience with ETL pipeline design and development
  • Strong experience with Databricks and practical knowledge of Snowflake
  • Advanced knowledge of Python and CI/CD
  • Experience with data migration to SAP or integration of enterprise systems
  • Knowledge of security principles in data transfer
  • Orientation to data quality, auditability and stable production operation
  • Experience building complex ETL pipelines (spanning multiple tables with interdependencies) and optimizing queries for both performance and cost
  • Experience with reading Delta Share tables
  • Experience with GitHub Actions (Enterprise) and JFrog is an advantage

Benefits

  • Great colleagues and fully flexible work policy
  • Career coaching and development
  • Flexible working hours
  • Technical training and workshops
  • Technical equipment for work (Mac / Windows)
  • Company parties
  • Company psychologist for mental well-being
  • Multisport card
Apply for this job