We build end-to-end data solutions that connect the worlds of data lakes, lakehouses, DWH and BI/analytics. We are not looking for a pure theorist, but for a leader who will design the architecture, build it, and take it to production.
Mission
End-to-end Delivery: Designing and implementing modern lakehouse + DWH solutions (from ingestion through transformation to the BI layer).
Engineering: Building reliable data pipelines in Python and Spark (batch and streaming) including CDC and orchestration.
Hands-on Tech: Active work with Azure stack – Databricks, Microsoft Fabric, ADLS, Synapse and Power BI.
Standardization: Setting engineering standards for CI/CD of data pipelines, observability, data quality and security (DevSecOps mindset).
Leadership: Leading delivery, mentoring the team, and working directly with stakeholders to turn ideas into reality.
Skills
Strong production experience delivering lakehouse/DWH solutions (not just PoCs).
Excellent knowledge of Python and Spark.
Experience with end-to-end delivery management (scoping, design, rollout).
Experience with Azure data stack and Databricks.
Advantage
Experience with Microsoft Fabric.
Knowledge of Terraform or other IaC tools for data infrastructure.