
Devapo
Data Engineer (Azure, Databricks) – designing and maintaining data pipelines, ETL, collaboration with BI/DS teams, performance monitoring, implementing data security. Required: ≥3 years of experience, Azure, Databricks, Python/Scala/Java, SQL.
We are looking for an experienced Data Engineer responsible for planning, developing, and maintaining cloud environments for our clients.

About Devapo
At Devapo, we focus on continuous self-development and acquiring new knowledge. If you are a fast learner, want to participate in international projects, are a team player, and can work independently, join us! We provide our clients with more than just code: we want to equip them with tools that allow their businesses to flourish. Our clients' success is our success, which is why we ensure that everyone who creates Devapo has a long-term goal in mind.

Key Responsibilities
- Design, implement, and maintain scalable and efficient data pipelines in Azure and Databricks
- Develop and optimize ETL processes using Azure Data Factory and Databricks
- Build Big Data solutions aligned with business requirements
- Collaborate with Data Science, BI, and development teams to ensure high-quality data
- Monitor the performance of data processing systems and implement improvements
- Implement data security standards and practices
- Research and evaluate new technologies in the data engineering space

Requirements
- Minimum 3 years of experience in a Data Engineer role or similar position
- Strong knowledge of Azure data services (Azure Data Factory, Azure Synapse Analytics, Azure Data Lake)
- Extensive experience with Databricks and Apache Spark
- Proficiency in SQL and experience with relational databases
- Experience in creating and optimizing data pipelines (ETL/ELT) in cloud environments
- Strong programming skills in Python, Scala, or Java
- Experience with Delta Lake architecture and optimization techniques
- Knowledge of data modeling and schema design

Nice to have
- Experience with Azure DevOps and CI/CD pipelines for data solutions
- Knowledge of Databricks Delta Live Tables and Databricks workflows
- Experience with real-time data processing using Azure Event Hubs or Kafka
- Familiarity with Azure Purview or other data governance tools
- Databricks or Azure certifications
- Experience with Power BI or other visualization tools
- Knowledge of infrastructure as code (Terraform, ARM templates)

What We Offer
- Salary: 11,000 – 16,000 PLN + VAT (B2B contract)
- Co-financing for training and certifications, as well as guaranteed time for learning during working hours
- Private medical care and a Multisport card
- Language classes (English)
- Team integration meetings and company events
- Employee referral program with a bonus
- An individually tailored career development path
| Published | 26 days ago |
| Expires | in about 2 months |
| Contract type | B2B |
| Source | |