
Data Engineer with Blockchain (Remote)

DCG

Warszawa
150 - 170 PLN
remote
b2b
full-time
PySpark
Blockchain
Azure Databricks
Delta Lake
Apache Spark

Job description

Responsibilities:

  1. Design, implement, and maintain scalable data pipelines using Azure Databricks, Spark, and PySpark
  2. Work with Delta Lake to manage large-scale data storage and optimize performance
  3. Develop robust data integration solutions using Azure Data Factory and Azure Functions
  4. Build and maintain structured and semi-structured data models, leveraging formats such as Parquet, Avro, and JSON
  5. Ensure efficient and secure data processing through proper performance tuning and code optimization
  6. Collaborate with development and analytics teams to support business data needs
  7. Apply version control best practices using Git and follow coding standards in Python and SQL

Requirements:

  1. Strong hands-on experience with Azure Databricks, Spark, and PySpark
  2. Proficiency in building and tuning data pipelines with Delta Lake
  3. Solid understanding of data modeling and performance optimization techniques
  4. Practical experience with Azure Data Factory, Azure Functions, and Git
  5. Competence in working with data formats such as Parquet, Avro, and JSON
  6. Strong programming skills in Python and SQL
  7. Ability to work effectively in a fast-paced, enterprise-level environment
  8. Strong communication skills and fluency in spoken and written English (C1)

Nice to have:

  1. Understanding of blockchain-related concepts and data structures

Offer:

  1. Private medical care
  2. Co-financing for the sports card
  3. Training & learning opportunities
  4. Constant support of a dedicated consultant
  5. Employee referral program
Published: 18 days ago
Expires: in 12 days
Type of contract: b2b
Work mode: remote
