BigData Developer with GCP

Sii Sp. z o.o.

Warszawa, Mokotów
Hybrid, full office

Requirements

Expected technologies

  • Apache Spark
  • Big Data
  • BigQuery
  • ETL
  • Python
  • SQL
  • Terraform
  • Google Cloud Platform

Optional technologies

  • Ansible

Our requirements

  • Minimum 5 years of hands-on experience in Big Data engineering projects
  • Solid experience with Google Cloud Platform (GCP), especially BigQuery, Dataflow, Dataproc, GCS, and Cloud Composer
  • Strong knowledge of ETL/ELT processes and distributed data processing (e.g., Apache Spark)
  • Proficiency in Python and SQL for data transformations and orchestration logic (a minimal example follows this list)
  • Familiarity with Terraform and CI/CD best practices for cloud infrastructure and deployment
  • Hands-on experience with Airflow (DAG design, scheduling, monitoring)
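
In this role, the Python and SQL requirement typically means running transformation jobs directly against BigQuery. Below is a minimal, illustrative sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical placeholders, not part of this project's actual codebase.

```python
# Illustrative sketch only: a Python + SQL aggregation run in BigQuery.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery


def run_daily_aggregation(project_id: str = "my-project") -> None:
    """Aggregate raw events into a daily summary table (hypothetical schema)."""
    client = bigquery.Client(project=project_id)

    sql = """
        SELECT
            DATE(event_ts) AS event_date,
            COUNT(*)       AS event_count
        FROM `my-project.raw_layer.events`
        GROUP BY event_date
    """

    job_config = bigquery.QueryJobConfig(
        destination=f"{project_id}.curated_layer.daily_events",
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    query_job = client.query(sql, job_config=job_config)
    query_job.result()  # block until the query job finishes
```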

Your responsibilities

  • Creating dynamic Airflow DAGs using a custom DAG Factory and deploying them via Cloud Composer (see the sketch after this list)
  • Developing scalable Cloud Dataflow applications to migrate data from on-premises Oracle and HDFS to BigQuery and GCS
  • Setting up Cloud Dataproc applications to orchestrate ingestion pipelines from various sources into BigQuery
  • Supervising the lifecycle and governance of data assets and data products using tools like BigQuery, Dataplex, and BigLake
  • Supporting daily operations: optimization, monitoring, incident response (cloud and on-prem)
  • Teaching data teams how to use and govern data products within GCP efficiently
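
One of the responsibilities above is generating Airflow DAGs through a custom DAG Factory and deploying them via Cloud Composer. The sketch below shows a minimal version of that pattern under stated assumptions: the pipeline config, DAG ids, and the placeholder BashOperator task are hypothetical and stand in for the project's actual factory and operators.

```python
# Illustrative sketch only: a minimal "DAG Factory" generating Airflow DAGs
# from declarative config. Pipeline names, schedules, and the echo command
# are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical pipeline definitions; in practice these could come from YAML/JSON.
PIPELINES = {
    "ingest_orders": {"schedule": "@daily", "source": "orders"},
    "ingest_customers": {"schedule": "@hourly", "source": "customers"},
}


def build_dag(dag_id: str, schedule: str, source: str) -> DAG:
    """Build one ingestion DAG from a single config entry."""
    with DAG(
        dag_id=dag_id,
        schedule_interval=schedule,
        start_date=datetime(2024, 1, 1),
        catchup=False,
        tags=["dag-factory"],
    ) as dag:
        BashOperator(
            task_id="load_to_bigquery",
            bash_command=f"echo 'loading {source} into BigQuery'",
        )
    return dag


# Register each generated DAG in the module namespace so the Airflow
# scheduler discovers it when this file is parsed.
for name, cfg in PIPELINES.items():
    globals()[name] = build_dag(name, cfg["schedule"], cfg["source"])
```

With Cloud Composer, placing a module like this in the environment's DAGs bucket is generally enough for the scheduler to pick up each generated DAG.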
Published: 10 days ago
Expires: in 3 days