Employment type: Full-time
Specialization: Software
Minimum education: Higher education (university degree)
Minimum experience: 3 years
Work visa: Schengen
Our client is looking for a Principal/Senior GCP Data Engineer for an ongoing, 100% remote contract.

Role / Responsibilities
- Design, develop, and implement end-to-end data solutions (storage, integration, processing, access) in Google Cloud Platform (GCP).
- Create ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real time.
- Design data models for optimal storage and retrieval to support sub-second latency.
- Deploy and monitor large database clusters that are performant and highly available.
- Apply a solid understanding of cloud-based offerings in the space.
- Collaborate with product/application architects to develop holistic solutions.
- Implement operational procedures (logging, monitoring, alerting, etc.) for dependable running of pipelines/jobs.

Skills / Requirements
- Strong programming skills in Python and SQL
- Expertise in cloud data stores (e.g., GCP BigQuery, GCP BigTable, GCP FireStore, GCP CloudSQL, ELK)
- Data pipeline and workflow orchestration tools (e.g., Dataflow, Pentaho, Airflow, Azkaban)
- Data processing technologies (e.g., GCP BigQuery, Spark)
- Data messaging technologies (e.g., GCP PubSub, Kafka)
- Deployment and monitoring of large database clusters on public cloud platforms (e.g., Docker, Terraform)
- Unix/Linux shell scripting