GCP Data Engineer | Dataflow & Python
UST Global Inc
We are seeking a highly skilled and experienced GCP Data Engineer to join our team. The ideal candidate will have strong expertise in implementing production-level solutions using Google Cloud Platform (GCP), particularly in BigQuery, Dataflow, and orchestration tools like Cloud Composer. This role requires a strong background in Python, SQL, and cloud-native data engineering practices.
Key Responsibilities:
- Design, implement, and optimize data pipelines using Google Cloud Platform (GCP) services.
- Develop and manage BigQuery solutions, including partitioning, clustering, and managing slot allocations for optimal performance.
- Build scalable and efficient data pipelines using Dataflow and Dataform (a minimal Dataflow sketch follows this list).
- Leverage Cloud Composer for workflow orchestration and ensure seamless integration with GCP services.
- Collaborate with cross-functional teams to define and implement data solutions that align with business objectives.
- Continuously monitor, optimize, and troubleshoot data pipelines for performance and reliability.
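For illustration only, below is a minimal Apache Beam (Dataflow) pipeline sketch of the kind this role would build: it reads newline-delimited JSON from Cloud Storage and appends rows to a BigQuery table. The project, bucket, table, and column names are placeholders, not part of this posting.

    # Sketch: Dataflow pipeline reading JSON lines from GCS and appending to BigQuery.
    # All project, bucket, and table names are hypothetical placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",           # use "DirectRunner" for local testing
        project="example-project",         # placeholder GCP project id
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadJsonLines" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,customer_id:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )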
Required Skills and Experience:
- Strong experience in GCP Data Engineering, including hands-on experience with BigQuery, Dataflow, and Cloud Composer.
- Expertise in Python for developing and automating data workflows.
- Proficient in SQL for querying and manipulating large datasets.
- Solid understanding of BigQuery features such as partitioning, clustering, and slot management (see the table-creation sketch after this list).
- Experience in implementing production-level data solutions, ensuring scalability and performance.
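As a hedged illustration of the partitioning and clustering features mentioned above, this sketch creates a day-partitioned, clustered table with the google-cloud-bigquery Python client; the project, dataset, and column names are invented for the example.

    # Sketch: create a day-partitioned, clustered BigQuery table.
    # Project, dataset, table, and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    schema = [
        bigquery.SchemaField("event_id", "STRING"),
        bigquery.SchemaField("event_ts", "TIMESTAMP"),
        bigquery.SchemaField("customer_id", "STRING"),
    ]

    table = bigquery.Table("example-project.analytics.events", schema=schema)
    # Partition by the event timestamp (one partition per day) ...
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_ts",
    )
    # ... and cluster within each partition by customer_id to prune scans.
    table.clustering_fields = ["customer_id"]

    client.create_table(table, exists_ok=True)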
Good to Have:
- Experience with Data Modeling techniques.
- Ability to scale and optimize data pipelines for large volumes of data.
- Knowledge of Monitoring and Observability tools and best practices for cloud data engineering (see the job-statistics sketch after this list).
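On the observability point, one lightweight approach (again a sketch with placeholder names, not a prescribed tool) is to log BigQuery job statistics such as bytes processed and slot-milliseconds after each query, which also informs slot-management decisions.

    # Sketch: surface basic BigQuery job statistics for monitoring purposes.
    # The project id and query are placeholders.
    import logging

    from google.cloud import bigquery

    logging.basicConfig(level=logging.INFO)
    client = bigquery.Client(project="example-project")

    job = client.query(
        "SELECT customer_id, COUNT(*) AS events "
        "FROM `example-project.analytics.events` "
        "GROUP BY customer_id"
    )
    job.result()  # wait for the query to finish

    logging.info(
        "job=%s bytes_processed=%s slot_millis=%s",
        job.job_id, job.total_bytes_processed, job.slot_millis,
    )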