Job Summary
PBT Group is looking for a GCP Data Engineer to join a high-performing data engineering capability focused on delivering scalable, modern cloud-based data solutions.
This opportunity is ideal for a hands-on Data Engineer with strong Google Cloud Platform (GCP) experience who enjoys building and optimising data pipelines, integrating complex data sources, and enabling data-driven decision-making through robust engineering practices.
The successful candidate will work alongside Data Engineers, Architects, Analysts, and business stakeholders within Agile delivery environments to deliver high-quality enterprise data solutions.
Key Responsibilities
Data Engineering & Pipeline Development
- Design, develop, and maintain scalable ETL/ELT data pipelines on Google Cloud Platform.
- Build and optimise data ingestion, transformation, and orchestration processes.
- Develop solutions using GCP technologies such as:
  - BigQuery
  - Dataflow
  - Pub/Sub
  - Cloud Storage
  - Cloud Composer
- Integrate structured and unstructured data from multiple source systems.
Data Modelling & Transformation
- Support the development of modern cloud data platforms and data warehouses.
- Implement data transformation logic and business rules.
- Ensure data quality, consistency, reliability, and governance standards are maintained.
- Optimise query performance and large-scale data processing workloads.
Collaboration & Delivery
- Work closely with Data Analysts, BI teams, and business stakeholders to understand requirements and deliver fit-for-purpose solutions.
- Participate in Agile ceremonies and contribute toward sprint delivery commitments.
- Assist with troubleshooting, debugging, and resolving data-related issues.
- Contribute to continuous improvement initiatives and engineering best practices.
Automation & DevOps
- Support CI/CD practices and deployment automation.
- Assist with monitoring, logging, and performance optimisation of data pipelines.
- Contribute toward reusable frameworks, templates, and engineering standards.
Required Skills & Experience
Essential
- 3–5 years’ experience in Data Engineering or related roles.
- Hands-on experience with Google Cloud Platform (GCP).
- Strong experience building ETL/ELT pipelines.
- Understanding of modern data warehousing concepts and architectures.
- Experience working with large datasets and data transformation processes.
- Strong analytical and problem-solving capability.
- Experience working in Agile delivery environments.
Advantageous
- Exposure to Dataflow, Pub/Sub, Cloud Composer, or Dataproc.
- Experience with dbt or similar transformation frameworks.
- CI/CD and DevOps exposure.
- Financial services or enterprise consulting experience.
- Exposure to data governance and data quality practices.
Ideal Candidate Profile
We are looking for someone who:
- Is technically strong and delivery-focused.
- Enjoys solving complex data problems.
- Can work independently while collaborating effectively within teams.
- Has a passion for cloud technologies and modern data engineering practices.
- Is eager to continue growing within enterprise-scale cloud data environments.
Key Competencies
- Strong communication and stakeholder engagement skills.
- Attention to detail and quality-focused mindset.
- Adaptability and willingness to learn.
- Strong troubleshooting and analytical ability.
- Team collaboration and delivery ownership.
* In order to comply with the POPI Act, we require your permission to retain your personal details on our database for future career opportunities. By completing and returning this form, you give PBT Group your consent to do so.
* If you have not received any feedback within 2 weeks, please consider your application unsuccessful.