Ref: JSFRG Data Engineer

Cloud Data Engineer - GCP House - Fully Remote Working

London, England

  • 70000 to 90000 GBP
  • Engineer Role
  • Skills: GCP, Google Cloud, AWS, Azure, Data Engineering, ETL/ELT Pipelines, Data Lakes, DWH
  • Level: Mid-level

Job description

A fantastic opportunity for an experienced Cloud Data Engineer to join one of the UK's largest Google Cloud Partners: work on the best and most challenging GCP projects, be rewarded with Google Cloud accreditations, and work mainly remotely.



Role Overview:

Are you passionate about data? Would you be excited about building cloud first data platforms, potentially processing and storing terabytes of data, enabling businesses to make better decisions? Are you a quick learner and into working with cutting-edge technologies? If so, let's talk!

We are looking for an experienced Data Engineer to design, implement and maintain modern cloud data platforms at scale utilising both Google Cloud Platform and open source services. This is an opportunity to work with a world leader in data and machine learning technologies.

You'll work alongside a great GCP Engineering team (Data Engineers, Platform Engineers, Cloud Architects, Software Engineers), taking full ownership of your tasks and responsibilities, with opportunities to do back-end, front-end and infrastructure development.

If you have an interest in working on innovative projects with cutting edge technology using Google Cloud Platform, this job could be a great opportunity for you.

What you'll do:
  • Work with customers to translate requirements into technical designs and follow this through to delivery of the solution. You will be able to design highly available, complex data platforms, and mentor others to be able to do so
  • Advise customers on modern approaches to data engineering and the best approach to solve their unique problems
  • Establish yourself as a trusted advisor and reliable point of contact for customers
  • Understand customer data structures and transformation requirements to build suitable solutions on Google Cloud Platform
  • Design, plan and build highly available, global and cloud native data platforms
  • Assist customers in understanding how cloud technology can help them grow and the benefits it offers
  • Take a lead on technical projects, i.e. be the technical point of escalation for more junior Engineers and engage regularly with the customer during the delivery of the solution.
  • Develop internal tooling and improve processes for the wider team and business.
  • Create engaging technical content to promote CTS and Google Cloud products and services (e.g. case studies and thought pieces)
  • You'll get involved in our drive towards social progress where possible. Whether that's getting involved in one of our community projects or simply buying local when travelling with the company.


Key Skills:
  • A minimum of five years' commercial experience in the area of data engineering or machine learning
  • Confident communicator, both in writing and verbally
  • Extensive hands-on experience with at least one major cloud provider (GCP, AWS, Azure), including the use of their data related components to build solutions.
  • Experience designing and building large data architectures
  • Strong experience building ETL / ELT pipelines, Data Lakes and Data Warehouses
  • A good understanding of some of the following:
      ○ Hadoop / Spark
      ○ Apache Airflow
      ○ Apache Beam
      ○ Relational database technologies (Microsoft SQL, Oracle or PostgreSQL)
      ○ Teradata
      ○ TensorFlow
      ○ Analytical tools (Looker, Tableau, Qlik, Data Studio)
  • Significant experience using languages like Python, R and SQL, as well as Google database services like Firestore, Cloud SQL, BigQuery or equivalents from other cloud providers
  • Experienced in using unit testing and CI / CD
  • Ability to think in abstract terms about application structures and strong understanding of security integration implications on this structure
  • A solid grounding working with common data modelling techniques and SQL (e.g. MS SQL, Oracle, PostgreSQL, MySQL)
  • Working knowledge of data governance principles (e.g. data profiling, data cataloguing, managing access controls, etc.)
  • A passion for all things programming, databases, data, big and small
  • Experience using Git and related tooling


Desirable Skills:
  • Experience with or knowledge of Google Cloud Platform (e.g. Pub/Sub, Dataflow, BigQuery, Firestore, etc.)
  • Experience with or knowledge of AWS (e.g. Athena, EMR, Glue, Redshift, Kinesis, etc) or Azure (e.g. Synapse, Databricks, Data Factory, Event Hub, etc.)
  • Exposure to real time data ingestion / streaming data sources
  • Experience in or knowledge of working in an agile project based environment
  • Bachelor's or Master's degree in Computer Science, Data Science or equivalent education


Location:

We have offices in Central Manchester and Utrecht, with options for regular remote working. This role could also be fully remote, based in the UK or the Netherlands, with occasional travel.



For more details on this superb role with one of the strongest GCP names -

Contact - James Simmons at FRG Tech Consulting

020 31 48 4662

j.simmons@frgconsulting.com