Ref: a0M1i00000JWa9AEAT

GCP Datalake & Reporting Engineer

Berkshire, England

Job description


We are currently working with an exciting new client who are keen to bring in a GCP Datalake & Reporting Engineer to join their rapidly expanding team.



As a GCP specialist with a proven track record in Google big data services and BigQuery, you will work closely with our development teams, architects, and DevOps engineers.



The team has a key responsibility for creating tools, automations, and accelerators for building and managing Azure environments using code and, where possible, existing Azure services. The overall objective is to integrate GCP native services and data. Day to day this will involve a mix of developing, monitoring, and expanding the Datalake; developing customer reports, insights, and alerts; and extending and integrating insights with Azure-provided information. The focus of this effort will be the effective monitoring and enhancement of SAP-based customer estates and related resources on the GCP platform.
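
To give a flavour of the day-to-day work, the following is a minimal, illustrative sketch of pulling recent SAP operating data out of BigQuery with Google's official Python client. The project, dataset, table, and column names are hypothetical, not part of this role's actual estate.

    from google.cloud import bigquery

    # Hypothetical project/dataset/table names, for illustration only.
    client = bigquery.Client(project="sap-monitoring-demo")

    query = """
        SELECT system_id, metric_name, AVG(metric_value) AS avg_value
        FROM `sap-monitoring-demo.sap_ops.system_metrics`
        WHERE collected_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
        GROUP BY system_id, metric_name
        ORDER BY system_id
    """

    # Run the query and stream the results; each row supports attribute access.
    for row in client.query(query).result():
        print(f"{row.system_id} {row.metric_name}: {row.avg_value:.2f}")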



Day to day, this is some of what you'd be up to:
  • Identify solutions based on GCP technologies to collect and analyse SAP operating data.
  • Create and manage CI/CD pipelines to build and deploy any number of our reports/insights. These pipelines will be modularised and reusable across similar solutions.
  • Contribute towards our library of high-quality documentation, both technical and user-facing.
  • Assist with troubleshooting, maintenance, and routine updates of current solutions.
  • Performance and Cost Management - evaluate existing applications and platforms and recommend ways to enhance performance and reduce operational cost.




Qualifications

About you:

You're probably an architect, an analyst, an automator, or everything in between - that's exactly who we're looking for, and you'll be joining a team of people just like you. SAP experience is a bonus but by no means required.

  • GCP: our intention is to offer services to customers in the same cloud where their resources are located, so strong general expertise here is a must
  • Build and Deployment Tools: Azure DevOps is our platform of choice
  • Infrastructure as Code: Terraform is preferred here, but experience with something like ARM templates is a good alternative
  • OpenSearch: our persistence layer of choice (having migrated from Elasticsearch), which all feeds back into one of several data lakes - see the sketch after this list
  • Google big data services & BigQuery: our system of record for analysis and insights
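
To make the OpenSearch point concrete, here is a minimal, illustrative sketch of shipping one operational document into an index with the opensearch-py client. The host, credentials, index name, and document fields are all hypothetical.

    from opensearchpy import OpenSearch

    # Hypothetical endpoint and credentials, for illustration only.
    client = OpenSearch(
        hosts=[{"host": "search.example.internal", "port": 9200}],
        http_auth=("reporting", "change-me"),
        use_ssl=True,
    )

    # A single SAP operating-data document; field names are made up.
    doc = {
        "system_id": "PRD",
        "metric_name": "dialog_response_ms",
        "metric_value": 412.0,
        "collected_at": "2024-01-01T00:00:00Z",
    }

    # Index the document; from here it can feed the downstream data lakes.
    response = client.index(index="sap-ops-metrics", body=doc)
    print(response["result"])  # e.g. "created"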




Technologies you'll have been exposed to:
  • Analysis experience in SQL or Python
  • Knowledge of Agile methodologies
  • NoSQL and Relational Database Administration (OpenSearch/Elasticsearch and MSSQL)
  • Scripting languages, at least one of PowerShell/Bash/Python
  • Ansible playbook creation and Ansible Tower usage
  • Linux System Administration
  • Git