Identify, scope and participate in the design and delivery of cloud data platform solutions, including data storage, movement and orchestration, utilising GCP
Design and execute a platform modernisation approach for customers' data environments
Design, coordinate and execute pilots, prototypes and proofs of concept; validate specific scenarios and provide deployment guidance
Collaborate and share best practices/insights with engineering colleagues and the architect community
Travel to client sites as appropriate (when restrictions allow)
Skills & Qualifications:
Spark, Hadoop, GCP (BigQuery/Dataproc), Python and SQL
Articulate communication skills, with the ability to explain complex solutions clearly and concisely to internal or external customers, adjusting style for varied audiences within an organisation.
Exposure to data science modelling, analytics & BI reporting (Tableau, Looker, Periscope, DataStudio).
History of working with data warehouse solutions (on-premises and public cloud).
Analytical and design skills around the full end-to-end ETL lifecycle of data pipelines in large enterprise settings
Practical experience of data curation, cleansing, creation and maintenance of data sets for business purposes
Benefits
25 days' holiday, increasing by 1 day for each year of service up to a maximum of 28 days after 3 years
Holiday buy-back scheme (up to 4 additional days per year)
Work from Anywhere policy: you can work from abroad (subject to government agreements) for up to 30 working days per year
4 x salary life assurance
4% contributory pension, matched by a further 4% from Appsbroker
Opportunity to join our Vitality Health Insurance scheme (Appsbroker covers the premium for you only; other plans are available at additional cost)
Opportunity for flexible working and working from home