Your current job search

16 search results

For Contract

Analytics Engineer

England, London, City of London

  • £300 to £350 GBP
  • Engineer Role
  • Skills: r, python, azure, gcp, c++, sql, data governance, data quality, hadoop, spark, power bi, databricks, data modelling
  • Seniority: Senior

Job description



Position: Analytics Engineer



Location: London (hybrid, 3 days per week on-site)
Department: IT
Type: 3 months rolling contract
Outside IR35


The client is seeking an experienced Analytics Engineer to join its Data & Analytics team. The role focuses on building scalable data pipelines, transforming raw data into clean, reusable datasets, and enabling data-driven decision-making.


Key Responsibilities

  • Design and build data products, with proficiency throughout the data lifecycle.
  • Develop robust data models through close collaboration with business users and the engineering team.
  • Partner with senior management, operational leads, and other stakeholders, coaching and supporting a data-driven culture, including KPI definition and reporting frameworks.
  • Take accountability for data extraction and for transforming JSON and XML, drawing on strong experience in metadata management.
  • Collaborate with data engineers to develop datasets, enrich product design, and integrate data for predictive models or machine learning.
  • Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis.
  • Evaluate and recommend improvements in data flow, influencing and supporting architects and engineers.
  • Work independently and manage multiple data tasks in a fast-paced environment.
  • Create and maintain dashboards, visualizations, and reports using Power BI to enable data-driven decision-making.
  • Ensure data quality and accuracy by implementing data validation, monitoring, and error-handling processes.



Requirements
  • At least 8 years' experience in data analytics, data engineering, software engineering, or a similar role
  • Expertise in developing best practices for data engineering and analytics with a strong background in data modelling including indexing strategies.
  • Strong ability in SQL for data extraction and manipulation, and proficiency in data modelling and data product building in Databricks.
  • Proven Azure cloud experience for data storage and processing, with consideration for alternative cloud providers such as AWS and GCP.
  • Substantial programming ability using languages/tools such as R, Python and C++ for data manipulation and scripting
  • Solid understanding of relevant data governance, data quality, and data security best practices
  • Strong problem-solving skills, and the ability to think critically and analytically
  • Strong experience producing documentation and data dictionaries
  • Knowledge of big data technologies and distributed computing frameworks such as Hadoop and Spark
  • Excellent communication skills to effectively collaborate with cross-functional teams and present insights to business stakeholders


Please send me a copy of your CV if you're interested.

Data Scientist - Contract

England, London, City of London

  • £350 to £400 GBP
  • Engineer Role
  • Skills: data science, Gen ai, Generative AI, Gemini, sdlc, azure, gcp, agile, uat, python
  • Seniority: Mid-level

Job description





Data Scientist - Contract


This position involves developing AI and data science solutions to address key business challenges in the insurance sector. The work includes close collaboration with underwriters, product managers, and other stakeholders to ensure meaningful business impact. Projects may include building a digital high net worth insurance product or a portfolio analysis tool.

Key Responsibilities
* Participate in all phases of the software development lifecycle
* Design and document solutions following internal delivery frameworks
* Adhere to coding, testing, and design standards and best practices
* Collaborate with testing teams during product testing and user acceptance
* Support deployment activities across environments
* Provide regular updates on progress, risks, and issues
* Troubleshoot and resolve operational issues promptly
* Pursue certifications to enhance software engineering skills

Candidate Profile
* Familiarity with the insurance industry
* Proficiency in Python for traditional data science
* Experience with generative AI tools, especially Gemini
* Knowledge of cloud platforms such as Azure or GCP
* Ability to interpret technical design documentation
* Strong communication skills across disciplines
* Experience working in Agile environments
* Effective time management and prioritisation
* Strong analytical and problem-solving capabilities
* Comfortable working in fast-paced, ambiguous settings
* Holds a degree in Information Technology or Computer Science

London 2 days a week
Outside IR35
Need someone immediately available

Please send me a copy of your CV if you're interested.

Data Product Owner

England, Tyne and Wear, Newcastle upon Tyne

  • £400 to £450 GBP
  • Engineer Role
  • Skills: sla
  • Seniority: Senior

Job description



Data Product Owner

Outside IR35
  • Start date: ASAP
  • Contract Length: 6 months
  • Location: Newcastle Upon Tyne 1 day a week
  • SC clearance preferred; BPSS accepted with a view to obtaining clearance


The Data Product Owner is accountable for maximising the business value of one or more data products, treating data as a product with clear consumers, measurable value, and defined service levels.

They act as the primary bridge between business stakeholders and delivery teams, owning the product vision, backlog prioritisation, and ongoing performance of the data product across its lifecycle. Each data product has a clearly identified owner who is commercially and operationally responsible for its success.

Key Responsibilities

Product Vision & Strategy
  • Define and communicate a clear product vision and roadmap for assigned data products, aligned to business outcomes and strategic priorities.
  • Ensure each data product is valuable, usable, feasible, and viable for its intended consumers.
  • Act as the accountable owner for the success (or failure) of the data product in delivering measurable business value.


Stakeholder Management
  • Serve as the primary point of contact for business stakeholders, data consumers, and delivery teams.
  • Translate business needs into clear, prioritised product requirements and outcomes.
  • Balance competing stakeholder demands while maintaining focus on agreed product goals.


Backlog Ownership & Prioritisation
  • Own and prioritise the product backlog, ensuring delivery teams are always working on the highest-value items.
  • Define user stories and acceptance criteria with a strong focus on consumer value and usability.
  • Make informed trade-offs between scope, quality, time, and cost.


Data Product Delivery Agreements & SLAs
  • Define, own, and maintain Data Product Delivery Agreements, including SLAs covering quality, timeliness, availability, and reliability.
  • Agree and track non-functional requirements such as refresh frequency, retention, and recovery expectations.
  • Ensure consumers clearly understand intended use, limitations, and notice-of-change commitments.


Value Measurement & Performance Monitoring
  • Define success metrics and value measures for each data product, including usage, engagement, and business impact.
  • Monitor product performance and data quality, initiating corrective action where standards are not met.
  • Regularly review product value and make evidence-based decisions on enhancement, maintenance, or retirement.


Data Quality, Ethics & Compliance
  • Ensure data products meet agreed data quality thresholds and are trustworthy for decision-making.
  • Embed regulatory, privacy, and ethical considerations into product design and operation.
  • Act as a steward for responsible data use within the product domain.


Cross-Functional Leadership
  • Work closely with engineers, analytics engineers, platform teams, and UX to deliver high-quality data products.
  • Champion a "data as a product" mindset across business and technical teams.
  • Contribute to the evolution of product management and data product practices across the organisation.


Required Experience & Skills

Essential
  • Proven experience as a Product Owner, Product Manager, or similar role in a data, analytics, or platform environment.
  • Strong understanding of data and analytics concepts (e.g. data products, data quality, analytics use cases).
  • Experience owning and prioritising backlogs and working with cross-functional delivery teams.
  • Excellent stakeholder management and communication skills.
  • Ability to define and measure business value.


Desirable
  • Experience working with data platforms, analytics products, or data mesh-style ownership models.
  • Familiarity with defining SLAs, service promises, or delivery agreements for data products.
  • Experience operating in regulated or ethically sensitive data environments.
  • Exposure to Agile and product management frameworks.


Please send me a copy of your CV if you're interested.

Azure DevOps Engineer

Finland

  • Negotiable
  • Engineer Role
  • Skills: Azure, DevOps
  • Seniority: Mid-level

Job description




I'm supporting a client who is seeking an experienced Finnish-speaking Azure DevOps Engineer with a strong focus on identity, access management, and secure cloud operations within Azure. This role will play a key part in ensuring platforms are scalable, automated, and aligned with security best practices.



Key Responsibilities



  • Design, build, and maintain CI/CD pipelines using Azure DevOps
  • Implement and manage Azure identity and access management, with a strong focus on Azure AD / Entra ID
  • Define and maintain RBAC, service principals, managed identities, and access policies across environments
  • Automate Azure infrastructure using Infrastructure as Code (Terraform preferred; Bicep/ARM acceptable)
  • Support secure application deployments across multiple environments
  • Work closely with engineering and security teams to enforce least-privilege access and governance standards
  • Monitor, troubleshoot, and improve platform reliability and security
  • Maintain documentation, access models, and operational runbooks




Technical Requirements



  • Strong hands-on experience with Microsoft Azure in production environments
  • Proven experience with Azure DevOps (pipelines, repos, artifacts)
  • Deep understanding of Azure AD / Entra ID, IAM concepts, and access governance
  • Experience implementing RBAC, managed identities, service principals, and conditional access
  • Infrastructure as Code experience (Terraform preferred)
  • Experience with containers and orchestration (Docker, Kubernetes / AKS)
  • Scripting experience (PowerShell, Bash, or similar)
  • Familiarity with monitoring and logging tools (Azure Monitor, Log Analytics, Application Insights)




Nice to Have



  • Experience working in regulated or security-conscious environments
  • Knowledge of Azure Policy, PIM, and access reviews
  • Experience supporting microservices architectures
  • Background working closely with security or platform teams




Analytics Solution Engineer

England, London, City of London

  • £400 to £450 GBP
  • Engineer Role
  • Skills: Azure Databricks, Git / GitHub, DevOps & Cloud Operations, Azure Functions, Data Streaming, PySpark, Python, SQL, Automation & Public Cloud, Cloud Infrastructure, Terraform, Palantir Foundry, PowerBI
  • Seniority: Senior

Job description



Analytics Solution Engineer- Contract Opportunity

Remote
Inside IR35
6 Months

Responsibilities
* Managing and resolving platform issues, alerts, and incidents
* Maintaining platform artefacts including secrets, clusters, and release components across Palantir Foundry and Azure Databricks
* Supporting Data Engineers with pipeline creation and platform usage
* Implementing Terraform modules, performing health checks, managing secrets, and overseeing release processes
* Conducting GitHub peer reviews and taking ownership of software components
* Ensuring reliability and performance of Databricks and Palantir Foundry platforms
* Delivering DevOps work to build analytics infrastructure, support redlines, and improve processes
* Performing audit data analysis, data restore operations, immutable backups, networking and access configuration
* Managing PowerBI gateways, VM reservations, Event Hub TUs, serverless compute clusters, and Terraform deployments

Required Skills
* Azure Databricks
* Git / GitHub
* DevOps & Cloud Operations
* Azure Functions
* Data Streaming
* PySpark, Python, SQL
* Automation & Public Cloud
* Cloud Infrastructure
* Terraform
* Palantir Foundry
* PowerBI
* Software Lifecycle Management

If you meet the requirements, please send me a copy of your CV.

Azure Databricks Engineer - Contract

England, London

  • £450 to £500 GBP
  • Engineer Role
  • Skills: Azure, Databricks, python, sql, data quality
  • Seniority: Senior

Job description



Azure Data Engineer (Databricks Specialist)


About the Role

We are seeking an experienced Azure Data Engineer with a strong focus on Databricks to join our team on a 3‑month contract. This role is fully remote and outside IR35, offering flexibility and competitive day rates. You will play a key role in designing, building, and optimising scalable data solutions in Azure, with Databricks at the core of the project.


Responsibilities
* Design and implement data pipelines using Azure Databricks.
* Collaborate with developers, analysts, and DevOps teams to ensure smooth data integration.
* Optimise performance of large‑scale data processing workloads.
* Work with stakeholders to understand application state and data requirements.
* Ensure best practices in data governance, security, and compliance.


Key Skills & Experience
* Proven experience as an Azure Data Engineer.
* Strong hands‑on expertise with Databricks - 5+ years' experience (PySpark, notebooks, clusters, Delta Lake).
* Solid knowledge of Azure services (Data Lake, Synapse, Data Factory, Event Hub).
* Experience working with DevOps teams and CI/CD pipelines.
* Ability to handle environments with large user bases (e.g., 300+ internal users).
* Excellent communication skills and ability to work independently in a remote setting.


Contract Details
* Length: 3 months (then rolling)
* Rate: £450-£500 per day (Outside IR35)
* Status: Outside IR35
* Location: Fully Remote

AI Evangelist

England, London, City of London

  • £850 to £950 GBP
  • Engineer Role
  • Skills: RAG, LangGraph, ai, C++, Java, C#, Python, SQL, DevOps, typescript
  • Seniority: Senior

Job description



Role: AI Evangelist (Hands-On)


A senior technical and advocacy role focused on bridging advanced AI technologies with practical business needs in a financial organisation. The position combines hands-on development with stakeholder education and strategic influence.


Core Responsibilities

  • Build and demonstrate AI-powered solutions for financial applications (e.g., trading, investment banking, insurance).
  • Translate complex AI concepts into business value for technical and non-technical stakeholders.
  • Conduct workshops and training to promote AI literacy and upskill teams.
  • Author technical blogs, white papers, and internal documentation.
  • Advise senior leadership on AI strategies and compliance.
  • Represent the organisation at industry events and forums.
  • Ensure AI solutions meet regulatory and ethical standards.
  • Prototype and deploy AI models for forecasting, customer insights, underwriting, and fraud detection.



Technical Responsibilities

  • Design and manage agentic AI architectures and generative code systems.
  • Oversee code reviews and maintain standards for AI-assisted development.
  • Develop testing and validation protocols for AI-generated code.
  • Mentor technical teams and establish best practices.
  • Collaborate with cross-functional teams for safe AI adoption.



Qualifications

  • Degree in Computer Science, Data Science, Finance, or related field.
  • Strong programming skills (Python, SQL, C++, Java, etc.).
  • Experience with AI/ML frameworks (TensorFlow, PyTorch).
  • 4+ years in AI roles within finance or consulting.
  • Knowledge of AI ethics, compliance, and data privacy.
  • Excellent communication and stakeholder engagement skills.



Desired Traits

  • Passion for innovation in regulated environments.
  • Strong problem-solving and technical storytelling abilities.
  • Commitment to ethical AI deployment.



Key Skills

  • Advanced Python and backend languages.
  • Expertise in prompt engineering, fine-tuning, RAG, and agentic design.
  • Familiarity with AI tools (GitHub Copilot, ChatGPT, SonarQube, LangChain, etc.).
  • Understanding of observability, security, and compliance in AI systems.
  • Experience with containerization and cloud deployments.

5 days on-site in the London office (Blackfriars)
Inside IR35
12-month contract


Please send me a copy of your CV if you're interested


Freelance Azure Data Engineer / Architect

Finland, Helsinki

  • Negotiable
  • Engineer Role
  • Skills: Azure, Data, Databricks
  • Seniority: Mid-level

Job description




My customer is seeking an experienced Freelance Azure Data Engineer with Data Architecture experience to design, build, and optimise enterprise-scale data platforms on Microsoft Azure.

This role is hands-on but requires strong architectural ownership - you'll be responsible for shaping data solutions end to end, from ingestion through to analytics and consumption.

Finnish-speaking is mandatory.



Key Responsibilities



  • Design and implement scalable Azure-based data architectures
  • Build and maintain reliable data pipelines (batch and streaming)
  • Lead architectural decisions across data ingestion, storage, and analytics
  • Develop data models optimised for performance and analytics use cases
  • Ensure best practices around data governance, security, and compliance
  • Optimise cost, performance, and reliability of Azure data platforms
  • Collaborate closely with engineers, analysts, and business stakeholders




Required Experience



  • Strong background as a Data Engineer with architect-level responsibility
  • Proven experience designing and delivering data platforms on Microsoft Azure
  • Hands-on experience with:
    • Azure Data Factory
    • Azure Synapse Analytics
    • Azure Databricks
    • Azure Data Lake (Gen2)
  • Strong SQL and data modelling expertise
  • Experience with CI/CD and infrastructure-as-code in Azure
  • Ability to work independently in a freelance/contract environment




Nice to Have



  • Streaming experience (Azure Event Hubs, Kafka)
  • Azure Purview / Microsoft Fabric experience
  • Terraform or Bicep
  • Experience in large enterprise or regulated environments