Your current job search

15 search results

For Contract

Analytics Engineer

England, London, City of London

  • £300 to £350 GBP
  • Engineer Role
  • Skills: r, python, azure, gcp, c++, sql, data governance, data quality, hadoop, spark, power bi, databricks, data modelling
  • Seniority: Senior

Job description



Position: Analytics Engineer



Location: London (hybrid 3 times a week)
Department: IT
Type: 3 months rolling contract
Outside IR35


The client is seeking an experienced Analytics Engineer to join its Data & Analytics team. The role focuses on building scalable data pipelines, transforming raw data into clean, reusable datasets, and enabling data-driven decision-making.


Key Responsibilities

  • Design and build data products, with proficiency throughout the data lifecycle.
  • Develop robust data models through close collaboration with business users and the engineering team.
  • Partner with senior management, operational leads, and other stakeholders, coaching and supporting a data-driven culture, including KPI definition and reporting frameworks.
  • Accountable for data extraction and the transformation of JSON and XML, with strong experience in metadata management.
  • Collaborate with data engineers to develop datasets, enrich product design, and integrate data for predictive models and machine learning.
  • Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis.
  • Evaluate and recommend improvements in data flow, influencing and supporting architects and engineers.
  • Work independently and manage multiple data tasks in a fast-paced environment.
  • Create and maintain dashboards, visualizations, and reports using Power BI to enable data-driven decision-making.
  • Ensure data quality and accuracy by implementing data validation, monitoring, and error-handling processes.
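As a flavour of the JSON transformation and validation work the bullets above describe, here is a minimal, stdlib-only Python sketch; the field names and validation rules are purely illustrative, not drawn from the client's systems:

```python
import json

# Hypothetical raw record -- the schema below is illustrative only.
RAW = '{"order": {"id": 17, "total": "42.50", "customer": {"email": "a@b.com"}}}'

def flatten(record, prefix=""):
    """Flatten nested JSON into dotted column names, e.g. order.customer.email."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=name + "."))
        else:
            flat[name] = value
    return flat

def validate(row, required=("order.id", "order.total")):
    """Return a list of validation errors rather than failing silently."""
    errors = [f"missing field: {f}" for f in required if f not in row]
    try:
        float(row.get("order.total", ""))
    except ValueError:
        errors.append("order.total is not numeric")
    return errors

row = flatten(json.loads(RAW))
print(row)            # {'order.id': 17, 'order.total': '42.50', 'order.customer.email': 'a@b.com'}
print(validate(row))  # [] -> clean record
```

Returning errors as data, rather than raising on the first failure, is one common way to feed the monitoring and error-handling processes the role calls for.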



Requirements
  • At least 8 years' experience in data analytics, data engineering, software engineering, or a similar role
  • Expertise in developing best practices for data engineering and analytics with a strong background in data modelling including indexing strategies.
  • Strong ability in SQL for data extraction and manipulation, and proficiency in data modelling and data product building in Databricks.
  • Strong Azure cloud experience for data storage and processing, with awareness of alternative cloud providers such as AWS and GCP.
  • Substantial programming ability using languages/tools such as R, Python and C++ for data manipulation and scripting
  • Solid understanding of relevant data governance, data quality, and data security best practices
  • Strong problem-solving skills, and the ability to think critically and analytically
  • Extensive experience producing documentation and data dictionaries
  • Knowledge of big data technologies and distributed computing frameworks such as Hadoop and Spark
  • Excellent communication skills to effectively collaborate with cross-functional teams and present insights to business stakeholders


Please can you send me a copy of your CV if you're interested

Data Scientist - Contract

England, London, City of London

  • £350 to £400 GBP
  • Engineer Role
  • Skills: data science, Gen ai, Generative AI, Gemini, sdlc, azure, gcp, agile, uat, python
  • Seniority: Mid-level

Job description





Data Scientist - Contract


This position involves developing AI and data science solutions to address key business challenges in the insurance sector. The work includes close collaboration with underwriters, product managers, and other stakeholders to ensure meaningful business impact. Projects may include building a digital high net worth insurance product or a portfolio analysis tool.

Key Responsibilities
* Participate in all phases of the software development lifecycle
* Design and document solutions following internal delivery frameworks
* Adhere to coding, testing, and design standards and best practices
* Collaborate with testing teams during product testing and user acceptance
* Support deployment activities across environments
* Provide regular updates on progress, risks, and issues
* Troubleshoot and resolve operational issues promptly
* Pursue certifications to enhance software engineering skills

Candidate Profile
* Familiarity with the insurance industry
* Proficiency in Python for traditional data science
* Experience with generative AI tools, especially Gemini
* Knowledge of cloud platforms such as Azure or GCP
* Ability to interpret technical design documentation
* Strong communication skills across disciplines
* Experience working in Agile environments
* Effective time management and prioritisation
* Strong analytical and problem-solving capabilities
* Comfortable working in fast-paced, ambiguous settings
* Holds a degree in Information Technology or Computer Science

London 2 days a week
Outside IR35
Need someone immediately available

Please send me a copy of your CV if you're interested


Azure DevOps Engineer

Finland

  • Negotiable
  • Engineer Role
  • Skills: Azure, DevOps
  • Seniority: Mid-level

Job description



I'm supporting a client who is seeking an experienced Finnish-speaking Azure DevOps Engineer with a strong focus on identity, access management, and secure cloud operations within Azure. This role will play a key part in ensuring platforms are scalable, automated, and aligned with security best practices.



Key Responsibilities

  • Design, build, and maintain CI/CD pipelines using Azure DevOps
  • Implement and manage Azure identity and access management, with a strong focus on Azure AD / Entra ID
  • Define and maintain RBAC, service principals, managed identities, and access policies across environments
  • Automate Azure infrastructure using Infrastructure as Code (Terraform preferred; Bicep/ARM acceptable)
  • Support secure application deployments across multiple environments
  • Work closely with engineering and security teams to enforce least-privilege access and governance standards
  • Monitor, troubleshoot, and improve platform reliability and security
  • Maintain documentation, access models, and operational runbooks


Technical Requirements

  • Strong hands-on experience with Microsoft Azure in production environments
  • Proven experience with Azure DevOps (pipelines, repos, artifacts)
  • Deep understanding of Azure AD / Entra ID, IAM concepts, and access governance
  • Experience implementing RBAC, managed identities, service principals, and conditional access
  • Infrastructure as Code experience (Terraform preferred)
  • Experience with containers and orchestration (Docker, Kubernetes / AKS)
  • Scripting experience (PowerShell, Bash, or similar)
  • Familiarity with monitoring and logging tools (Azure Monitor, Log Analytics, Application Insights)


Nice to Have

  • Experience working in regulated or security-conscious environments
  • Knowledge of Azure Policy, PIM, and access reviews
  • Experience supporting microservices architectures
  • Background working closely with security or platform teams



Data Platform Engineer - Azure

Finland, Helsinki

  • Negotiable
  • Engineer Role
  • Skills: Azure, Data, Synapse
  • Seniority: Mid-level

Job description



I am looking for a freelance, Finnish-speaking Azure Data Platform Engineer to take ownership of a modern Azure-based data platform, with a focus on standardisation, security, and operational excellence. This role sits at the platform level, enabling multiple data teams to deliver data products quickly, securely, and consistently.



Key Responsibilities

  • Design, build, and evolve a standardised Data Landing Zone / Lakehouse platform in Azure
  • Create reusable frameworks, patterns, and "golden paths" to enable uniform ways of working across data teams
  • Build metadata- and configuration-driven ingestion frameworks to onboard new data sources efficiently
  • Ensure strong security and governance across the platform, including identity and access management (RBAC, managed identities, Key Vault, policies, logging)
  • Establish and maintain CI/CD pipelines, deployment patterns, and environment structures (dev/test/prod, multiple domains)
  • Own platform operability, including monitoring, incident management, and continuous improvement
  • Drive cost transparency and optimisation (FinOps), performance tuning, and resource efficiency
  • Support and enable data teams through documentation, templates, onboarding flows, and technical guidance


Required Experience

  • Strong hands-on experience with Azure data services such as Synapse Analytics, Azure Data Factory, and ADLS Gen2
  • Solid programming skills in Python (and/or C#/.NET) and strong SQL
  • Experience with Infrastructure as Code using Bicep and/or Terraform
  • CI/CD experience with Azure DevOps Pipelines or GitHub Actions
  • Good understanding of Lakehouse architecture, metadata-driven ingestion, and data governance concepts
  • Experience implementing secure, reliable platforms using Azure IAM, managed identities, and network controls
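The metadata-driven ingestion idea mentioned above can be illustrated with a small sketch: a declarative source registry expanded into pipeline steps, so onboarding a new source means adding one config entry rather than new code. The source names, paths, and bronze-layer target convention are all hypothetical:

```python
# Illustrative config-driven ingestion: sources, paths, and targets are made up.
SOURCES = [
    {"name": "sales", "path": "raw/sales/*.json", "format": "json", "keys": ["order_id"]},
    {"name": "stock", "path": "raw/stock/*.csv",  "format": "csv",  "keys": ["sku"]},
]

def build_ingestion_plan(sources):
    """Expand a declarative source registry into concrete pipeline steps."""
    plan = []
    for src in sources:
        plan.append({
            "step": f"ingest_{src['name']}",
            "reader": src["format"],
            "input": src["path"],
            "dedupe_on": src["keys"],
            "target": f"bronze.{src['name']}",
        })
    return plan

for step in build_ingestion_plan(SOURCES):
    print(step["step"], "->", step["target"])
```

In a real platform the plan would typically drive Data Factory or Synapse pipelines; the point of the pattern is that the pipeline code stays generic while the configuration carries the per-source detail.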





AI Engineer

England, London, City of London

  • £250 to £300 GBP
  • Engineer Role
  • Skills: AWS, Bedrock, llm, Claude, lambda, ai, langraph, rag, gpt, sdk
  • Seniority: Senior

Job description



AI Engineer

Required Technical Skills

AWS & Generative AI:
- AWS Bedrock experience (model selection, deployment, prompt engineering)
- Agentic workflow experience ideally based on AWS AgentCore
- Multi-agent orchestration frameworks (AWS Strands Agents, LangGraph, or similar)
- Large Language Model (LLM) integration and fine-tuning
- Experience with Claude, GPT-4, or similar foundation models
- Prompt engineering and chain-of-thought reasoning

RAG & Knowledge Systems:
- Retrieval-Augmented Generation (RAG) pipeline implementation
- Vector database experience (CockroachDB, Pinecone, or similar)
- Embedding model selection and optimisation
- Semantic search and similarity matching
- Context window management and chunking strategies
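As an illustration of the chunking strategies listed above, here is a minimal fixed-size chunker with overlap; the sizes are arbitrary, and production RAG pipelines usually chunk by tokens or sentences rather than raw characters:

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into fixed-size character chunks with overlap, so content
    cut at a chunk boundary still appears whole in the next chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "word " * 100          # 500-character stand-in for a real document
chunks = chunk_text(doc)
print(len(chunks))           # each chunk shares 50 characters with its neighbour
```

The overlap trades a little index size for retrieval quality: a sentence that straddles one boundary is always intact in at least one chunk.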

MCP & Tool Integration:
- Model Context Protocol (MCP) implementation
- Tool calling and function integration with LLMs
- API design for LLM tool interfaces
- AWS Lambda integration with agents

AI Safety & Evaluation:
- Guardrail implementation (hallucination detection, toxicity filtering)
- Response evaluation framework design
- A/B testing for AI systems
- Metrics definition (accuracy, latency, user satisfaction)
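The metrics-definition bullet above can be sketched as a tiny evaluation summary over a batch of scored responses; the record structure and figures are illustrative only:

```python
import statistics

# Illustrative evaluation records -- in practice these would come from an eval harness.
results = [
    {"correct": True,  "latency_ms": 120},
    {"correct": True,  "latency_ms": 340},
    {"correct": False, "latency_ms": 95},
    {"correct": True,  "latency_ms": 210},
]

def summarise(runs):
    """Compute accuracy and median latency over a batch of evaluated responses."""
    accuracy = sum(r["correct"] for r in runs) / len(runs)
    p50 = statistics.median(r["latency_ms"] for r in runs)
    return {"accuracy": accuracy, "latency_p50_ms": p50}

print(summarise(results))   # {'accuracy': 0.75, 'latency_p50_ms': 165.0}
```

A summary like this is the kind of artefact A/B tests compare across prompt or model variants.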

Programming & Development:
- Python (primary) - advanced level
- AWS SDK (boto3)
- Infrastructure as Code awareness (CloudFormation/Terraform)
- Git version control
- CI/CD integration for ML systems

Start 13th Jan

14 week Contract

Fully remote

Please send me your CV if you're interested

Databricks Platform Engineer - Outside IR35 Contract

England, London, City of London

  • £400 to £500 GBP
  • Engineer Role
  • Skills: Azure Databricks, MS Business Intelligence, platform, terraform, devops, databricks, data engineer, devops engineer, github, azure, cloud
  • Seniority: Mid-level

Job description

Databricks Platform Engineer - Outside IR35 Contract



Please note - this role will require you to be based in the UK with the unrestricted right to work in the UK. This organisation is not able to offer sponsorship.

An experienced Platform Engineer is required to support the rollout and maintenance of a large, federated, multi-service Data and AI platform. The role focuses on building secure, scalable, and cost-optimised cloud infrastructure, implementing governance, and enabling advanced data services.

Key Responsibilities

  • Build and maintain Azure infrastructure using Terraform.
  • Implement governance and compliance controls, including Azure Policy.
  • Configure and manage Azure Private Link and Private Link Service.
  • Enable advanced data services with Databricks and Unity Catalog for data governance.
  • Develop CI/CD pipelines using GitHub Actions.
  • Ensure security best practices, including Defense in Depth, BCDR, and high availability.
  • Manage identity and access, including OAuth and federated credentials.
  • Optimise cloud costs and monitor usage effectively.

Essential Skills

  • Proven experience with Terraform for Azure infrastructure as code.
  • Strong knowledge of GitHub Actions and CI/CD principles.
  • Strong experience with Databricks and Unity Catalog for advanced data governance.
  • Hands-on experience with Azure Private Link and Private Link Service.
  • Strong understanding of Azure security best practices and compliance.
  • Experience in identity and access management.
  • Familiarity with Azure cost management strategies.

Additional Details

  • Must be available to start in November.
  • Outside IR35 contract.
  • Home-based.
  • 3-month initial term with potential for extension.
  • Competitive day rate: £400-£500.


To apply for this role please submit your CV or contact David Airey on 0191 338 7508 or at d.airey@tenthrevolution.com. Interviews will begin this week.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.

Data Mapping Engineer - Senior Engineer

England, London, City of London

  • £300 to £350 GBP
  • Engineer Role
  • Skills: maximo, DataStage, IBM Maximo Asset Management Software, Palantir, XML tooling, XSLT, DataStage Designer, databases, XML, JSON, CSV, EDI, data mapping, data transformation, data integration
  • Seniority: Senior

Job description



Data Mapping Engineer - UK Based (Security Clearance Required)


We are seeking an experienced Data Mapping Engineer to design and implement robust data mapping solutions across diverse systems and formats. This role involves working with structured and semi-structured data, ensuring accurate transformation and integration, and supporting broader data migration efforts. Candidates must have hands-on experience with IBM Maximo Asset Management software and meet UK security clearance requirements.


Key Responsibilities:
* Analyse and interpret source and target data structures (e.g., databases, XML, JSON, CSV, EDI)
* Define and document data mappings, transformation logic, and integration rules
* Collaborate with data architects, analysts, and subject matter experts to ensure semantic consistency
* Validate mapping outputs and resolve data discrepancies
* Ensure compliance with data quality, privacy, and security standards
* Maintain documentation and metadata for traceability and governance
* Support knowledge sharing and contribute to best practices within the team
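To illustrate the mapping-and-transformation work described in the responsibilities above, here is a small stdlib-only Python sketch that applies a documented mapping to an XML source. The schema, target column names, and transform rules are hypothetical, standing in for the kind of mapping document the role maintains:

```python
import xml.etree.ElementTree as ET

# Hypothetical source extract -- tags and target columns are illustrative.
SOURCE = """<assets>
  <asset id="A-100"><desc>Pump</desc><site>ldn</site></asset>
  <asset id="A-101"><desc>Valve</desc><site>man</site></asset>
</assets>"""

# The mapping document expressed as data: source field -> (target column, transform rule)
MAPPING = {
    "id":   ("ASSET_NUM",   str.strip),
    "desc": ("DESCRIPTION", str.strip),
    "site": ("SITE_ID",     str.upper),
}

def map_assets(xml_text):
    """Apply the mapping rules to each source record, producing target rows."""
    rows = []
    for asset in ET.fromstring(xml_text).iter("asset"):
        source = {"id": asset.get("id"),
                  "desc": asset.findtext("desc"),
                  "site": asset.findtext("site")}
        rows.append({col: fn(source[field]) for field, (col, fn) in MAPPING.items()})
    return rows

print(map_assets(SOURCE)[0])  # {'ASSET_NUM': 'A-100', 'DESCRIPTION': 'Pump', 'SITE_ID': 'LDN'}
```

Keeping the mapping as data rather than code is what makes the transformation logic documentable and traceable, which is the governance point the listing stresses.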


Essential Requirements:
* UK national with at least five years of residency (required for security clearance)
* Proven experience in data modelling at conceptual and logical levels
* Background in data mapping, transformation, or integration (e.g., technical business analyst or data analyst)
* Proficiency in working with data formats such as XML, JSON, CSV, EDI, XLSX
* Strong communication and collaboration skills
* Fluent in English


Desirable Skills:
* Familiarity with ETL tools or data integration platforms (e.g., IBM DataStage, Palantir)
* Experience with documentation tools and specifications (e.g., Excel-based mapping documents, BPMN, UML)
* Proficiency in XML tooling, XSLT, and DataStage Designer
* Experience with IBM Maximo Asset Management software

Inside IR35
Fully Remote


This is a UK-based role requiring security clearance. Applicants must meet the nationality and five-year residency criteria.

Please send me a copy of your CV if you're interested

Monitoring and Analytics Security Engineer - CGEMJP00312111

England, Cheshire, Knutsford

  • £550 to £600 GBP
  • Engineer Role
  • Skills: GitLab, Kubernetes, Openshift, CI/CD, aws, azure, Cribl, Elastic, Splunk, Fluentd
  • Seniority: Senior

Job description



Monitoring and Analytics Security Engineer (Contract, PAYE via Umbrella)


A cutting-edge engineering team is seeking a skilled Security Engineer to support the development of a telemetry pipeline MVP. This contract role demands deep expertise in containerised environments, observability tooling, and secure infrastructure design. The successful candidate will embed security across the pipeline architecture and collaborate closely with DevOps and development teams.


Key Responsibilities
* Design and implement security controls across Kubernetes and OpenShift environments
* Ensure secure configuration and access management within GitLab and CI/CD pipelines
* Integrate and secure telemetry tools including Cribl, Elastic, Splunk, Fluentd, and Syslog
* Conduct threat modeling, vulnerability assessments, and risk analysis
* Collaborate with DevOps engineers to embed security into infrastructure-as-code workflows
* Monitor and respond to security events from observability platforms
* Maintain documentation of security architecture, policies, and incident response procedures


Required Skills & Experience
* Strong hands-on experience with Kubernetes and OpenShift in secure production environments
* Proficiency in GitLab and secure CI/CD pipeline practices
* Familiarity with telemetry and logging tools: Cribl, Elastic, Splunk, Fluentd, Syslog
* Deep understanding of networking protocols, firewalls, VPNs, and security principles
* Experience with security frameworks (e.g., NIST, ISO 27001) and compliance requirements
* Knowledge of container security tools (e.g., Aqua, Twistlock, Trivy) and vulnerability scanners
* Excellent analytical and communication skills


Preferred Qualifications
* Certifications such as CISSP, CISM, CKS, or equivalent
* Experience in building MVPs or working in startup-like environments
* Familiarity with cloud security across AWS, Azure, or GCP


Additional Details
* Contract engagement must be operated via PAYE through an umbrella company

Hybrid

Inside IR35

Knutsford or Glasgow Hybrid

Please send me a copy of your CV if you're interested

Azure Databricks Engineer - Contract

England, London

  • £450 to £500 GBP
  • Engineer Role
  • Skills: Azure, Databricks, python, sql, data quality
  • Seniority: Senior

Job description



Azure Data Engineer (Databricks Specialist)


About the Role

We are seeking an experienced Azure Data Engineer with a strong focus on Databricks to join our team on a 3‑month contract. This role is fully remote and outside IR35, offering flexibility and competitive day rates. You will play a key role in designing, building, and optimising scalable data solutions in Azure, with Databricks at the core of the project.


Responsibilities
* Design and implement data pipelines using Azure Databricks.
* Collaborate with developers, analysts, and DevOps teams to ensure smooth data integration.
* Optimise performance of large‑scale data processing workloads.
* Work with stakeholders to understand application state and data requirements.
* Ensure best practices in data governance, security, and compliance.


Key Skills & Experience
* Proven experience as an Azure Data Engineer.
* Strong hands‑on expertise with Databricks - 5+ years' experience (PySpark, notebooks, clusters, Delta Lake).
* Solid knowledge of Azure services (Data Lake, Synapse, Data Factory, Event Hub).
* Experience working with DevOps teams and CI/CD pipelines.
* Ability to handle environments with large user bases (e.g., 300+ internal users).
* Excellent communication skills and ability to work independently in a remote setting.


Contract Details
* Length: 3 months (then rolling)
* Rate: £450-£500 per day (Outside IR35)
* Status: Outside IR35
* Location: Fully Remote