Your current job search

16 search results

For Contract

Analytics Engineer

England, London, City of London

  • £300 to £350 GBP
  • Engineer Role
  • Skills: R, Python, Azure, GCP, C++, SQL, data governance, data quality, Hadoop, Spark, Power BI, Databricks, data modelling
  • Seniority: Senior

Job description



Position: Analytics Engineer



Location: London (hybrid, three days per week in the office)
Department: IT
Type: 3-month rolling contract
Outside IR35


The client is seeking an experienced Analytics Engineer to join its Data & Analytics team. The role focuses on building scalable data pipelines, transforming raw data into clean, reusable datasets, and enabling data-driven decision-making.


Key Responsibilities

  • Design and build data products, demonstrating proficiency throughout the data lifecycle.
  • Develop robust data models through close collaboration with business users and the engineering team.
  • Partner with senior management, operational leads, and other stakeholders, coaching and supporting a data-driven culture, including KPI definition and reporting frameworks.
  • Take ownership of data extraction and the transformation of JSON and XML, drawing on strong metadata management experience (a minimal sketch follows this list).
  • Collaborate with data engineers to develop datasets, enrich product design, and integrate data for predictive models and machine learning.
  • Deliver well-defined, transformed, tested, documented, and code-reviewed datasets for analysis.
  • Evaluate and recommend improvements in data flow, influencing and supporting architects and engineers.
  • Work independently and manage multiple data tasks in a fast-paced environment.
  • Create and maintain dashboards, visualisations, and reports using Power BI to enable data-driven decision-making.
  • Ensure data quality and accuracy by implementing data validation, monitoring, and error-handling processes.
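
For illustration only, here is a minimal PySpark sketch of the kind of work described above: flattening raw JSON into a clean, reusable dataset and quarantining rows that fail a simple validation rule. The paths, column names, and rules are hypothetical, not details from the client.

```python
# Minimal sketch: flatten raw JSON into a clean dataset and apply a basic
# data-quality check. Paths, columns, and validation rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleanup").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")  # hypothetical landing path

clean = (
    raw.select(
        F.col("order.id").alias("order_id"),  # flatten nested JSON fields
        F.col("order.customer_id").alias("customer_id"),
        F.to_date(F.col("order.created_at")).alias("order_date"),
        F.col("order.total_amount").cast("double").alias("total_amount"),
    )
    .dropDuplicates(["order_id"])
)

# Quarantine rows that fail simple quality rules; publish the rest.
valid = clean.filter(F.col("order_id").isNotNull() & (F.col("total_amount") >= 0))
rejected = clean.subtract(valid)

valid.write.format("delta").mode("overwrite").save("/mnt/curated/orders/")
rejected.write.format("delta").mode("append").save("/mnt/quarantine/orders/")
```

On Databricks the Delta format is available by default; on plain Spark it would need the delta-spark package configured.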



Requirements
  • At least 8 years' experience in data analytics, data engineering, software engineering, or a similar role.
  • Expertise in developing best practices for data engineering and analytics, with a strong background in data modelling, including indexing strategies.
  • Strong SQL skills for data extraction and manipulation, and proficiency in data modelling and building data products in Databricks.
  • Well-developed Azure experience for data storage and processing; candidates with experience of alternative cloud providers such as AWS and GCP will also be considered.
  • Substantial programming ability in languages such as R, Python, and C++ for data manipulation and scripting.
  • Solid understanding of relevant data governance, data quality, and data security best practices.
  • Strong problem-solving skills, and the ability to think critically and analytically.
  • Extensive experience producing documentation and data dictionaries.
  • Knowledge of big data technologies and distributed computing frameworks such as Hadoop and Spark.
  • Excellent communication skills to collaborate effectively with cross-functional teams and present insights to business stakeholders.


Please send me a copy of your CV if you're interested.

Data Scientist - Contract

England, London, City of London

  • £350 to £400 GBP
  • Engineer Role
  • Skills: data science, Gen AI, generative AI, Gemini, SDLC, Azure, GCP, Agile, UAT, Python
  • Seniority: Mid-level

Job description





Data Scientist - Contract


This position involves developing AI and data science solutions to address key business challenges in the insurance sector. The work includes close collaboration with underwriters, product managers, and other stakeholders to ensure meaningful business impact. Projects may include building a digital high net worth insurance product or a portfolio analysis tool.

Key Responsibilities
* Participate in all phases of the software development lifecycle
* Design and document solutions following internal delivery frameworks
* Adhere to coding, testing, and design standards and best practices
* Collaborate with testing teams during product testing and user acceptance
* Support deployment activities across environments
* Provide regular updates on progress, risks, and issues
* Troubleshoot and resolve operational issues promptly
* Pursue certifications to enhance software engineering skills

Candidate Profile
* Familiarity with the insurance industry
* Proficiency in Python for traditional data science
* Experience with generative AI tools, especially Gemini (see the sketch after this list)
* Knowledge of cloud platforms such as Azure or GCP
* Ability to interpret technical design documentation
* Strong communication skills across disciplines
* Experience working in Agile environments
* Effective time management and prioritisation
* Strong analytical and problem-solving capabilities
* Comfortable working in fast-paced, ambiguous settings
* Holds a degree in Information Technology or Computer Science
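
As a purely illustrative sketch of the "traditional data science plus Gemini" profile above, the snippet below calls a Gemini model through the google-generativeai Python SDK; the model name, prompt, and API-key handling are assumptions rather than client specifics.

```python
# Minimal sketch: calling a Gemini model from Python via the google-generativeai
# SDK. Model name, prompt, and the API-key environment variable are assumptions.
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])  # assumed env var

model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name
response = model.generate_content(
    "Summarise the main drivers of loss ratio in a high net worth home book."
)
print(response.text)
```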

London, 2 days a week on-site
Outside IR35
Need someone immediately available

Please send me a copy of your CV if you're interested.

Azure DevOps Engineer

Finland

  • Negotiable
  • Engineer Role
  • Skills: Azure, DevOps
  • Seniority: Mid-level

Job description




I'm supporting a client who is seeking an experienced Finnish-speaking Azure DevOps Engineer with a strong focus on identity, access management, and secure cloud operations within Azure. This role will play a key part in ensuring platforms are scalable, automated, and aligned with security best practices.



Key Responsibilities

  • Design, build, and maintain CI/CD pipelines using Azure DevOps
  • Implement and manage Azure identity and access management, with a strong focus on Azure AD / Entra ID
  • Define and maintain RBAC, service principals, managed identities, and access policies across environments
  • Automate Azure infrastructure using Infrastructure as Code (Terraform preferred; Bicep/ARM acceptable)
  • Support secure application deployments across multiple environments
  • Work closely with engineering and security teams to enforce least-privilege access and governance standards (see the sketch after this list)
  • Monitor, troubleshoot, and improve platform reliability and security
  • Maintain documentation, access models, and operational runbooks
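
To ground the least-privilege point above, here is a minimal Python sketch that flags broad role assignments in an export produced by `az role assignment list`; the field names and the set of roles treated as "broad" are assumptions for illustration.

```python
# Minimal sketch: flag overly broad RBAC assignments from an export created with
# `az role assignment list --all --output json > role_assignments.json`.
# Field names and the roles treated as "broad" are assumptions.
import json

BROAD_ROLES = {"Owner", "Contributor", "User Access Administrator"}  # assumed set

with open("role_assignments.json") as f:  # hypothetical export file
    assignments = json.load(f)

for a in assignments:
    role = a.get("roleDefinitionName", "")
    scope = a.get("scope", "")
    principal = a.get("principalName") or a.get("principalId", "unknown")
    # Broad roles granted above resource-group scope are review candidates.
    if role in BROAD_ROLES and "/resourceGroups/" not in scope:
        print(f"Review: {principal} has {role} at {scope}")
```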




Technical Requirements

  • Strong hands-on experience with Microsoft Azure in production environments
  • Proven experience with Azure DevOps (pipelines, repos, artifacts)
  • Deep understanding of Azure AD / Entra ID, IAM concepts, and access governance
  • Experience implementing RBAC, managed identities, service principals, and conditional access
  • Infrastructure as Code experience (Terraform preferred)
  • Experience with containers and orchestration (Docker, Kubernetes / AKS)
  • Scripting experience (PowerShell, Bash, or similar)
  • Familiarity with monitoring and logging tools (Azure Monitor, Log Analytics, Application Insights)




Nice to Have

  • Experience working in regulated or security-conscious environments
  • Knowledge of Azure Policy, PIM, and access reviews
  • Experience supporting microservices architectures
  • Background working closely with security or platform teams





Analytics Solution Engineer

England, London, City of London

  • £400 to £450 GBP
  • Engineer Role
  • Skills: Azure Databricks, Git / GitHub, DevOps & Cloud Operations, Azure Functions, Data Streaming, PySpark, Python, SQL, Automation & Public Cloud, Cloud Infrastructure, Terraform, Palantir Foundry, Power BI
  • Seniority: Senior

Job description



Analytics Solution Engineer - Contract Opportunity

Remote
Inside IR35
6 Months

Responsibilities
* Managing and resolving platform issues, alerts, and incidents
* Maintaining platform artefacts including secrets, clusters, and release components across Palantir Foundry and Azure Databricks
* Supporting Data Engineers with pipeline creation and platform usage
* Implementing Terraform modules, performing health checks, managing secrets, and overseeing release processes
* Conducting GitHub peer reviews and taking ownership of software components
* Ensuring reliability and performance of Databricks and Palantir Foundry platforms (see the sketch after this list)
* Delivering DevOps work to build analytics infrastructure, support redlines, and improve processes
* Performing audit data analysis, data restore operations, immutable backups, networking and access configuration
* Managing PowerBI gateways, VM reservations, Event Hub TUs, serverless compute clusters, and Terraform deployments
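
As a small illustration of the platform health checks and reliability work above, a sketch that polls the Databricks Clusters API with plain requests; the workspace URL, token handling, and the states treated as healthy are assumptions.

```python
# Minimal sketch: check Databricks cluster states via the Clusters API
# (GET /api/2.0/clusters/list). Workspace URL and token handling are assumed.
import os

import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]  # assumed personal access token env var

resp = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    timeout=30,
)
resp.raise_for_status()

for cluster in resp.json().get("clusters", []):
    name, state = cluster.get("cluster_name"), cluster.get("state")
    # Anything outside RUNNING/TERMINATED (e.g. ERROR) may need attention.
    if state not in {"RUNNING", "TERMINATED"}:
        print(f"Check cluster '{name}': state={state}")
```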

Required Skills
* Azure Databricks
* Git / GitHub
* DevOps & Cloud Operations
* Azure Functions
* Data Streaming
* PySpark, Python, SQL
* Automation & Public Cloud
* Cloud Infrastructure
* Terraform
* Palantir Foundry
* PowerBI
* Software Lifecycle Management

If you meet the requirements, please send me a copy of your CV.

Data Mapping Engineer - Senior Engineer

England, London, City of London

  • £300 to £350 GBP
  • Engineer Role
  • Skills: Maximo, DataStage, IBM Maximo Asset Management software, Palantir, XML tooling, XSLT, DataStage Designer, databases, XML, JSON, CSV, EDI, data mapping, data transformation, data integration
  • Seniority: Senior

Job description



Data Mapping Engineer - UK Based (Security Clearance Required)


We are seeking an experienced Data Mapping Engineer to design and implement robust data mapping solutions across diverse systems and formats. This role involves working with structured and semi-structured data, ensuring accurate transformation and integration, and supporting broader data migration efforts. Candidates must have hands-on experience with IBM Maximo Asset Management software and meet UK security clearance requirements.


Key Responsibilities:
* Analyse and interpret source and target data structures (e.g., databases, XML, JSON, CSV, EDI)
* Define and document data mappings, transformation logic, and integration rules (see the sketch after this list)
* Collaborate with data architects, analysts, and subject matter experts to ensure semantic consistency
* Validate mapping outputs and resolve data discrepancies
* Ensure compliance with data quality, privacy, and security standards
* Maintain documentation and metadata for traceability and governance
* Support knowledge sharing and contribute to best practices within the team
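
To illustrate the mapping and transformation work above, a minimal Python sketch that applies a documented source-to-target field mapping to an XML extract and writes the result as CSV; the element names and mapping rules are hypothetical, not taken from the client's Maximo environment.

```python
# Minimal sketch: apply a documented source-to-target field mapping to an XML
# extract and write the result as CSV. Element names and rules are hypothetical.
import csv
import xml.etree.ElementTree as ET

# Source element -> target column, as it might appear in a mapping document.
FIELD_MAP = {
    "assetnum": "asset_id",
    "description": "asset_description",
    "status": "lifecycle_status",
}

tree = ET.parse("maximo_assets.xml")        # hypothetical source extract

rows = []
for asset in tree.getroot().iter("ASSET"):  # hypothetical element name
    row = {}
    for source, target in FIELD_MAP.items():
        node = asset.find(source)
        row[target] = node.text.strip() if node is not None and node.text else ""
    rows.append(row)

with open("assets_mapped.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(FIELD_MAP.values()))
    writer.writeheader()
    writer.writerows(rows)
```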


Essential Requirements:
* UK national with at least five years of residency (required for security clearance)
* Proven experience in data modelling at conceptual and logical levels
* Background in data mapping, transformation, or integration (e.g., technical business analyst or data analyst)
* Proficiency in working with data formats such as XML, JSON, CSV, EDI, XLSX
* Strong communication and collaboration skills
* Fluent in English


Desirable Skills:
* Familiarity with ETL tools or data integration platforms (e.g., IBM DataStage, Palantir)
* Experience with documentation tools and specifications (e.g., Excel-based mapping documents, BPMN, UML)
* Proficiency in XML tooling, XSLT, and DataStage Designer
* Experience with IBM Maximo Asset Management software

Inside IR35
Fully Remote


This is a UK-based role requiring security clearance. Applicants must meet the nationality and five-year residency criteria.

Please send me a copy of your CV if you're interested

Azure Databricks Engineer - Contract

England, London

  • £450 to £500 GBP
  • Engineer Role
  • Skills: Azure, Databricks, Python, SQL, data quality
  • Seniority: Senior

Job description



Azure Data Engineer (Databricks Specialist)


About the Role

We are seeking an experienced Azure Data Engineer with a strong focus on Databricks to join our team on a 3‑month contract. This role is fully remote and outside IR35, offering flexibility and competitive day rates. You will play a key role in designing, building, and optimising scalable data solutions in Azure, with Databricks at the core of the project.


Responsibilities
* Design and implement data pipelines using Azure Databricks (see the sketch after this list).
* Collaborate with developers, analysts, and DevOps teams to ensure smooth data integration.
* Optimise performance of large‑scale data processing workloads.
* Work with stakeholders to understand application state and data requirements.
* Ensure best practices in data governance, security, and compliance.
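
By way of illustration of the pipeline work above, a minimal sketch of an incremental load step: merging a daily increment into a Delta table. The paths, join key, and the use of MERGE are assumptions about the design rather than details of this project.

```python
# Minimal sketch: upsert a daily increment into a curated Delta table using
# MERGE. Paths and the join key are assumptions; on Databricks, delta.tables
# is available out of the box.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

increment = spark.read.parquet("/mnt/raw/transactions/2024-01-01/")  # assumed path

target = DeltaTable.forPath(spark, "/mnt/curated/transactions/")

(
    target.alias("t")
    .merge(increment.alias("s"), "t.transaction_id = s.transaction_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```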


Key Skills & Experience
* Proven experience as an Azure Data Engineer.
* Strong hands-on expertise with Databricks, with 5+ years' experience (PySpark, notebooks, clusters, Delta Lake).
* Solid knowledge of Azure services (Data Lake, Synapse, Data Factory, Event Hub).
* Experience working with DevOps teams and CI/CD pipelines.
* Ability to handle environments with large user bases (e.g., 300+ internal users).
* Excellent communication skills and ability to work independently in a remote setting.


Contract Details
* Length: 3 months (then rolling)
* Rate: £450 to £500 per day
* Status: Outside IR35
* Location: Fully Remote

AI Evangelist

England, London, City of London

  • £850 to £950 GBP
  • Engineer Role
  • Skills: RAG, LangGraph, AI, C++, Java, C#, Python, SQL, DevOps, TypeScript
  • Seniority: Senior

Job description



Role: AI Evangelist (Hands-On)


A senior technical and advocacy role focused on bridging advanced AI technologies with practical business needs in a financial organisation. The position combines hands-on development with stakeholder education and strategic influence.


Core Responsibilities

  • Build and demonstrate AI-powered solutions for financial applications (e.g., trading, investment banking, insurance).
  • Translate complex AI concepts into business value for technical and non-technical stakeholders.
  • Conduct workshops and training to promote AI literacy and upskill teams.
  • Author technical blogs, white papers, and internal documentation.
  • Advise senior leadership on AI strategies and compliance.
  • Represent the organisation at industry events and forums.
  • Ensure AI solutions meet regulatory and ethical standards.
  • Prototype and deploy AI models for forecasting, customer insights, underwriting, and fraud detection.



Technical Responsibilities

  • Design and manage agentic AI architectures and generative code systems.
  • Oversee code reviews and maintain standards for AI-assisted development.
  • Develop testing and validation protocols for AI-generated code.
  • Mentor technical teams and establish best practices.
  • Collaborate with cross-functional teams for safe AI adoption.



Qualifications

  • Degree in Computer Science, Data Science, Finance, or related field.
  • Strong programming skills (Python, SQL, C++, Java, etc.).
  • Experience with AI/ML frameworks (TensorFlow, PyTorch).
  • 4+ years in AI roles within finance or consulting.
  • Knowledge of AI ethics, compliance, and data privacy.
  • Excellent communication and stakeholder engagement skills.



Desired Traits

  • Passion for innovation in regulated environments.
  • Strong problem-solving and technical storytelling abilities.
  • Commitment to ethical AI deployment.



Key Skills

  • Advanced Python and backend languages.
  • Expertise in prompt engineering, fine-tuning, RAG, and agentic design (see the sketch after this list).
  • Familiarity with AI tools (GitHub Copilot, ChatGPT, SonarQube, LangChain, etc.).
  • Understanding of observability, security, and compliance in AI systems.
  • Experience with containerization and cloud deployments.
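
As a hands-on illustration of the RAG expertise listed above, a minimal sketch of the retrieve-augment-generate loop: TF-IDF retrieval stands in for an embedding model and vector store, and the final LLM call is left as a placeholder; the documents and question are invented.

```python
# Minimal RAG sketch: retrieve the most relevant document for a question, then
# build a grounded prompt. TF-IDF stands in for an embedding model / vector
# store; documents, question, and the downstream LLM call are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Underwriting guidelines for high net worth home insurance.",
    "Fraud detection red flags in motor claims handling.",
    "FX forecasting methodology used by the trading desk.",
]
question = "What signals indicate potential fraud in a claim?"

# Retrieve: rank documents by similarity to the question.
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)
scores = cosine_similarity(vectorizer.transform([question]), doc_vectors)[0]
context = documents[scores.argmax()]

# Augment: constrain the model to the retrieved context.
prompt = (
    "Answer using only the context below.\n"
    f"Context: {context}\n"
    f"Question: {question}"
)

print(prompt)  # Generate: this prompt would then be sent to the chosen LLM.
```
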
5 days a week on-site in the London office (Blackfriars)
Inside IR35
12-month contract


Please send me a copy of your CV if you're interested


Freelance Azure Data Engineer / Architect

Finland, Helsinki

  • Negotiable
  • Engineer Role
  • Skills: Azure, Data, Databricks
  • Seniority: Mid-level

Job description




My customer is seeking an experienced Freelance Azure Data Engineer with Data Architecture experience to design, build, and optimise enterprise-scale data platforms on Microsoft Azure.

This role is hands-on but requires strong architectural ownership - you'll be responsible for shaping data solutions end to end, from ingestion through to analytics and consumption.

Fluency in Finnish is mandatory.



Key Responsibilities

  • Design and implement scalable Azure-based data architectures
  • Build and maintain reliable data pipelines (batch and streaming) - see the sketch after this list
  • Lead architectural decisions across data ingestion, storage, and analytics
  • Develop data models optimised for performance and analytics use cases
  • Ensure best practices around data governance, security, and compliance
  • Optimise cost, performance, and reliability of Azure data platforms
  • Collaborate closely with engineers, analysts, and business stakeholders
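
To make the batch-and-streaming point above concrete, a minimal streaming ingestion sketch using Databricks Auto Loader to pick up newly landed JSON files from a lake path and append them to a Delta table; the storage account, container, paths, and trigger choice are assumptions.

```python
# Minimal sketch: incrementally ingest newly landed JSON files into a Delta
# table with Databricks Auto Loader. Storage account, paths, and trigger
# settings are assumptions for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream.format("cloudFiles")  # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/mnt/meta/events_schema/")
    .load("abfss://raw@examplelake.dfs.core.windows.net/events/")
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/meta/events_checkpoint/")
    .outputMode("append")
    .trigger(availableNow=True)  # process what is available, then stop
    .start("/mnt/curated/events/")
    .awaitTermination()
)
```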




Required Experience

  • Strong background as a Data Engineer with architect-level responsibility
  • Proven experience designing and delivering data platforms on Microsoft Azure
  • Hands-on experience with:
    • Azure Data Factory
    • Azure Synapse Analytics
    • Azure Databricks
    • Azure Data Lake (Gen2)
  • Strong SQL and data modelling expertise
  • Experience with CI/CD and infrastructure-as-code in Azure
  • Ability to work independently in a freelance/contract environment




Nice to Have

  • Streaming experience (Azure Event Hubs, Kafka)
  • Azure Purview / Microsoft Fabric experience
  • Terraform or Bicep
  • Experience in large enterprise or regulated environments