Senior Data Engineer

Posted 2026-05-06
Remote, USA · Full-time · Immediate Start

Job Title: Senior Data Engineer
Experience: 7–9 Years
Location: Remote
Notice Period: Immediate Joiners Only

About the Engagement

We are looking for experienced Senior Data Engineers to join a large-scale, multi-year Data Mapping and DataOps Platform modernisation programme for a major government ministry.

This strategic initiative focuses on transforming how data is managed, governed, and leveraged as a core enterprise asset.

Key Focus Areas

  • Data Engineering: Designing, building, and optimising scalable data pipelines to enhance stability, traceability, and refresh frequency

  • Data Mapping & Metadata Capture: Onboarding critical financial systems into a centralised enterprise data catalogue and supporting governance initiatives

Work will be delivered in an Agile/Scrum model, in close collaboration with data product managers and cross-functional teams.

Platform Overview

The platform is built on Microsoft Azure and Microsoft Fabric, with an existing in-house Finance Data Catalogue (FDC).

This role will contribute to scaling the platform into a production-grade DataOps ecosystem.

Key Responsibilities

  • Design, build, optimise, and maintain scalable data pipelines

  • Develop and manage ELT pipelines, orchestration, and automation using Python

  • Capture and onboard metadata into enterprise data catalogues

  • Perform data analysis to identify patterns, anomalies, and quality improvements

  • Contribute to data product design, including data models and structures

  • Design and implement data models for analytics and reporting

  • Build and maintain CI/CD pipelines for automated deployment

  • Develop automation scripts using Python, Bash, and PowerShell

  • Develop and maintain RESTful APIs for integration and interoperability

  • Implement data quality checks and validation frameworks

  • Set up monitoring, alerting, and performance tracking

  • Develop dashboards and reporting solutions using Microsoft Fabric & Power BI

  • Collaborate with stakeholders to translate business needs into technical solutions

  • Ensure compliance with data governance, privacy, and security standards

  • Maintain technical documentation and communicate with both technical and business teams

Mandatory Requirements

Candidates must have recent (within the last 48 months) hands-on production experience in:

  • Python (data pipelines, scripting, automation)

  • SQL (data transformation, modelling)

  • Microsoft Azure & Microsoft Fabric

  • ELT Pipeline Development

  • Git (version control & collaboration)

  • Metadata Management & Data Catalogue onboarding

  • Data Product Design

  • Agile/Scrum methodologies

  • Data Analysis & Data Quality Assessment

Technical Skills Required

Cloud & Platform

  • Microsoft Azure

  • Microsoft Fabric

Data Engineering

  • Python, SQL

  • Bash, PowerShell

Pipelines & DevOps

  • ELT pipelines

  • CI/CD pipelines

Data Modelling

  • Schema design

  • Metadata management

  • Data product design

API Development

  • RESTful APIs

Analytics & Reporting

  • Power BI

  • Microsoft Fabric Reporting

Version Control

  • Git, GitHub

Methodology

  • Agile / Scrum

Nice to Have

  • Advanced Power BI (DAX, dashboarding, modelling)

  • Data visualisation beyond standard reporting

  • Experience with REST API integrations

  • Exposure to:

    • Azure Synapse Analytics

    • Azure Data Lake

    • Azure DevOps

  • Experience in government/public sector or regulated environments

  • Familiarity with data governance frameworks

  • Experience in financial systems modernisation programmes