Job Opportunity

We are seeking a skilled and experienced Data Pipeline Developer to join our team. This role will focus on building scalable data pipelines, developing API integrations, and ensuring seamless data synchronisation with Microsoft Dynamics 365.

Key Responsibilities:
* Design and implement scalable ETL/ELT pipelines using Azure Data Factory (ADF), including parameterised pipelines and mapping data flows.
* Integrate with Microsoft Dataverse and Dynamics 365 using ADF's native connectors, OData endpoints, and REST APIs.
* Work with cross-functional teams to gather integration requirements and deliver robust solutions.
* Develop and optimise transformation logic, including column reduction, lookups, and upsert operations.
* Implement delta load strategies and batch endpoints to manage high-volume data synchronisation.
* Ensure data consistency and integrity across systems using unique identifiers and business keys.
* Collaborate with business analysts and architects to translate business requirements into technical solutions.

The ideal candidate will have a strong understanding of the following technologies:
* ADF Expertise: Proficient in building pipelines, triggers, and mapping data flows; skilled in parameterisation, error handling, and orchestration.
* OData & API Integration: Strong understanding of the OData protocol and experience integrating with REST APIs.
* Dataverse & Dynamics 365: Hands-on experience with the Dataverse schema, connector configuration, and entity model mapping.

Requirements: Azure, Data Engineering, Big Data