Overview
Location: Dublin 2
Other locations: Primary Location Only
Date: Oct 2, 2025
Requisition ID:

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

FS Technology Consulting - AI and Data - Data Engineer - Senior Consultant / Manager - Dublin/Cork/Limerick/Galway

General Information
Location: Dublin, Cork, Limerick, or Galway
Available for Visa Sponsorship: No
Business Area: Data & Analytics
Contract Type: Full Time - Permanent

EY is the only major professional services firm with an integrated Financial Services practice across Europe, the Middle East, India, and Africa (EMEIA). We connect our Asset Management, Banking and Capital Markets, and Insurance clients to 6,500 talented people from 12 countries and 35,000 Financial Services colleagues around the world.

In EY FS Ireland, we are expanding our Consulting practice and building our Technology Consulting team as a key component of our All In Strategy. All In is EY's global strategy and ambition to create long-term value for all stakeholders by leveraging data and technology. It is designed to integrate transformative solutions across EY's services, ensuring a comprehensive approach to our clients' needs. We are actively recruiting high-achieving individuals with technical data expertise within Financial Services: Banking, Insurance, and Wealth and Asset Management.

We are seeking an experienced Data Engineer with a strong background in supporting business and analytics teams in the Financial Services industry by designing, developing, and operating robust data pipelines. The ideal candidate will have experience working with Azure Cloud and proficiency with Microsoft Azure data services and platforms such as Azure SQL, Data Warehouse, Data Factory, Data Lakes, HDInsight, and Stream Analytics, as well as core Azure SQL and MSSQL Server. They will also have an understanding of the underlying data flow architectures; will have supported finance, operations, privacy & security, and compliance & regulatory projects; will have a functional understanding of the data sets used in the business domain(s) they supported; and will be adept at data sourcing, data wrangling, issue/gap and root cause analysis, ETL, and implementing data solutions for effective process change and improvement strategies. Experience in agile software delivery is a must.

Your Key Responsibilities
- Design, implement, and manage robust and scalable data pipelines and solutions for core systems and applications.
- Collaborate with IT and business teams to understand business and technical requirements in order to identify, prioritise, and document data requirements for management and business insights.
- Design, build, and maintain data solutions that transform raw data into structured formats suitable for analytics.
- Develop and manage data models and ETL processes to ensure effective and efficient data ingestion and transformation.
- Analyse and conduct gap analysis in existing data architectures to identify impact areas and design solution options.
- Conduct thorough analysis of existing data processes and systems, identifying areas for improvement. Work closely with data stakeholders (e.g. Product and System Owners, Technical Leads, Business and Change Leads, Data and Solution Architects) to communicate and influence implementation of the identified technical enhancements.
- Lead performance optimisation, identifying data pipeline bottlenecks and areas for improvement. Optimise query performance in data storage solutions and implement best practices for data processes and transformations to enhance the efficiency of ETL workflows.
- Implement security measures to protect sensitive data throughout the data lifecycle, including encryption, access controls, and data masking.
- Ensure compliance with data governance policies and regulatory requirements, such as GDPR, BCBS 239, IFRS 9, IRB, or HIPAA.
- Monitor data quality and integrity throughout the transformation process, implementing controls and fixes as needed and maintaining compliance with the organisation's Data Governance and Data Management policies and procedures.
- Design data architecture that can scale horizontally and vertically to accommodate growing data volumes and user demands.
- Support the design and implementation of integrated technology solutions with data expertise, ensuring alignment with business goals and compliance with internal and external policies, procedures, and regulations.
- Design and implement robust reporting frameworks that facilitate timely and accurate financial status reporting.

Skills and Attributes for Success
To qualify for the role, you must have:

Experience in one or more of the following cloud platforms:
Azure:
- Core Azure SQL and MSSQL Server.
- Deep understanding of Azure-native services for data engineering.
- Experience with Azure Key Vault, Azure Monitor, and RBAC.
AWS:
- Experience with S3, Glue, Redshift, and Lambda.
- Familiarity with IAM, CloudWatch, and Athena.
GCP:
- Experience with BigQuery, Cloud Storage, and Dataflow.
- Familiarity with Cloud Functions, IAM, and Vertex AI (optional).

Experience in one or more of the following data platforms:
Microsoft:
- Experience with Azure Data Factory for orchestrating data workflows.
- Proficiency in Azure Synapse Analytics and SQL Server.
- Familiarity with Azure Data Lake Storage (Gen2) and Azure Blob Storage.
- Knowledge of Power BI integration and data modelling.
- Understanding of Azure Functions and Logic Apps for automation.
Snowflake:
- Strong SQL skills and experience with Snowflake's architecture (virtual warehouses, storage, cloud services).
- Proficiency in Snowflake Streams & Tasks for CDC and automation.
- Experience with Snowflake Secure Data Sharing and Snowflake Marketplace.
- Familiarity with Snowpark for Python/Java-based transformations.
- Understanding of role-based access control, data masking, and Time Travel features.
Databricks:
- Hands-on experience with Apache Spark and Databricks Runtime.
- Proficiency in Delta Lake for ACID-compliant data lakes.
- Experience with Structured Streaming and Auto Loader.
- Familiarity with MLflow, Feature Store, and Model Registry.
- Use of Databricks notebooks for collaborative development in Python, SQL, or Scala.
Successful applicants should also possess:
- Bachelor's degree in Data Science, Analytics, Information Technology, Computer Science, Statistics, Mathematics, Quantitative Economics, Engineering, or equivalent professional education.
- Minimum of 3 years' experience in a Data Engineering role with a blue-chip consulting firm or in the Data Office/Data Engineering function of a multinational organisation or a large financial services institution.
- Proficiency in SQL at an advanced level, with experience in complex data manipulation and analysis, including advanced joins, subqueries, and performance tuning.
- Experience in Continuous Integration & Continuous Delivery (CI/CD) and relevant tools, e.g. Azure DevOps/Azure Pipelines, GitHub, Jenkins, Terraform, Ansible, CircleCI.
- Familiarity with one or more business domains: finance, operations, privacy & security, regulatory & compliance.
- Knowledge of typical data management and governance frameworks and compliance requirements.
- Experience with Agile methodologies and a proven track record of delivering data solutions in an Agile environment.
- Excellent analytical, problem-solving, and decision-making skills.
- Relevant work experience in a multi-stakeholder environment.
- Strong communication and interpersonal skills, with the ability to engage effectively with both technical and non-technical stakeholders.
- Team-centric mindset: always ready to support their colleagues and not afraid to ask for help when needed.
- Understanding and respect for client and team deadlines, strong organisation skills, and the ability to organise and prioritise their own work to achieve the delivery objectives.
- Familiarity with Python and the most widely used Python libraries, as well as data types such as dictionaries, lists, arrays, and strings.

Ideally, you will also have:
- Experience working with Financial Services clients (Banking, Insurance, or Wealth and Asset Management) is preferred.
- Ability to deal with ambiguity and uncertainty.
- Growth and continuous-learning mindset.
- Ability to conduct workshops and stakeholder interviews.
- Ability to build and manage relationships across business and technology stakeholders.
- Financial Services industry background or experience as a bonus.
- Preparation and delivery of MI/BI reporting in Power BI or SAS a bonus.
- Snowflake experience a bonus.

What we look for
- Someone who is passionate about reaching their full potential and excelling in their career.
- Someone with energy, enthusiasm, and courage who enjoys solving complex problems and variety in their day-to-day working life.
- Someone who enjoys working as part of a community which values integrity, respect, teaming, and inclusiveness.

What working at EY offers
We offer a competitive remuneration package.