Overview
Optum is a global organisation that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Join us to start Caring. Connecting. Growing together.
In healthcare, evolution doesn’t just happen. It takes innovation, imagination, and a passion for solving problems in new and better ways. And innovation is taking place at a lightning-fast pace every day at Optum. As the fastest growing part of the UnitedHealth Group family of businesses, we’re expanding our team in Ireland and creating opportunities for those who want greater purpose and more impact in their work.
You will provide the talent, ambition, and drive while we provide the investment, support, and resources to advance your career. You’ll be responsible for developing complex data sources and pipelines into our data platform (Snowflake) along with other data applications (Azure, Airflow, etc.) and automation, working with Data Architects, Business Analysts, and Data Stewards to integrate requirements and constraints.
Careers with Optum offer flexible work arrangements. Individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny offices and a home-based telecommuting option in a hybrid model.
Primary responsibilities, qualifications, and additional notes are provided below.
Primary Responsibilities
* Integrate data from multiple on-premises and cloud sources and systems. Handle data ingestion, transformation, and consolidation to create a unified and reliable data foundation for analysis and reporting.
* Develop data transformation routines to clean, normalize, and aggregate data. Apply data processing techniques to handle complex data structures and missing data, and to ensure data quality for analysis, reporting, or machine learning tasks.
* Implement data de-identification/data masking in line with company standards.
* Monitor data pipelines and data systems to detect and resolve issues promptly.
* Develop monitoring tools and automated error-handling mechanisms to ensure data integrity and system reliability.
* Utilize data quality tools like Great Expectations or Soda to ensure data accuracy and integrity throughout its lifecycle.
* Create and maintain data pipelines using Airflow and Snowflake as primary tools.
* Create SQL stored procedures to perform complex transformations.
* Understand data requirements and design optimal pipelines to fulfil use-cases.
* Create logical and physical data models to maintain data integrity.
* Create and automate CI/CD pipelines using Git and GitHub Actions.
* Tune and optimize data processes.
You will be rewarded and recognised for your performance in an environment that will challenge you and provide development opportunities.
Required Qualifications
* Bachelor’s degree in Computer Science or a related field.
* Hands-on experience as a Data Engineer.
* Proficiency in SQL (any flavor), with experience using window functions and advanced features.
* Excellent communication skills.
* Strong knowledge of Python.
* In-depth knowledge of Snowflake architecture, features, and best practices.
* Experience with CI/CD pipelines using Git and GitHub Actions.
* Knowledge of data modeling techniques, including Star Schema, Dimensional models, and Data Vault.
* Experience developing data pipelines (Snowflake) and writing complex SQL queries.
* Experience building ETL/ELT/data pipelines.
* Experience with related open-source platforms and languages (e.g., Scala, Python, Java, Linux).
* Experience with both relational and non-relational databases.
* Analytical and problem-solving skills applied to big data.
* Experience with agile/scrum methodologies and high-performing teams.
* Understanding of access control, data masking, and row-level access policies.
* Exposure to DevOps methodology.
* Knowledge of data warehousing principles, architecture, and implementation.
Preferred Qualifications
* Bachelor’s degree or higher in Database Management, Information Technology, Computer Science, or a related field.
* Motivated self-starter who excels at independent task management and ownership.
* Experience orchestrating data tasks in Airflow to run on Kubernetes for data ingestion, processing, and cleaning.
* Expertise in designing and implementing data pipelines for high data volumes.
* Ability to create Docker images for applications to run on Kubernetes.
* Familiarity with Azure services such as Blob Storage, Functions, Data Factory, Service Principal, Containers, Key Vault, etc.
Please note you must currently be eligible to work and remain indefinitely without any restrictions in the country to which you are applying. Proof may be required to support your application.
All telecommuters will be required to adhere to UnitedHealth Group’s Telecommuter Policy.
Optum is an Equal Employment Opportunity employer. We value diversity and are committed to creating an inclusive environment for all employees.
© 2025 Optum Services (Ireland) Limited. All rights reserved.