Principal Data Engineer - Letterkenny or Dublin
Optum is a global organisation that delivers care, aided by technology, to help millions of people live healthier lives.
The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best.
Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities.
Come make an impact on the communities we serve as you help us advance health equity on a global scale.
Join us to start Caring. Connecting. Growing together.
About the role:
At UnitedHealth Group and Optum, we want to make healthcare work better for everyone.
This depends on hiring the best and brightest.
With a thriving ecosystem of investment and innovation, our business in Ireland is constantly growing to support the healthcare needs of the future.
Our teams are at the forefront of building and adapting the latest technologies to propel healthcare forward in a way that better serves everyone.
With our hands at work across all aspects of health, we use the most advanced development tools, AI, data science and innovative approaches to make the healthcare system work better for everyone.
As a Principal Data Engineer at Optum, you'll be responsible for working with key business and technical partners to develop industry-leading data solutions that provide insights and analytics that drive efficiency and value to clients.
Careers with Optum offer flexible work arrangements, and individuals who live and work in the Republic of Ireland will have the opportunity to split their monthly work hours between our Dublin or Letterkenny office and telecommuting from a home-based office.
Primary Responsibilities of the Principal Data Engineer:
- Work collaboratively with business partners, SMEs, and developers to ensure a shared understanding of business and technical requirements
- Design and build data pipelines to process terabytes of data
- Develop and recommend best practices for data ingestion, processing, cleaning, and standardizing of data (typically on Azure)
- Create Docker images for various applications and deploy them
- Design and build tests
- Troubleshoot production issues
- Analyze existing data solutions and recommend automation/efficiency options
- Work on proofs of concept for Big Data and Data Science
- Demonstrate superior communication and presentation capabilities, simplifying complex data insights for audiences without a technical background
- Serve as a leader and mentor
- Implement data de-identification/data masking in line with company standards (a minimal sketch follows this list)
- Create logical and physical data models to ensure data integrity is maintained
- Create and automate CI/CD pipelines using Git and GitHub Actions
- Tune and optimize data processes

You will be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role, as well as providing development for other roles you may be interested in.
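For illustration only, the data-masking responsibility above might look something like the following minimal sketch, assuming Python with pandas; the salt, column names, and sample data are hypothetical and not part of the role description.

```python
import hashlib
import pandas as pd

# Hypothetical salt; in practice this would come from a secrets manager.
SALT = "example-salt"

def mask_value(value: str) -> str:
    """Replace a PII value with a salted SHA-256 digest so records
    stay joinable without exposing the original identifier."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

def de_identify(df: pd.DataFrame, pii_columns: list[str]) -> pd.DataFrame:
    """Return a copy of the frame with the given PII columns masked."""
    masked = df.copy()
    for col in pii_columns:
        masked[col] = masked[col].astype(str).map(mask_value)
    return masked

# Usage example with made-up data.
patients = pd.DataFrame({"member_id": ["A123", "B456"], "cost": [10.5, 7.2]})
print(de_identify(patients, ["member_id"]))
```

Salted hashing keeps masked identifiers consistent across tables (so joins still work) while remaining one-way; a production implementation would follow the company's own de-identification standards.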
Required Qualifications of the Principal Data Engineer:
- Bachelor's degree in computer science or a related field, or equivalent experience
- Analytical and problem-solving skills applied to big data datasets
- Ability to present an ETL strategy end to end, including solution design, constraints, loading methods, security, and governance
- Good interpersonal skills to engage and communicate effectively with customers and audiences of different backgrounds within the organisation
- In-depth knowledge of Snowflake architecture, features, and best practices, with advanced SQL skills to support data warehouse operations and write efficient code
- Azure Cloud: experience using ADF to design pipelines that ingest data from on-prem sources into Snowflake; must have experience with self-hosted integration runtimes, linked services, and Azure DevOps for CI/CD processes
- GitHub: experience with Actions, CI/CD, and branching strategy
- Data modelling: extensive experience designing dimensional models based on business use cases and reporting needs
- Experience orchestrating data tasks in Airflow (a minimal orchestration sketch follows this list)
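As a purely illustrative sketch of the Airflow orchestration listed above, assuming Apache Airflow 2.4+ and hypothetical DAG and task names (this is not an actual Optum pipeline):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task callables; real tasks would trigger an ADF pipeline
# run and a Snowflake load rather than printing.
def ingest_from_adf():
    print("trigger ADF pipeline to land on-prem data")

def load_into_snowflake():
    print("copy landed files into Snowflake staging tables")

with DAG(
    dag_id="onprem_to_snowflake_example",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_from_adf", python_callable=ingest_from_adf
    )
    load = PythonOperator(
        task_id="load_into_snowflake", python_callable=load_into_snowflake
    )

    ingest >> load  # load runs only after ingestion succeeds
```

In a real pipeline the dependency arrow would enforce ordering between the ADF ingest and the Snowflake load, with credentials held in Airflow connections rather than in code.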