Drive Data Integration and Innovation
Data Engineers play a pivotal role in shaping the future of healthcare by designing, developing, and implementing data integration solutions that empower informed decision-making. As a key member of our team, you will be responsible for creating workflows that support PCI automation objectives, exchanging data with external systems, and optimizing processes for performance and efficiency.
Key Responsibilities:
* Collaborate with subject matter experts to understand data requirements and design workflows that meet business needs.
* Develop processes to exchange data with external systems and components, ensuring seamless integration with ETL programs and front-end presentation layers.
* Optimize workflows, orchestration, and code for performance and efficiency.
* Utilize security best practices to ensure data and application security.
* Design and prototype data monitoring models for pipelines.
Requirements:
* Bachelor's Degree or higher in Computer Science, Information Technology, or related field.
* Extensive experience as a Data Integration Engineer or similar Data Engineering role.
* Hands-on experience with API development using Java and Python; Node.js experience is a plus.
* Knowledge of data structures and database concepts.
* Experience with data processing and SQL databases.
* Advanced SQL experience.
* Experience with Relational Databases (RDBMS) and Non-Relational Databases.
* Experience implementing ETL applications and applying data warehousing and data modelling principles and architecture in large environments.
* Working knowledge of public clouds such as Azure, AWS, and Google Cloud, and experience developing applications in Linux environments.
* Experience with Agile/Scrum methodologies.
Preferred Qualifications:
* Experience with Spark/Databricks for data processing and machine learning workloads.
* Practical experience developing and hosting solutions on major cloud providers such as Azure, AWS, and Google Cloud.
* Experience with Airflow for data pipeline orchestration, including Airflow operators and hooks.
* Experience designing, implementing, and running data applications on Kubernetes and Docker.
* Experience working with data warehouses.
* Experience with Apache Kafka.
* Knowledge of DevOps tools and practices.
* Experience with Agile development methodologies.
* Experience with Kimball and Data Vault modelling methodologies.
* Experience with dbt (Data Build Tool).
About Us:
We are committed to making healthcare work better for everyone. Our teams are at the forefront of building and adapting the latest technologies to propel healthcare forward. With a thriving ecosystem of investment and innovation, our business is constantly growing to support the healthcare needs of the future.
As a valued member of our team, you will have the opportunity to work with a diverse group of talented individuals who share a passion for innovation and excellence. We offer a range of benefits and opportunities for growth and development, including flexible work arrangements and the chance to split your monthly work hours between our Dublin or Letterkenny office and a home-based office.
Join Our Team:
We are an equal opportunities employer and welcome applications from qualified candidates who share our commitment to delivering equitable care and improving health outcomes. If you are passionate about data engineering and want to make a difference in the lives of others, we encourage you to apply.