At iCIMS, we're redefining how people connect with opportunity through intelligent, human-centred technology. We're growing rapidly and are seeking Data Engineers at multiple experience levels, from early career to seasoned professionals, to build the next generation of our Talent Cloud platform: scalable data pipelines, storage systems, and analytics solutions that power our data-driven decision-making and AI capabilities.
You'll design, build, and optimise data infrastructure that supports analytics, business intelligence, and product development. You'll collaborate with software engineers, data scientists, and product experts in a culture that values innovation, ownership, and continuous learning.
This is a hybrid position based in Dublin city centre, our strategic hub for AI development in Ireland.
Responsibilities
* Design, develop, and maintain scalable data pipelines to collect, process, and store data from multiple sources
* Build and optimise data infrastructure to support analytics, reporting, and AI/ML workloads
* Implement event sourcing and streaming architectures (e.g., Kafka, AWS Kinesis) for autonomous agents and reactive systems
* Apply security-by-design principles, data governance, and best practices to all data solutions, ensuring compliance with enterprise standards and regulatory requirements
* Collaborate with engineering, product, and business stakeholders to deliver reliable data solutions
* Troubleshoot and resolve data-related issues whilst ensuring data quality and integrity
* Stay up to date with the latest industry trends and technologies to drive innovation within the team
* Contribute to best practices, frameworks, and tools for data engineering excellence
* For senior-level candidates: mentor junior engineers and lead technical initiatives
Qualifications
We're hiring at all experience levels and will match responsibilities to your background.
* Bachelor's degree in Computer Science, Engineering, or related field (or equivalent professional experience)
* Experience building and maintaining large-scale data pipelines and systems
* Proficiency in Python; familiarity with Java
* Strong SQL skills and experience with relational and non-relational databases (e.g., SQL Server, PostgreSQL, MySQL, MongoDB)
* Hands-on experience with cloud platforms (AWS preferred) and services like S3, Redshift, or BigQuery
* Experience with streaming platforms (Kafka, AWS Kinesis) and event-driven architectures
* Understanding of data modelling, warehousing, and schema design principles
* Familiarity with data transformation tools (e.g., dbt), BI platforms (e.g., Looker, Tableau), and API development for data consumption
* Knowledge of version control (Git), CI/CD pipelines, and security principles for data systems (encryption, IAM, compliance frameworks)
* Experience with user behaviour tracking platforms (e.g., Snowplow, Google Analytics) is a plus
* Strong analytical and problem-solving skills with intellectual curiosity
* Strong communication and collaboration skills across both technical and non-technical teams
* For senior-level candidates: demonstrated experience in mentoring, leading projects, or driving strategic data initiatives