Job Summary
The ideal candidate will have a strong foundation in data engineering, with expertise in building complex data infrastructure and leading technical initiatives.
* Develop scalable and efficient data pipelines (ETL/ELT) for real-time and batch processing.
* Integrate diverse data sources, implement fault-tolerant systems, and establish robust CI/CD practices.
* Design and implement secure, accessible data lake/warehouse solutions for large datasets.
* Collaborate with the team to monitor pipeline performance, troubleshoot issues, and implement observability and alerting systems.
* Apply generative AI (GenAI) solutions to improve team productivity and efficiency.
* Ensure adherence to governance policies by documenting systems and processes.
* Mentor junior engineers and drive innovation in data infrastructure.