Job Summary
We are seeking a skilled Data Engineer to join our team. This role involves designing and developing data processing systems and optimizing them for performance and efficiency.
Key Responsibilities:
* Collaborate with subject matter experts to understand data requirements and create workflows that support automation objectives
* Design and develop processes for exchanging data with external systems and components, supporting both ETL programs and the front-end presentation layer
* Optimize workflows, orchestration, and code for performance and efficiency
* Apply security best practices to ensure data and application security
* Design conceptual and logical data models and flowcharts
* Develop monitoring models for data pipelines
* Write technical documentation
* Work in a collaborative environment with experienced engineers, participating in the development of high-quality software solutions using modern technologies and methodologies
About You:
* Bachelor's degree or higher in Computer Science, Information Technology, or a related field, or extensive experience as a Data Integration Engineer or in a similar data engineering role
* Hands-on experience with API development in Java or Python; Node.js experience is a plus
* Knowledge of data structures and database concepts
* Experience with data processing and SQL databases
* Advanced SQL experience
* Experience with relational (RDBMS) and non-relational databases
* Experience implementing ETL applications and applying data warehousing and data modeling principles, architecture, and implementation in large environments
* Working knowledge of public clouds such as Azure, AWS, and Google Cloud, with practical experience developing and hosting solutions on them
* Experience developing applications in Linux environments
* Experience with Agile/Scrum methodologies
* Experience implementing data pipeline orchestration with Airflow, including Airflow operators and hooks
* Experience designing, implementing, and running data applications with Kubernetes and Docker
* Experience working with data warehouses
* Experience with Apache Kafka
* Knowledge of DevOps tools and practices
* Experience with Kimball and Data Vault data modeling methodologies
* Experience with dbt (data build tool)