We are seeking a high-caliber Senior Data Engineer to join a premier global financial services firm. In this role, you will play a pivotal part in evolving our enterprise-wide Centralized Data Platform, built on the Databricks Lakehouse architecture. You will collaborate with data scientists and analysts to transform complex financial data into high-value insights, leveraging AI-enhanced development and cutting-edge cloud infrastructure to drive technical excellence.

Key Responsibilities
- Lakehouse Architecture: Design and implement scalable Databricks solutions for enterprise-level data processing and advanced analytics.
- Pipeline Engineering: Build, optimize, and maintain robust ETL/ELT processes, including Delta Live Tables for seamless ingestion and transformation.
- Streaming & Real-Time: Create and manage Structured Streaming pipelines for real-time data delivery.
- Governance & Security: Implement Unity Catalog features and IAM best practices to ensure rigorous security and access control.
- Optimization: Fine-tune Databricks clusters and Spark jobs for maximum performance and cost efficiency.
- DevOps Integration: Support infrastructure-as-code efforts using Terraform and participate in an Agile/Scrum environment.
- Quality Assurance: Implement monitoring frameworks for pipeline health and data integrity, and contribute to rigorous code reviews.

Technical Requirements
- Experience: 6+ years in data engineering, including at least 2 years of dedicated hands-on experience with the Databricks platform.
- Core Languages: Mastery of Python, SQL, and Spark.
- Cloud Infrastructure: Proven experience in AWS environments (S3, Glue, Lambda).
- Modern Data Stack: Deep understanding of Delta Lake, Lakehouse architecture, and sophisticated data modeling.
- Next-Gen Development: Practical experience incorporating AI tools into the software development lifecycle to boost productivity.
- Version Control: Proficiency with Git-based workflows and CI/CD principles.
Desirable Attributes
- Background in the financial services or fintech industry.
- Experience with API development and real-time processing frameworks.
- Familiarity with data governance frameworks and cross-cloud implementations.
- Ability to mentor junior engineers and to document complex technical systems clearly.

Technical Environment
- Primary Platform: Databricks (Lakehouse, Unity Catalog, Delta Live Tables)
- Cloud: AWS
- Tools: Terraform, Git, Python, SQL
- Innovation Focus: AI/ML implementation patterns and real-time integrations