Role
Senior Technology Architect
Technology – Azure, ADF, Databricks, PySpark
Location – Cork, Ireland
Compensation – Competitive (including bonus)
Job Description
In the role of Senior Technology Architect, you will interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle, including Requirements Elicitation, Application Architecture definition, and Design. You will play an important role in creating architecture and high-level design artifacts. You will also assess technical feasibility and identify efficient ways of meeting the client's requirements. You will provide technical guidance to the team and work closely with the offshore team, acting as a communication bridge between the client and offshore. You will be part of a learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
Key Responsibilities
Own end-to-end solution architecture for large-scale data platforms on Azure, with primary focus on Azure Databricks and Lakehouse patterns.
Define architecture across data ingestion, transformation, orchestration, and consumption layers using ADLS Gen2, Azure Data Factory, Azure Databricks (Delta/Lakehouse).
Drive technical solutioning for data migration and modernization, including mapping/compatibility assessment and scalable transformation design.
Create, review, and govern High-Level Design (HLD) and Low-Level Design (LLD) including non-functional requirements (NFRs): security, performance, scalability, reliability, and cost.
Provide deep technical guidance on PySpark transformations, SparkSQL patterns, and scalable ETL/ELT design for large datasets.
Establish best practices for Delta Lake / Medallion architecture (Bronze/Silver/Gold), data quality controls, and audit/reconciliation patterns.
Define CI/CD and engineering governance for data pipelines (coding standards, branching strategy, release management, automated testing).
Lead architecture workshops, solution walkthroughs, and technical governance with client architects and key stakeholders.
Convert business requirements into scalable platform capabilities and a pragmatic delivery roadmap.
Own technical risk management: identify architectural risks early, define mitigation plans, and drive resolution.
Work closely with Data Engineering, BI/Analytics, QA, and DevOps teams to ensure architecture is implemented correctly.
Perform design/code reviews and ensure adherence to architecture, security, and quality standards.
Required
Deep expertise in Azure Databricks and distributed processing using PySpark / SparkSQL
Strong understanding of Lakehouse / Delta Lake implementation patterns
Strong experience with ADLS Gen2, Azure Data Factory, and enterprise‑scale pipeline orchestration
Strong architecture experience for data platforms, including NFR ownership and governance
Preferred
Strong recent experience in Azure Data Platform architecture, especially Azure Databricks / Lakehouse.
Proven ability to own end‑to‑end solution architecture for large‑scale data platforms, including target architecture, HLD/LLD, and NFRs (security, performance, scalability, reliability, cost).
Strong hands‑on expertise in PySpark, SparkSQL, and ETL/ELT engineering, with experience designing scalable ingestion and transformation patterns.
Strong stakeholder and delivery leadership: able to lead design workshops, govern engineering standards, and partner closely with client and delivery teams to ensure successful implementation.
Personal
Strong analytical skills
A high degree of initiative and flexibility
Strong customer orientation
Strong quality awareness
Excellent verbal and written communication skills
Equal Opportunity Employer
Infosys is a global leader in next‑generation digital services and consulting. All aspects of employment at Infosys are based on merit, competence and performance. We are committed to embracing diversity and creating an inclusive environment for all employees. Infosys is proud to be an equal opportunity employer.