Are you ready to shape the future of AI-powered decision-making? We're building the intelligence layer for the world's most complex industries. Our Objective AI (ODAI) platform combines physics with advanced AI to deliver actionable insights, transforming how global operations make decisions.
We're looking for an experienced Senior Software/Data Engineer to architect and scale our real-time data systems. This is a hands-on technical role with the opportunity to influence system design, technical direction, and best practices across the company. You'll work at the intersection of data engineering, software development, and AI, delivering high-impact solutions that directly power our customers' operations.
What You'll Be Working On
* Architecting & Scaling: Design and evolve high-performance, fault-tolerant data pipelines (streaming + batch) that process massive real-world data flows.
* Real-Time Systems: Build cutting-edge event-driven applications using Apache Flink and modern stream processing frameworks.
* Cross-Functional Collaboration: Partner with Data Scientists, Product Engineers, and DevOps to design solutions that bring AI insights to life.
* Technical Leadership: Influence data architecture standards, mentor colleagues, and champion best practices in monitoring, observability, and governance.
* Operational Excellence: Lead improvements in data quality, schema evolution, recovery logic, and CI/CD for production-grade deployments.
What We're Looking For
* Proven experience in Software/Data Engineering with a strong focus on real-time and distributed data systems.
* Deep knowledge of Apache Flink (event time, watermarks, windowing, state management).
* Strong coding ability in Java (Python a plus).
* Solid grounding in object-oriented and functional programming paradigms.
* Advanced SQL skills and comfort working with both relational and NoSQL databases.
* Hands-on experience with message queues (e.g., NATS, Pub/Sub, Kafka).
* Cloud-native mindset with experience in GCP, AWS, or Azure, plus Docker/Kubernetes.
* Understanding of DevOps principles: CI/CD, automated testing, infrastructure as code.
* Strong communication skills, able to engage with both technical and non-technical stakeholders.
* Degree in Computer Science or related field—or equivalent professional expertise.
Bonus Points For
* Exposure to other data frameworks (Kafka Streams, Apache Beam, Spark Structured Streaming).
* Experience with modern data warehouses (BigQuery, Redshift, Snowflake).
* Familiarity with Helm, Istio, Jenkins, Terraform.
* Monitoring/observability with Prometheus, Grafana.
* Awareness of GDPR/CCPA and data compliance standards.
Why Join
* Impact at Scale: Your work will directly power critical decision-making in industries that move the world.
* Cutting-Edge Tech: Build with the latest in real-time data, AI, and cloud-native engineering.
* Growth & Leadership: Take ownership, shape technical direction, and mentor within a global, high-performing team.
* Flexibility: Fully remote role (Ireland-based) with flexible working hours.
* Inclusive Culture: We value diverse perspectives and believe the best ideas come from collaboration across backgrounds and experiences.
Location: Remote, Ireland-based only (applications from outside Ireland cannot be considered).
Apply Today
Join us in building the intelligence layer for the future of operations. You belong here. Let's build something extraordinary together.
Job Type: Full-time
Benefits:
* Flexitime
* Sick pay
* Unlimited paid holidays
* Work from home
Work authorisation:
* Ireland (required)
Work Location: Remote