Intercom is the AI Customer Service company on a mission to help businesses provide incredible customer experiences. Our AI agent Fin, the most advanced customer service AI agent on the market, lets businesses deliver always-on, impeccable customer service and ultimately transform their customer experiences for the better.
Founded in 2011 and trusted by nearly 30,000 global businesses, Intercom is setting the new standard for customer service. Driven by our core values, we push boundaries, build with speed and intensity, and consistently deliver incredible value to our customers.
What's the opportunity?
The Data Infrastructure team builds the distributed systems and tools that support Intercom and empower people with information. As the company grows, so do the volume and velocity of our data, and the appetite for increasingly sophisticated and specialized, often AI-assisted, data solutions.
* Our team builds, maintains, evolves, and extends the data platform, enabling our partners to self-serve by creating their own end-to-end data workflows: from ingestion through data transformation and experiment evaluation to usage analysis and predictive modelling.
* We provide a solid data foundation to support various highly impactful business and product-focused projects.
* We’re looking for a Data Infrastructure Engineer to join us and collaborate on large-scale data infrastructure initiatives: someone who is passionate about building solid foundations for delivering high-quality data to our consumers.
What will I be doing?
* Evolve the Data Platform by designing and building the next generation of the stack.
* Develop, run and support our data pipelines using tools like Airflow, PlanetScale, Kinesis, Snowflake, and Tableau, all running in AWS (a brief sketch follows this list).
* Collaborate with product managers, data engineers, analysts and data scientists to develop tooling and infrastructure to support their needs.
* Develop automation and tooling to support the creation and discovery of high quality analytics data in an environment where dozens of changes can be shipped daily.
* Implement systems to monitor our infrastructure, detect and surface data quality issues and ensure Operational Excellence.
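To give a flavour of the day-to-day work described above, here is a minimal, hypothetical sketch of an Airflow DAG (TaskFlow API, assuming Airflow 2.4+) that ingests a daily batch and runs a simple row-count quality check. The task logic, row counts, and threshold are illustrative assumptions only and do not describe Intercom's actual pipelines.

```python
# Minimal, hypothetical Airflow DAG sketch: ingest a daily batch, then run a
# simple row-count quality check before the data is considered ready.
# Values and logic are illustrative, not Intercom's real pipelines.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["example"])
def example_ingestion_pipeline():
    @task
    def ingest() -> int:
        # Placeholder for pulling a daily extract (e.g. from a stream or
        # database) and loading it into the warehouse; returns rows loaded.
        rows_loaded = 1200  # stand-in value for illustration
        return rows_loaded

    @task
    def quality_check(rows_loaded: int) -> None:
        # Surface a data quality issue by failing the task if the load
        # looks suspiciously small for a daily batch.
        if rows_loaded < 1000:
            raise ValueError(f"Expected at least 1000 rows, got {rows_loaded}")

    quality_check(ingest())


example_ingestion_pipeline()
```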
Recent Projects The Team Has Delivered
* Migrating the MySQL ingestion pipeline from Aurora to PlanetScale.
* LLM utilisation DAG framework.
* Tableau Dashboard Performance monitoring.
* Unified Local Analytics Development Environment for Airflow and DBT.
* Containerised Snowflake Apps.
About You
* You have 3+ years of full-time, professional work experience in the data space using Python and SQL.
* You have solid experience building and running data pipelines for large and complex datasets, including managing dependencies between them.
* You have hands-on cloud provider experience (preferably AWS) including service integrations and automation via CLI and APIs.
* You have a solid understanding of data security practices and are passionate about privacy.
* You have some DevOps experience.
* You care about your craft.
Bonus Experience
* Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows.
* Experience with, or an understanding of, tools and technologies in the modern data stack (Snowflake, DBT).
* Industry awareness of up-and-coming technologies and vendors.
Benefits
* Competitive salary and equity in a fast-growing start-up.
* We serve lunch every weekday and keep the kitchen fully stocked with a variety of snacks.
* Regular compensation reviews - we reward great work!
* Pension scheme with a match of up to 4%.
* Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents.
* Open vacation policy and flexible holidays so you can take time off when you need it.
* Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones.
* If you cycle, we’ve got you covered with the Cycle-to-Work Scheme, plus secure bike storage.
* MacBooks are our standard, but we also offer Windows for certain roles when needed.
Policies
Intercom has a hybrid working policy. We believe that working in person helps us stay connected, collaborate more easily, and create a great culture, while still providing the flexibility to work from home.
Intercom values diversity and is committed to a policy of Equal Employment Opportunity.