Job Summary:
We are seeking a Senior Data Engineer to develop and maintain scalable, available, quality-assured analytical building blocks and datasets in close coordination with data analysts.
The successful candidate will be responsible for developing and maintaining the data pipelines required to extract, transform, clean, pre-process, aggregate, and load data from a wide variety of sources using Python, SQL, DBT, and other data technologies.
Additionally, they will design, implement, test, and maintain data pipelines/new features based on stakeholders' requirements.
The ideal candidate will have experience with Snowflake, workflow management solutions such as Airflow, and data transformation tools such as DBT, along with knowledge of natural language processing (NLP) and computer vision techniques.
As a Senior Data Engineer, you will work closely with stakeholders to define data requirements and objectives. You will translate technical designs into business-appropriate representations and analyze business needs and requirements, ensuring that data services are implemented in direct support of the strategy and growth of the business.
Key Responsibilities:
* Maintain and develop the data pipelines required to extract, transform, clean, pre-process, aggregate, and load data from a wide variety of sources using Python, SQL, DBT, and other data technologies (a minimal sketch of such a pipeline follows this list).
* Design, implement, test, and maintain data pipelines and new features based on stakeholders' requirements.
* Develop and maintain scalable, available, quality-assured analytical building blocks and datasets in close coordination with data analysts.
* Collaborate with stakeholders to define data requirements and objectives.
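For illustration only, the pipeline work described above might look like the minimal sketch below: extract records from a source system, clean and pre-process them, and load them into a Snowflake staging table. Every endpoint, credential, table name, and helper function here is a hypothetical placeholder, not part of this role's actual codebase.

```python
# Illustrative sketch only: all endpoints, credentials, and table names are
# hypothetical placeholders.
import requests
import snowflake.connector


def extract_orders(api_url: str) -> list[dict]:
    """Pull raw order records from a (hypothetical) source API."""
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    return response.json()


def transform(records: list[dict]) -> list[tuple]:
    """Clean and pre-process: drop rows missing an id, normalize casing."""
    return [
        (r["id"], r["customer"].strip().lower(), float(r["amount"]))
        for r in records
        if r.get("id") is not None
    ]


def load(rows: list[tuple]) -> None:
    """Load the cleaned rows into a (hypothetical) Snowflake staging table."""
    conn = snowflake.connector.connect(
        account="xy12345",   # hypothetical account identifier
        user="ETL_USER",
        password="...",      # in practice, fetched from a secrets manager
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO stg_orders (order_id, customer, amount) "
                "VALUES (%s, %s, %s)",
                rows,
            )
    finally:
        conn.close()


if __name__ == "__main__":
    load(transform(extract_orders("https://api.example.com/v1/orders")))
```

In practice this logic would be split across orchestrated tasks (see the Airflow sketch at the end of this posting) rather than run as a single script.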
Required Qualifications:
* 5+ years of relevant work experience
* BA/BS in Data Science, Computer Science, Statistics, Mathematics, or a related field
* Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management
* Experience with Snowflake, including hands-on use of Snowflake utilities such as SnowSQL and Snowpipe; must have worked on Snowflake cost-optimization scenarios (see the cost-usage sketch after this list)
* Solid overall programming skills and the ability to write modular, maintainable code, preferably in Python and SQL
* Experience with workflow management solutions such as Airflow
* Experience with data transformation tools such as DBT
* Experience working with Git
* Experience working in big data environments such as Hive, Spark, and Presto
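For illustration only, the Snowflake cost-optimization work mentioned above typically starts from usage queries like the sketch below, which surfaces the warehouses that consumed the most credits over the last 30 days so their sizing and auto-suspend settings can be reviewed. The connection details are hypothetical; SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY is a standard Snowflake view.

```python
# Illustrative sketch only: connection details are hypothetical placeholders.
import snowflake.connector

# Top credit-consuming warehouses over the last 30 days, from the standard
# ACCOUNT_USAGE metering view.
CREDIT_USAGE_SQL = """
    SELECT warehouse_name,
           SUM(credits_used) AS credits_30d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_30d DESC
    LIMIT 10
"""

conn = snowflake.connector.connect(
    account="xy12345",  # hypothetical account identifier
    user="ANALYTICS_ADMIN",
    password="...",     # in practice, fetched from a secrets manager
)
try:
    with conn.cursor() as cur:
        for warehouse_name, credits in cur.execute(CREDIT_USAGE_SQL):
            print(f"{warehouse_name}: {credits:.1f} credits")
finally:
    conn.close()
```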
Preferred Requirements:
* Experience supporting Support and Customer Success functions
* Experience authoring Airflow DAGs (see the DAG sketch after this list)
* Knowledge of natural language processing (NLP) and computer vision techniques.
* Working knowledge of Power BI
* Experience in an AWS environment, e.g., S3, Lambda, Glue, and CloudWatch
* Basic understanding of Salesforce
* Experience working with remote teams spread across multiple time zones
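For illustration only, the Airflow DAG work mentioned above takes roughly the shape of the sketch below: a daily pipeline chaining extract, transform, and load tasks via Airflow's TaskFlow API (Airflow 2.4+). The dag_id and task bodies are hypothetical placeholders.

```python
# Illustrative sketch only: dag_id and task bodies are hypothetical.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="orders_daily",        # hypothetical pipeline name
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def orders_daily():
    @task
    def extract() -> list[dict]:
        # A real pipeline would call the source API or database here.
        return [{"id": 1, "amount": "19.99"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Clean and pre-process the raw records.
        return [{**r, "amount": float(r["amount"])} for r in records]

    @task
    def load(rows: list[dict]) -> None:
        # A real pipeline would write to Snowflake here, e.g. via
        # write_pandas or a Snowflake operator.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


orders_daily()
```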