Status: CLOSED
Job ID: 30193
Title: Data Engineer
Applications Invited from Countries: United States
Job Description

Job title: Data Engineer

Location: San Jose, CA - Remote

Duration: 12 months

Description:

  • The Data Operations team builds capabilities and owns practices that enable seamless data usage across the company. The team collaborates with stakeholders to understand requirements, develop scalable solutions, solicit feedback, deploy models, and ensure high-quality data products.
  • Engineers on the Data Operations team bring a pragmatic approach to building robust, efficient, and well-documented data models. They balance speed with accuracy, ensuring that our data infrastructure evolves to meet the needs of our growing business.

 

Responsibilities:

  • You will be responsible for supporting the migration of dbt models from Redshift and Snowflake to Trino (Iceberg) while ensuring compatibility, performance, and best practices in our evolving infrastructure.
  • Your role will require a deep understanding of SQL dialects, data modeling principles, and dbt best practices.
  • You will collaborate with both engineering and analytics teams to ensure seamless model transitions and maintain quality standards.

 

Specifically, you will:

  • Convert Redshift and Snowflake SQL syntax to Trino syntax in our dbt models.
  • Version control dbt models via Git.
  • Refactor and optimize data models, ensuring they align with best practices for Iceberg.
  • Work with Tableau assets, ensuring data sources align with the new infrastructure.
  • Collaborate with workbook owners to validate and test the impact of migration work.
  • Engage in daily standups and work alongside engineers to execute migration priorities efficiently.
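To give a flavor of the dialect-conversion work described above, here is a minimal, illustrative sketch (not the team's actual tooling) of a few common Redshift/Snowflake-to-Trino rewrites: Redshift's `GETDATE()` becomes Trino's `current_timestamp`, Postgres-style `::` casts become explicit `CAST` expressions, and Snowflake's `IFF` maps to Trino's `IF`. Real migrations typically rely on a transpiler or careful manual refactoring rather than regex substitution; the function and rule names here are hypothetical.

```python
import re

# Illustrative rewrite rules only; a production migration would use a SQL
# transpiler or hand-review each model rather than regex substitution.
REWRITES = [
    # Redshift GETDATE() -> Trino current_timestamp
    (re.compile(r"\bGETDATE\(\)", re.IGNORECASE), "current_timestamp"),
    # Postgres-style :: casts (Redshift/Snowflake) -> explicit CAST for Trino
    (re.compile(r"(\w+)::(\w+)"), r"CAST(\1 AS \2)"),
    # Snowflake IFF(...) -> Trino IF(...)
    (re.compile(r"\bIFF\(", re.IGNORECASE), "IF("),
]

def to_trino(sql: str) -> str:
    """Apply each rewrite rule in order to a SQL string."""
    for pattern, replacement in REWRITES:
        sql = pattern.sub(replacement, sql)
    return sql

print(to_trino("SELECT order_id::varchar, GETDATE() FROM orders"))
# -> SELECT CAST(order_id AS varchar), current_timestamp FROM orders
```

In a dbt context, changes like these would be made inside the model `.sql` files, versioned through Git, and validated against the downstream Tableau assets they feed.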

 

Qualifications:

  • BS/BA in Computer Science, Information Systems, Mathematics, or a related technical field (or equivalent work experience).
  • 2+ years of experience in a data engineering, BI, or analytics role.
  • 1+ years of experience working in financial services and/or SaaS companies.
  • Experience in dbt and modern data modeling best practices.
  • Experience version controlling dbt models via Git is a must.
  • Experience working in startup environments is a plus.

 

Competencies (Attributes Necessary for Success in this Role):

  • Strong proficiency in SQL, with experience across multiple SQL dialects (Redshift, Snowflake, and Trino/Iceberg).
  • Deep understanding of dbt, including model development, macros, and Jinja templating.
  • Experience with BI tools, such as Tableau, and a working knowledge of data source construction.
  • Strong stakeholder communication skills, particularly in guiding teams through infrastructure changes.
  • Experience refactoring SQL and dbt models to support infrastructure migrations.

 

Must Have:

  • Bachelor’s degree in Computer Science, Information Systems, Mathematics, or related field (or equivalent experience)
  • At least 2 years of experience in data engineering, BI, or analytics roles
  • At least 1 year of experience in financial services and/or SaaS companies
  • Hands-on experience working with dbt, including data modeling and best practices
  • Strong SQL proficiency across Redshift, Snowflake, and Trino (Iceberg)
  • Experience version controlling data models using Git

 

Nice to Have:

  • Experience working in startup environments