Status: OPEN
Job ID: 30152
Title: Cloud Engineer
Applications Invited from Countries: United States
Job Description

Job title: Cloud Engineer - Data Ops Snowflake

Location: Denver, Colorado   

Duration: 12 Months

Security Clearance: OIT, FTI (IRS Pub 1075), and CJIS (Fingerprint-based)

1. Position Objective

The Office of Information Technology (OIT) is seeking a highly specialized Senior Operations and Data Engineer to serve as the primary administrator and technical lead for our Snowflake ecosystem. This role is a hybrid of platform operations and high-level data engineering, ensuring that sensitive state and federal data (FTI/CJIS) is managed within a secure, high-uptime, and cost-effective environment.

2. Preferred Qualifications

To be considered for this role, candidates should provide proof of the following:

  • Active Snowflake Certification
  • Background Clearance Readiness: Full eligibility to pass OIT, FTI (Federal Tax Information), and CJIS (Criminal Justice Information Services) background checks.

 

3. Key Responsibilities

Platform Operations & Administration 

  • Snowflake Mastery: Act as the lead administrator for Snowflake environments; manage platform uptime, vendor escalations, and patch/versioning communications.
  • Environment Provisioning: Configure Snowflake environments, including complex RBAC (Role-Based Access Control) and security permissions (see the provisioning sketch after this list).
  • Governance & CI/CD: Implement and manage DataOps and CI/CD pipelines to automate deployments for the broader implementation team.
  • Financial Stewardship: Configure cost-management features such as Snowflake resource monitors, budgets, and consumption tracking; consult on chargeback models.
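
The provisioning and cost-management bullets above can be made concrete with a short example. The following is a minimal sketch, assuming the snowflake-connector-python package; every object name (ETL_ROLE, ANALYTICS_DB, TRANSFORM_WH, MONTHLY_ETL_MONITOR) and every quota or threshold value is a hypothetical placeholder for illustration, not a requirement of this posting.

import os

import snowflake.connector

# Statements an administrator might run for environment provisioning (RBAC)
# and financial stewardship (resource monitors). All names, quotas, and
# thresholds below are illustrative placeholders.
PROVISIONING_SQL = [
    # RBAC: a dedicated role with least-privilege grants on one schema.
    "CREATE ROLE IF NOT EXISTS ETL_ROLE",
    "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE ETL_ROLE",
    "GRANT USAGE ON SCHEMA ANALYTICS_DB.STAGING TO ROLE ETL_ROLE",
    "GRANT SELECT, INSERT ON ALL TABLES IN SCHEMA ANALYTICS_DB.STAGING TO ROLE ETL_ROLE",
    # Cost management: cap monthly credits, notify at 80%, suspend at 100%.
    """CREATE OR REPLACE RESOURCE MONITOR MONTHLY_ETL_MONITOR
       WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
       TRIGGERS ON 80 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE TRANSFORM_WH SET RESOURCE_MONITOR = MONTHLY_ETL_MONITOR",
]

def provision() -> None:
    # Credentials come from the environment; creating resource monitors
    # requires ACCOUNTADMIN or a role granted the equivalent privileges.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",
    )
    try:
        cur = conn.cursor()
        for stmt in PROVISIONING_SQL:
            cur.execute(stmt)
    finally:
        conn.close()

if __name__ == "__main__":
    provision()

Keeping statements like these in version control is one way the Governance & CI/CD bullet connects to provisioning: deployments become reviewable, repeatable scripts rather than ad-hoc console changes.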

Data Engineering & Transformation 

  • Pipeline Architecture: Develop robust ETL/ELT pipelines to ingest data from transactional Line of Business systems into the analytical Snowflake environment (see the ingestion sketch after this list).
  • Analytical Modeling: Translate Data Architect visions into technical reality by building complex transformations and target schemas.
  • Quality Management: Design and deploy automated data cleansing and quality-check pipelines.
  • Performance Engineering: Optimize data flows for specific latency and frequency requirements while maintaining credit efficiency.
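
As a companion sketch, the pipeline and quality-management bullets above could look like the following in practice, again assuming the snowflake-connector-python package; the stage, table, column, warehouse, and database names (@LOB_STAGE, STAGING.ORDERS_RAW, MARTS.ORDERS, ORDER_ID, TRANSFORM_WH, ANALYTICS_DB) are invented for the example.

import os

import snowflake.connector

# Load a raw batch from a stage into a staging table.
COPY_SQL = """
COPY INTO STAGING.ORDERS_RAW
FROM @LOB_STAGE/orders/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
"""

# Quality gate: count rows that would violate the target's key expectations.
QUALITY_SQL = "SELECT COUNT(*) FROM STAGING.ORDERS_RAW WHERE ORDER_ID IS NULL"

# ELT transformation: upsert cleansed rows into the analytical target schema.
MERGE_SQL = """
MERGE INTO MARTS.ORDERS AS t
USING (
    SELECT ORDER_ID, CUSTOMER_ID, TRIM(STATUS) AS STATUS, ORDER_TS
    FROM STAGING.ORDERS_RAW
    WHERE ORDER_ID IS NOT NULL
) AS s
ON t.ORDER_ID = s.ORDER_ID
WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.ORDER_TS = s.ORDER_TS
WHEN NOT MATCHED THEN INSERT (ORDER_ID, CUSTOMER_ID, STATUS, ORDER_TS)
    VALUES (s.ORDER_ID, s.CUSTOMER_ID, s.STATUS, s.ORDER_TS)
"""

def run_pipeline() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",  # hypothetical warehouse
        database="ANALYTICS_DB",   # hypothetical database
    )
    try:
        cur = conn.cursor()
        cur.execute(COPY_SQL)
        cur.execute(QUALITY_SQL)
        bad_rows = cur.fetchone()[0]
        if bad_rows:
            raise ValueError(f"{bad_rows} rows failed the ORDER_ID quality check")
        cur.execute(MERGE_SQL)
    finally:
        conn.close()

if __name__ == "__main__":
    run_pipeline()

Running the COPY and MERGE steps on a dedicated warehouse keeps the latency/credit trade-off explicit: that warehouse can be sized and monitored (see the resource monitor sketch above) independently of interactive workloads.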

  

4. Primary Deliverables

  • Architectural Contributions: Design reviews, Architectural Plans, and Scope Documents.
  • Deployment Assets: New account/environment deployments, security schemas, and permission assignments.
  • Engineering Assets: Comprehensive ETL Pipeline Design Documents, Mapping Documents, and production-ready Pipelines.
  • Product Backlog & Support Ticket Management; performance reports.
  • Weekly Status Reports.