Analytics Engineer II

Oregon State University Foundation

Location

Corvallis, OR

Posted

Yesterday

Reporting to the Senior Director II, Relationship Insights & Analytics, the Analytics Engineer II designs, builds, and maintains the Foundation’s analytics infrastructure and applications. This role ensures a reliable, scalable analytics environment that delivers meaningful value to end users, with a strong emphasis on an AI‑ready architecture optimized for advanced analytics and use by analytical AI agents.

Measures of success

  • Reliability: Data is delivered on time and consistently, with minimal disruptions that impact end users.
  • Analytics readiness: Analytics infrastructure and platforms are well‑designed, accessible, and easy to use, enabling teams to answer questions quickly with fewer data issues or downtime.
  • Impact: Infrastructure supports meaningful business decisions and AI initiatives, demonstrated by increased use of trusted datasets and semantic layers, faster delivery of insights, and improved analytics outcomes.

Major duties

  • Design, build, and maintain SQL/dbt data models and semantic layers optimized for analytical query patterns and AI agent access.
  • Define and maintain a semantic layer that encodes business logic, KPIs, and metric definitions in a single authoritative location.
  • Apply dimensional modeling techniques to accurately represent business processes and metrics in our data warehouse.
  • Prepare and curate data assets optimized for machine learning, LLM, and AI‑agent workflows.
  • Partner with analysts, data scientists, and stakeholders to translate business requirements into trusted data/analytical products.
  • Develop and optimize ELT/ETL pipelines in a cloud data warehouse (e.g., Snowflake) for reliable, scalable data delivery.
  • Implement data quality testing (schema, freshness, business rules) and remediate recurring data issues.
  • Establish and sustain data lineage standards across analytical models, enabling stakeholders to trace data provenance from source to consumption in support of data governance, auditability, and operational reliability.
  • Contribute to shared analytics codebases via Git/GitHub, following established development workflows, including code reviews.
  • Support Python-based automation, lightweight analytics services, and troubleshooting of cloud or containerized components as needed.
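The data‑quality duties above (schema, freshness, and business‑rule checks) are often implemented as small validation functions, whether in dbt tests or in Python automation. The sketch below illustrates the idea in Python; the column names and the 24‑hour freshness window are hypothetical, not the Foundation's actual standards.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical expected schema for a gift-transactions table.
EXPECTED_COLUMNS = {"gift_id", "donor_id", "amount", "loaded_at"}

def check_schema(rows):
    """Schema test: every row exposes exactly the expected columns."""
    return all(set(row) == EXPECTED_COLUMNS for row in rows)

def check_freshness(rows, max_age_hours=24):
    """Freshness test: the newest load timestamp falls within the window."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=max_age_hours)
    return max(row["loaded_at"] for row in rows) >= cutoff

def check_business_rules(rows):
    """Business-rule test: gift amounts must be positive."""
    return all(row["amount"] > 0 for row in rows)
```

In practice, checks like these would run after each pipeline load and alert on failure rather than silently passing bad data downstream.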

Knowledge/skills

Core analytics engineering skills

  • Solid experience in data modeling, including dimensional modeling concepts and data warehouse design; familiarity with AI coding tools such as Claude Code.
  • Strong SQL proficiency for transformation, analysis, and validation.
  • Python proficiency for data processing, automation, and integration.
  • Experience with dbt or a comparable analytics engineering framework to build, test, and document data models.

Data platforms and infrastructure

  • Experience in Azure cloud environments (AWS or GCP transferable).
  • Experience with Snowflake or other modern cloud data warehouses (BigQuery, Redshift, Databricks SQL, etc.).
  • Familiarity with relational databases such as SQL Server and PostgreSQL.

Data pipelines and orchestration

  • Experience designing, building, and maintaining ELT/ETL pipelines.
  • Familiarity with orchestration tools (Airflow, Prefect, Azure Data Factory, or similar).
  • Ability to troubleshoot pipeline failures, performance issues, and data quality problems.
  • Experience implementing data quality tests, including schema, freshness, and business rule validations.

Version control and development practices

  • Experience with Git/GitHub for version control and shared codebase collaboration.
  • Comfortable following established development workflows, including code reviews and iterative improvement.

Additional knowledge/skills

  • Experience designing data models and semantic layers that support machine‑learning workflows and LLM‑based reasoning by AI‑driven analytical agents.
  • Experience with Tableau or similar BI tools (Power BI, Looker, etc.) to support dashboard development and analytics.
  • Familiarity with Python web application development using frameworks such as Flask.
  • Understanding of web application concepts, including routing, authentication patterns, and environment configuration.
  • Exposure to Docker or containerized applications, including interpreting logs and basic runtime troubleshooting.

Education/experience

Bachelor’s degree in computer science or a related field and three to five years of software development or related experience. Preference will be given to those with experience or knowledge in AI‑related processes and technologies, including Python, dbt, Azure, SQL Server, and Snowflake. Education and experience equivalencies will be considered.

Important organizational notes

The candidate must possess a valid driver’s license and be prepared to work a flexible schedule that may include evenings and weekends to support foundation events and donor‑related activities. Adherence to a professional code of conduct is essential at all times, ensuring the highest standards of integrity, respect, and ethical behavior in all interactions and responsibilities. Adherence to the Foundation’s core values is vital to success.