Data Engineer
Noibu
Noibu is the leading ecommerce analytics & monitoring platform, purpose-built to help retailers protect and grow online revenue. By unifying site monitoring, experience analytics, and conversion growth opportunities in a single pane of glass, Noibu captures the most important end-to-end shopping data, without the complexity of traditional analytics tools.
Noibu surfaces critical site errors, performance issues, and customer journey friction that block conversions, then ties every insight directly to business impact, session replays, and full technical context. This makes it easy for ecommerce teams to understand why things are happening and what to prioritize, without dedicated analytics headcount.
The result: faster decisions, better collaboration across teams, optimized customer experiences, and revenue growth.
Learn more about Noibu at www.noibu.com.
What You'll Do
- Design, build, and maintain ETL/ELT pipelines that unify data from Salesforce, Gong, Segment, Mixpanel, ChurnZero, and product databases into a structured, reliable data warehouse.
- Own the full software development lifecycle of ETL pipelines, including the design, development, deployment, and maintenance of your code.
- Own the data modeling layer (staging, marts, dimensional models) using tools such as dbt, ensuring data is organized and queryable for analytics across the business.
- Build and maintain data quality checks, monitoring, and alerting to ensure reliability and trust in company-wide data.
- Collaborate closely with R&D and product teams to define data contracts and ensure product telemetry and event tracking are structured for downstream analytics.
- Establish and enforce data governance standards, ensuring consistency and integrity across data sources.
- Partner with stakeholders across Product, Customer Success, RevOps, Marketing, and Finance to understand business needs and ensure the data platform supports decision-making across the customer lifecycle.
- Enable teams across Noibu by creating a unified data foundation that makes customer, product, and revenue data easily accessible and reliable.
- Support downstream analytics and reporting by providing clean, well-modeled datasets that power dashboards, experimentation, and product insights.
What You Bring
- Strong experience designing and maintaining production-grade data pipelines in a modern data stack.
- Hands-on experience with cloud data warehouses such as Snowflake, BigQuery, or Redshift.
- Strong proficiency with SQL and at least one programming language (e.g., Python, Scala).
- Deep experience with data transformation frameworks such as dbt and modern data modeling practices (e.g., Kimball dimensional modeling).
- Experience with data orchestration tools such as Airflow or Dagster to manage and monitor pipelines.
- Experience integrating and unifying data from multiple SaaS platforms and product data sources.
- Experience working in SaaS or technology environments, especially with product analytics and customer lifecycle data.
Who You Are
- You have 4+ years of experience building data infrastructure in a high-growth startup or fast-moving technology environment.
- You enjoy building the foundations that power analytics, rather than focusing only on dashboards and reporting.
- You are comfortable working embedded with technical teams (R&D) while partnering with business stakeholders across the company.
- You are excited to help architect and scale a company-wide data platform that supports product, customer, and revenue insights.
- You care deeply about data quality, reliability, and long-term scalability.
- You take ownership of problems end-to-end and thrive in ambiguous, fast-evolving environments.
- You are a strong communicator who can turn semi-structured requirements into actionable steps and explain analytical findings to non-technical stakeholders.
$90,000 - $130,000 CAD per year