
While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and take pride in fostering a culture built on transparency, diversity, integrity, learning, and growth.
If working in an environment that encourages you to innovate and excel, not just professionally but personally, appeals to you, you would enjoy a career with Quantiphi!
About Quantiphi
Quantiphi is an award-winning Applied AI and Big Data software and services company, driven by a deep desire to solve transformational problems at the heart of businesses. Our signature approach combines groundbreaking machine-learning research with disciplined cloud and data-engineering practices to create breakthrough impact at unprecedented speed.
Quantiphi has seen 2.5x year-over-year growth since its inception in 2013. We don't just innovate, we lead. Headquartered in Boston, we have 4,000+ Quantiphi professionals across the globe. As an Elite/Premier partner for Google Cloud, AWS, NVIDIA, Snowflake, and others, we have been recognized with numerous industry awards.
Be part of a trailblazing team that’s shaping the future of AI, ML, and cloud innovation. Your next big opportunity starts here!
For more details, visit: Website or LinkedIn Page.
Experience Level: 10+ years of experience
Work Location: Anywhere in Canada
Role Summary
We are seeking a Data Modeler experienced in designing scalable and performant data
warehouse models on Google BigQuery. This role will focus on translating insurance
business processes — sourced from Guidewire systems (PolicyCenter, BillingCenter,
ClaimsCenter) — into robust Enterprise Data Models (EDM) and subject area designs.
The Data Modeler will collaborate closely with Business Analysts, Data Engineers, and the
GCP Data Architect to ensure that the warehouse schema supports both operational reporting
and analytical workloads.
Key Responsibilities
● Design and maintain conceptual, logical, and physical data models for Guidewire
source domains (Policy, Claims, Billing).
● Develop BigQuery-optimized schemas (partitioning, clustering, denormalization
strategies) to improve cost efficiency and query performance.
● Collaborate with Business Analysts to translate functional requirements into data
structures and relationships that reflect business logic.
● Define and manage data dictionaries, entity definitions, and relationships across
different Guidewire modules.
● Partner with Data Engineers to ensure model alignment with ingestion pipelines and
transformation logic.
● Develop and maintain model metadata, versioning, and lineage documentation.
● Apply insurance-specific best practices in modeling for coverage hierarchy, policy
transactions, claims exposures, and financial entities.
● Conduct data profiling and analysis to validate source data integrity and identify
transformation needs.
● Act as a data steward, promoting consistency, standardization, and governance across
all layers (staging, operational, analytical).
● Provide support for impact assessments when new Lines of Business or fields are
introduced in Guidewire.
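As an illustration of the BigQuery schema-optimization work described above, here is a minimal sketch of a partitioned and clustered claims table, expressed as a DDL string assembled in Python. The dataset, table, and column names are hypothetical and are not drawn from any actual Guidewire mapping.

```python
# Hypothetical example: a BigQuery DDL statement for a claims fact table,
# partitioned by loss date and clustered on frequently filtered columns.
# All names are illustrative, not an actual Guidewire-derived schema.
ddl = """
CREATE TABLE IF NOT EXISTS insurance_dw.fact_claim (
  claim_id         STRING NOT NULL,
  policy_id        STRING NOT NULL,
  line_of_business STRING,
  loss_date        DATE,
  claim_status     STRING,
  incurred_amount  NUMERIC
)
PARTITION BY loss_date              -- prunes scanned bytes for date-bounded queries
CLUSTER BY policy_id, claim_status  -- co-locates rows on common filter columns
"""

print(ddl.strip())
```

Partitioning on the loss date limits how much data date-bounded reports scan (and therefore cost), while clustering keeps rows with the same policy or status physically close, which speeds up the selective filters typical of operational reporting.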
Required Skills and Experience
● 6+ years of experience in data modeling for data warehouses or analytics platforms.
● Strong hands-on experience with BigQuery (DDL scripting, modeling for performance,
clustering/partitioning).
● Experience modeling data from Guidewire systems (PolicyCenter, BillingCenter, or
ClaimsCenter) or other P&C insurance platforms.
● Proficiency with data modeling tools such as ER/Studio, ERwin, or SQL Power
Architect.
● Deep understanding of data modeling principles (3NF vs denormalized reporting
models, star/snowflake schema).
● Experience in defining business glossaries, data dictionaries, and master entity
models.
● Good understanding of SQL for validation, data profiling, and schema management.
● Excellent analytical mindset, attention to detail, and cross-functional collaboration
abilities.
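The dimensional-modeling principles listed above (3NF versus denormalized reporting models, star schema) can be sketched with a small hypothetical policy-transaction star: one fact table referencing conformed dimensions by surrogate key, then flattened the way a reporting query would. All table and column names here are illustrative only.

```python
# Minimal, hypothetical star-schema sketch for policy transactions:
# a fact table holds surrogate keys and additive measures; dimensions
# hold descriptive attributes. Names are illustrative only.
dim_policy = {1: {"policy_number": "POL-1001", "line_of_business": "Auto"}}
dim_date = {20240101: {"date": "2024-01-01", "quarter": "Q1"}}

fact_policy_txn = [
    {"policy_key": 1, "date_key": 20240101,
     "txn_type": "NEWBIZ", "written_premium": 1200.0},
]

def denormalize(fact_rows):
    """Join each fact row to its dimensions, as a reporting query would."""
    return [
        {**row,
         **dim_policy[row["policy_key"]],
         **dim_date[row["date_key"]]}
        for row in fact_rows
    ]

report = denormalize(fact_policy_txn)
print(report[0]["line_of_business"], report[0]["written_premium"])
```

The normalized (3NF-style) form keeps each attribute in exactly one place; the denormalized result trades storage for simpler, faster analytical queries, which is the usual trade-off when choosing between the two for a reporting layer.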
Preferred Qualifications
● Prior experience with insurance data modernization or Guidewire
DataHub/InfoCenter models.
● Exposure to metadata management, data cataloging, and governance tools (e.g.,
Collibra, Data Catalog).
● Familiarity with Python, dbt, or Dataform for model automation and
documentation (optional).
● Google Cloud certification (Professional Data Engineer) or equivalent.
Why Join
● Be the modeling backbone of an enterprise Guidewire-to-GCP modernization
program.
● Define the blueprint for how Policy, Claims, and Billing data will be analyzed
organization-wide.
● Work closely with technical (Data Engineers) and functional (BAs) teams to ensure
business-aligned, technically efficient data architecture.
● Influence enterprise data standards and governance practices in a cloud-native
environment.