Chevron is accepting online applications for the position of Data Architect - Data Lake & Data Engineering through April 3, 2026, at 11:59 p.m. (Central Time).
Enable business opportunities through data availability and accessibility. Perform and/or coordinate end-to-end data lifecycle management activities from source to analytics, including movement, storage, modeling, enhancement, integration, quality, and security of data throughout the enterprise. Focus on data reusability, business outcomes, and cost efficiency.
Senior individual contributor architect accountable for hands-on design, enablement, and production ownership of Chevron’s Enterprise Lakehouse architecture, implementation of data engineering best practices for AI-ready data, and architecture of the enterprise AI platform. This role defines and operationalizes governed, scalable, and cost-efficient Lakehouse patterns primarily on Azure Databricks with Unity Catalog, while selectively evaluating and guiding usage of Microsoft Fabric for specific workloads. It aligns data engineering practices with the Lakehouse pattern and helps architect the data and AI stack using Databricks, Fabric, and other tools.
This is a deeply technical architecture role balancing standards and strategy with direct implementation, validation, and enablement across data engineering teams. The role has no people management responsibilities; however, it is responsible for mentoring and guiding other data and solution architects on the team.
Key Responsibilities
Core Deliverables
- Enterprise Lakehouse reference architectures and standards
- Production validated Unity Catalog patterns and guidance
- Documented evaluations and recommendations for Microsoft Fabric and Azure Databricks
- Reusable architecture artifacts that enable consistent, governed delivery
Enterprise Lakehouse Architecture & Ownership
- Define, evolve, and own the Enterprise Lakehouse architecture, with Azure Databricks as the primary data engineering platform and Microsoft Fabric evaluated for targeted workloads.
- Maintain hands-on ownership of architectural standards ensuring they are practical, enforceable, and proven at scale.
- Consult on design of scalable Lakehouse patterns supporting analytics, AI/ML, and application consumption.
Azure Databricks & Unity Catalog Enablement
- Lead adoption and operationalization of Unity Catalog, including:
  - Catalog, schema, and storage location design
  - Identity, access boundaries, and privilege models
  - Data sharing, lineage, and governance alignment
- Define and validate Delta Lake/Delta table standards for performance, interoperability, and long-term maintainability.
- Provide hands-on guidance, examples, and enablement to data engineering teams using Spark, Delta, and Databricks SQL.
Microsoft Fabric
- Perform targeted architecture evaluations of Microsoft Fabric capabilities, including:
  - OneLake architecture and domain organization patterns
  - Shortcut vs. mirroring approaches (tradeoffs, limitations, and governance impact)
  - Security and governance alignment with Unity Catalog (duplication risks, access boundaries)
  - Capacity planning and workload placement (Fabric capacities vs. Azure Databricks workloads)
  - Interoperability patterns between Azure Databricks and Fabric
- Contribute to readiness assessments for Fabric features (e.g., Lakehouse, OneLake, Real-Time Intelligence), with clear recommendations on appropriate use.
Architecture Standards, Governance & Operations
- Establish and enforce enterprise architecture standards aligned with security, compliance, and data governance policies.
- Influence CI/CD patterns for Lakehouse assets and architecture artifacts in Azure DevOps.
- Partner with platform and governance teams to ensure consistent data quality, lineage, observability, and cost optimization guardrails.
- Track and communicate architectural risks, priorities, and blockers impacting Enterprise Lakehouse adoption.
Cost, Performance & Optimization
- Guide cost and performance tradeoffs across:
  - Workload placement decisions
  - Storage and compute patterns
  - Consumption and access models
- Validate performance and scalability of architecture through hands-on testing and tuning.
Technology Stack
- Azure Databricks (primary data engineering platform)
- Unity Catalog
- Delta Lake/Delta Tables
- Apache Spark (PySpark/Spark SQL)
- Microsoft Fabric/OneLake (architecture evaluation and selective usage)
- Azure Storage
- Azure DevOps (work item tracking; CI/CD concepts)
Stakeholders & Collaboration
- Product Manager – Data Lake & Data Engineering
- Data Engineering, Data Lake, and Unified Data Enablement teams
- Platform, security, and governance stakeholders
- Strategic technology vendors
Documentation, Mentorship & Collaboration
- Develop and maintain clear, consumable architecture artifacts, including logical and physical models, data flow diagrams, and reference patterns.
- Mentor data engineers and peer architects on Lakehouse standards, best practices, and platform usage.
- Collaborate closely with Product Management, Data Engineering, Data Lake, and Unified Data Enablement teams.
- Engage with vendors and internal stakeholders on architecture, governance, and platform direction.
Relocation Options
Relocation will not be considered.
International Considerations
Expatriate assignments will not be considered.
Chevron regrets that it is unable to sponsor employment visas or consider individuals on time-limited visa status for this position.